Title: Micro- and macroscopic neural architectures for data-efficient learning
Naoki Hiratani, Gatsby Computational Neuroscience Unit, University College London
Humans and animals are capable of rapid learning from limited experience, a feat that remains difficult for artificial neural networks and whose underlying mechanisms remain elusive. In this talk, I propose that the micro- and macroscopic architectures of neural circuits are key to explaining this data-efficient learning.

Regarding the microscopic architecture, recent connectomics studies have revealed that, in cortical microcircuits, the majority of neuron-to-neuron connections are realized by multiple synapses. Here I show that with multisynaptic connections, synaptic plasticity approximates a sample-based Bayesian filtering algorithm known as particle filtering, and wiring plasticity implements its resampling process. The proposed plasticity rule enables near-optimal learning while reproducing various known properties of dendritic plasticity and synaptic organization.

Regarding the macroscopic architecture, I focus on the olfactory system and show that the number of cells in olfactory circuits might be evolutionarily optimized for data-efficient learning. Across several mammalian species, the number of layer 2 neurons in piriform cortex is proportional to the number of glomeruli to the 3/2 power (Srinivasan & Stevens, bioRxiv, 2018), while in insects, the number of Kenyon cells scales roughly cubically with the number of glomeruli. Modeling the olfactory system as a three-layered nonlinear neural network, I analytically derive the network size that optimizes, over the lifetime of the animal, the ability to predict the rewards associated with odors. Although having more neurons increases the information capacity, having too many neurons makes developmental tuning of synaptic weights difficult due to overfitting. Balancing this tradeoff, the optimal population sizes robustly follow the scaling law observed in mammals.
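To make the sample-based picture concrete, the sketch below tracks a single hidden connection weight with a small population of "synapse" particles: each particle is one sample of the weight posterior, and the resampling step (low-weight particles replaced by copies of high-weight ones) plays the role of wiring plasticity. The Gaussian observation model, particle count, and noise levels here are hypothetical choices for illustration, not the plasticity rule derived in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_weight(observations, n_particles=8, obs_noise=0.5, drift=0.05):
    """Track a hidden synaptic weight from noisy observations.

    Each of the n_particles "synapses" carries one sample of the
    weight posterior; resampling mimics synapse elimination and
    creation. Illustrative model only.
    """
    particles = rng.normal(0.0, 1.0, n_particles)  # samples from the prior
    estimates = []
    for y in observations:
        # diffusion prior: the hidden weight is assumed to drift slowly
        particles = particles + rng.normal(0.0, drift, n_particles)
        # importance weights from a Gaussian observation likelihood
        logw = -0.5 * ((y - particles) / obs_noise) ** 2
        w = np.exp(logw - logw.max())
        w = w / w.sum()
        # resampling: low-weight "synapses" are replaced by duplicates
        # of high-weight ones (the wiring-plasticity analogue)
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
        estimates.append(particles.mean())
    return np.array(estimates)

# track a fixed true weight of 1.0 from noisy samples
obs = 1.0 + rng.normal(0.0, 0.5, 200)
est = particle_filter_weight(obs)
```

The posterior mean (the particle average) converges toward the true weight even with very few particles, which is the intuition behind treating a handful of synapses per connection as a usable sample set.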
Moreover, when a fraction of the olfactory circuit can be genetically specified rather than developmentally learned, the optimal scaling becomes steeper, as is observed among insects. In summary, our results suggest that elaborate neural architectures at multiple scales underlie the unparalleled learning ability of the brain.
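The two reported scalings can be stated compactly as power laws in the glomerulus count. The helper below is a hypothetical illustration using only the exponents quoted above (3/2 for mammalian piriform cortex, roughly 3 for insect Kenyon cells); the prefactor k and the example glomerulus counts are made-up inputs for demonstration.

```python
def predicted_population(n_glomeruli, exponent, k=1.0):
    """Downstream population size under a power law: k * m ** exponent."""
    return k * n_glomeruli ** exponent

# mammal-like scaling (piriform layer 2 vs. glomeruli): exponent 3/2,
# so doubling the glomerulus count multiplies the population by 2**1.5
mammal_ratio = predicted_population(2000, 1.5) / predicted_population(1000, 1.5)

# insect-like scaling (Kenyon cells vs. glomeruli): roughly cubic,
# so the same doubling multiplies the population by 2**3 = 8
insect_ratio = predicted_population(100, 3.0) / predicted_population(50, 3.0)
```

Framed this way, the insect exponent's steepness is visible directly: a doubling of the glomerulus count implies roughly a 2.8-fold expansion downstream in mammals but an 8-fold expansion in insects.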
Hiratani N & Fukai T (2018) Redundancy in synaptic connections enables neurons to learn optimally. Proceedings of the National Academy of Sciences, 115(29), E6871-E6879.
Hiratani N & Latham PE (2019) Developmental and evolutionary principles of olfactory circuit designs. Bernstein Conference 2019, contributed talk.