Please join us for a CCN Seminar, "Chalk Talk with Peter Latham," Professor of Theoretical Neuroscience at the Gatsby Computational Neuroscience Unit, UCL.
Title: Failures, Dale's Law, and learning in the brain
Abstract: In overparameterized systems (like the brain), both the architecture and the initial weights matter for efficient learning. Here we focus on the latter and ask: given synaptic failures -- a salient feature of the brain -- how should the initial weights scale with n, the number of connections per neuron? We show that synaptic failures imply that both the mean and the standard deviation of the weights should be initialized at O(1/n) -- very different from the O(1/sqrt(n)) scaling used in deep networks. More speculatively, the fact that the mean weights are nonzero may provide a justification for Dale's law, and it may also help solve the credit assignment problem during learning.
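For intuition, here is a minimal numerical sketch (illustrative only, not from the talk; the transmission probability, weight distribution, and sample sizes are assumptions) of why nonzero-mean weights push the initialization toward O(1/n): if synaptic failures are modeled as Bernoulli transmission, weights with mean of order 1/sqrt(n) make the summed input to a neuron grow like sqrt(n), while a mean of order 1/n keeps it of order one.

import numpy as np

rng = np.random.default_rng(0)
p = 0.5        # assumed probability that a synapse transmits (does not fail)
trials = 2000  # number of simulated transmission patterns

for n in (100, 1_000, 10_000):
    x = rng.random(n)  # presynaptic activity, each of order 1
    for label, scale in (("O(1/n)", 1.0 / n), ("O(1/sqrt(n))", 1.0 / np.sqrt(n))):
        w = rng.normal(loc=scale, scale=scale, size=n)  # mean = std = scale
        s = rng.binomial(1, p, size=(trials, n))        # 1 = synapse transmits
        inp = s @ (w * x)                               # summed input per trial
        print(f"n={n:6d}  {label:12s}  mean={inp.mean():8.3f}  std={inp.std():.3f}")

Under these assumptions, the mean input in the O(1/sqrt(n)) rows grows with n, while in the O(1/n) rows it stays roughly constant, consistent with the scaling argument in the abstract.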
To schedule a meeting with Peter during his visit, please contact Jessica Hauser at jhauser@flatironinstitute.org.