The unbreakable lightness of single neuron non-linearities in learning

If you have a question about this talk, please contact Samuel Eckmann.

A large body of work in the theory of neural networks (artificial or biological) has been carried out on networks composed of units with simple activation functions, most prominently binary units. Analysing such networks has led to some general conclusions. For instance, there is a long-held consensus that local biological learning mechanisms such as Hebbian learning are very inefficient compared to the iterative, non-local learning rules used in machine learning. In this talk, I will show that when it comes to memory operations, this conclusion is an artefact of analysing networks of binary neurons: when neurons with graded responses, more reminiscent of the responses of real neurons, are considered, memory storage in neural networks with Hebbian learning can be very efficient and close to optimal performance. Turning to artificial neural networks, I will discuss how non-linearities influence the ability of Restricted Boltzmann Machines to express probability distributions over their visible nodes, and how this affects learnability in these machines.
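To make the first claim concrete, here is a minimal sketch (my illustration, not the speaker's code) of one-shot Hebbian storage and recall in an associative network of threshold-linear units, in the spirit of Schönsberg, Roudi & Treves (2021). All parameter choices (N, P, the sparsity a, the number of recall steps, the activity-clamping step) are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, a = 500, 50, 0.2                      # units, stored patterns, sparsity

# Sparse binary patterns: each unit is active with probability a.
patterns = (rng.random((P, N)) < a).astype(float)

# Covariance Hebbian rule: local and one-shot, with no iterative,
# non-local error correction.
W = (patterns - a).T @ (patterns - a) / (N * a * (1 - a))
np.fill_diagonal(W, 0.0)

def recall(cue, steps=30, theta=0.0):
    """Run recall dynamics with a threshold-linear (graded) transfer."""
    v = cue.copy()
    for _ in range(steps):
        v = np.maximum(0.0, W @ v - theta)  # graded, not binary, response
        if v.sum() > 0:                     # clamp mean activity to a,
            v *= a * N / v.sum()            # a common stabilising constraint
    return v

# Cue with a degraded version of pattern 0 and measure the overlap.
cue = patterns[0] * (rng.random(N) < 0.7)   # silence ~30% of active units
out = recall(cue)
m = out @ patterns[0] / (np.linalg.norm(out) * np.linalg.norm(patterns[0]) + 1e-12)
print(f"cosine overlap with the stored pattern: {m:.3f}")
```

The point of the sketch is that the learning step is a single local outer-product update; only the recall dynamics, with its graded transfer function, differs from the classic binary-unit setting.

For the second part, the following toy sketch (again an assumption for illustration, not code from Bulso & Roudi) shows what "the distribution an RBM expresses over its visible nodes" means: summing the Boltzmann distribution over the binary hidden units factorises, leaving p(v) proportional to exp(b·v) · prod_j (1 + exp(c_j + v·W[:, j])), an effective interacting model over the visible nodes alone.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
nv, nh = 4, 3                        # tiny sizes so exact enumeration works
W = rng.normal(0, 1, (nv, nh))       # visible-hidden couplings
b = rng.normal(0, 0.5, nv)           # visible biases
c = rng.normal(0, 0.5, nh)           # hidden biases

def unnormalized_marginal(v):
    """Sum_h exp(-E(v,h)) with E(v,h) = -b.v - c.h - v.W.h.
    Hidden units are conditionally independent, so the sum factorises."""
    v = np.asarray(v, dtype=float)
    return np.exp(b @ v) * np.prod(1.0 + np.exp(c + v @ W))

states = list(product([0, 1], repeat=nv))
weights = np.array([unnormalized_marginal(v) for v in states])
p = weights / weights.sum()          # exact marginal over visible nodes

for v, pv in zip(states, p):
    print(v, f"{pv:.4f}")
```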
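The log of each factor, log(1 + exp(c_j + v·W[:, j])), is the non-linearity whose shape governs which effective interactions among visible nodes the machine can express, which is the handle the talk uses to connect single-unit non-linearities to learnability.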

Refs:
Schönsberg, F., Roudi, Y., & Treves, A. (2021). Efficiency of local learning rules in threshold-linear associative networks. Physical Review Letters, 126(1), 018301.
Bulso, N., & Roudi, Y. (2021). Restricted Boltzmann machines as models of interacting variables. Neural Computation, 33(10), 2646-2681.

This talk is part of the Computational Neuroscience series.
