Learning and retaining tasks in redundant brain circuits
If you have a question about this talk, please contact Alberto Padoan.
Neuronal networks have many tunable parameters, such as synaptic strengths, that are shaped during learning of a task. The number of degrees of freedom available for representing a task can vastly exceed the minimum required for good performance. I will describe recent work that explores the consequences of such additional ‘redundant’ degrees of freedom for learning and for task representation in animals. We find that additional redundancy in network parameters can make a fixed task easier to learn and can compensate for deficiencies in learning rules. However, we also find that in a biologically relevant setting, where synapses are subject to unavoidable noise, there is an upper limit to the level of useful redundancy in a network, suggesting an optimal network size for a given task.
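To illustrate the trade-off described above, here is a minimal numerical sketch. It is a toy construction of my own, not the speaker's model: a fixed low-dimensional linear task is learned through a redundant random expansion of N units, and every synapse is perturbed by noise whenever the network is used. All names and parameter values (d_task, sigma_syn, the random expansion A, the learning rate, and so on) are illustrative assumptions. In this setup, modest redundancy (N above the task dimension) speeds up gradient-descent learning, but the noise penalty grows with N, so test error is lowest at an intermediate network size.

```python
import numpy as np

rng = np.random.default_rng(0)

d_task = 20            # intrinsic dimensionality of the task (assumed)
n_train, n_test = 200, 1000
sigma_syn = 0.05       # per-synapse noise std whenever the network is used (assumed)
lr, n_steps = 0.05, 500
n_noise_draws = 50     # Monte Carlo draws of synaptic noise at test time

w_true = rng.normal(size=d_task)
X_train = rng.normal(size=(n_train, d_task))
X_test = rng.normal(size=(n_test, d_task))
y_train, y_test = X_train @ w_true, X_test @ w_true

def test_error(n_units):
    """Train an n_units readout by gradient descent, then test with noisy synapses."""
    # Fixed random expansion of the d_task-dimensional input into n_units features.
    A = rng.normal(scale=1.0 / np.sqrt(d_task), size=(d_task, n_units))
    Phi_train, Phi_test = X_train @ A, X_test @ A
    w = np.zeros(n_units)
    for _ in range(n_steps):
        grad = Phi_train.T @ (Phi_train @ w - y_train) / n_train
        w -= lr * grad
    # Every use of the network perturbs each synapse independently,
    # so the noise-induced error grows with the number of synapses.
    errs = []
    for _ in range(n_noise_draws):
        w_noisy = w + sigma_syn * rng.normal(size=n_units)
        errs.append(np.mean((Phi_test @ w_noisy - y_test) ** 2))
    return np.mean(errs)

for n_units in (5, 10, 20, 40, 80, 160, 320):
    print(f"N = {n_units:4d}   test MSE = {test_error(n_units):6.3f}")
```

Under these assumptions, networks smaller than the task dimension cannot represent the task, the minimal-size network learns slowly because the expansion is poorly conditioned, and very large networks accumulate synaptic-noise error roughly in proportion to N, yielding the U-shaped error curve that motivates an optimal network size.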
This talk is part of the CUED Control Group Seminars series.