Double-descent curves in neural networks: A new perspective using Gaussian processes.
- 👤 Speaker: Ouns El Harzli
- 📅 Date & Time: Tuesday 01 June 2021, 15:00 - 16:00
- 📍 Venue: https://cl-cam-ac-uk.zoom.us/j/96785077311?pwd=akljc1JQVS81R1FxelZCMUdQR3I2dz09
Abstract
Double-descent curves in neural networks describe the phenomenon whereby the generalisation error first decreases as the number of parameters grows, then rises after reaching an optimal parameter count that is smaller than the number of data points, and then decreases again in the overparameterised regime. Here we use a neural network Gaussian process (NNGP), which corresponds exactly to a fully connected network (FCN) in the infinite-width limit, combined with techniques from random matrix theory, to calculate this generalisation behaviour, with a particular focus on the overparameterised regime. We verify our predictions with numerical simulations of the corresponding Gaussian process regressions. An advantage of the NNGP approach is that the analytical calculations are easier to interpret. We argue that neural networks generalise well in the overparameterised regime precisely because that is where they converge to their equivalent Gaussian process.
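As a rough illustration of the kind of NNGP regression the abstract refers to (this is not the speaker's code), the sketch below runs Gaussian process regression with the order-1 arc-cosine kernel, which is the NNGP kernel of an infinite-width one-hidden-layer ReLU network; the data, dimensions, and noise level are all made up for the example.

```python
import numpy as np

def arccos_kernel(X1, X2):
    """Order-1 arc-cosine kernel: the NNGP kernel of an infinite-width
    one-hidden-layer ReLU FCN (up to weight-variance scaling)."""
    n1 = np.linalg.norm(X1, axis=1)
    n2 = np.linalg.norm(X2, axis=1)
    cos = np.clip((X1 @ X2.T) / np.outer(n1, n2), -1.0, 1.0)
    theta = np.arccos(cos)
    return np.outer(n1, n2) * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / np.pi

def gp_posterior_mean(K_train, K_test_train, y_train, noise=1e-3):
    """Posterior mean of GP regression: K_*x (K_xx + sigma^2 I)^-1 y."""
    alpha = np.linalg.solve(K_train + noise * np.eye(len(y_train)), y_train)
    return K_test_train @ alpha

# Toy regression task (entirely synthetic, for illustration only).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(50, 5))
y_train = np.sin(X_train.sum(axis=1))
X_test = rng.normal(size=(200, 5))
y_test = np.sin(X_test.sum(axis=1))

pred = gp_posterior_mean(arccos_kernel(X_train, X_train),
                         arccos_kernel(X_test, X_train), y_train)
mse = np.mean((pred - y_test) ** 2)  # test-set generalisation error
```

In the talk's setting one would study how such a generalisation error behaves as the amount of data and the effective number of parameters vary; this snippet only shows the basic GP-regression machinery.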
Series: This talk is part of the ML@CL Seminar Series.
Included in Lists
- Hanchen DaDaDash
- ML@CL Seminar Series