The Pervasive Role of Composing Transformations in Machine Learning
- 👤 Speaker: Anders Karlsson (Université de Genève)
- 📅 Date & Time: Tuesday 22 July 2025, 14:50 - 15:40
- 📍 Venue: External
Abstract
From the layer maps of neural networks to training procedures and reinforcement learning, compositions of transformations permeate modern AI. These compositional products often involve randomly selected maps, as in weight initialization, stochastic gradient descent (SGD), and dropout. In reinforcement learning, Bellman-type operators with randomness are iterated to update reward structures and strategies. I will discuss the mathematics and geometry underlying the composition of random transformations. In particular, I will explain a general limit law established in joint work with Gouëzel. Moreover, I will discuss a possible cut-off phenomenon related to the depth of neural networks and the influence of iteration order. Motivated by these observations, and in collaboration with Avelin, Dherin, Gonzalvo, Mazzawi, and Munn, we propose backward variants of SGD that improve stability and convergence while maintaining generalisation.
Series: This talk is part of the Isaac Newton Institute Seminar Series.