

## Lie Group Machine Learning and Natural Gradient from Information Geometry

- Dr Frederic Barbaresco, THALES Land and Air Systems
- Wednesday 04 December 2019, 14:00-15:00
- LT6, Baker Building, CUED.
If you have a question about this talk, please contact Dr Ramji Venkataramanan.

The classical simple gradient descent used in Deep Learning has two drawbacks: it applies the same non-adaptive learning rate to all parameter components, and it is not invariant under re-parameterization, so different encodings of the parameters induce different effective learning rates. Since the parameter space of a multilayer network forms a Riemannian space equipped with the Fisher information metric, the natural gradient (Riemannian gradient) method, which takes this geometric structure into account, is more effective for learning than the usual gradient descent. The natural gradient is invariant under re-parameterization and is therefore insensitive to the characteristic scale of each parameter direction. The Fisher metric defines a Riemannian metric as the Hessian of two dual potential functions (the entropy and the Massieu characteristic function).

In Souriau's Lie groups thermodynamics, the invariance under re-parameterization of information geometry is replaced by invariance with respect to the action of the group. In Souriau's model, the entropy and the Fisher metric are invariant under the action of the group, and Souriau defined a Gibbs density that is covariant under this action. The study of exponential densities invariant under a group goes back to the work of Muriel Casalis in her 1990 thesis; the general problem was solved for Lie groups by Jean-Marie Souriau in Geometric Mechanics in 1969, by defining a "Lie groups thermodynamics" in Statistical Mechanics. These new tools are bedrocks for Lie Group Machine Learning. Souriau introduced a Riemannian metric linked to a generalization of the Fisher metric for homogeneous symplectic manifolds.
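As a concrete illustration of the invariance argument above, here is a minimal sketch of natural-gradient descent on a toy model: a univariate Gaussian parameterized by (mu, log sigma), for which the Fisher matrix has the standard closed form diag(1/sigma^2, 2). The function names and learning rate are our own illustrative choices, not material from the talk.

```python
import math

def nll_grad(data, mu, log_sigma):
    """Average gradient of the Gaussian negative log-likelihood
    with respect to (mu, log_sigma)."""
    sigma = math.exp(log_sigma)
    n = len(data)
    g_mu = sum(-(x - mu) / sigma**2 for x in data) / n
    g_ls = sum(1.0 - ((x - mu) / sigma)**2 for x in data) / n
    return g_mu, g_ls

def natural_gradient_step(data, mu, log_sigma, lr=0.5):
    """One natural-gradient step: precondition the Euclidean gradient
    with the inverse Fisher matrix, diag(1/sigma^2, 2) for (mu, log_sigma).
    Because the Fisher matrix is diagonal here, preconditioning is just a
    per-coordinate rescaling."""
    sigma = math.exp(log_sigma)
    g_mu, g_ls = nll_grad(data, mu, log_sigma)
    mu -= lr * sigma**2 * g_mu        # F^{-1} rescales the mu component
    log_sigma -= lr * 0.5 * g_ls      # and the log_sigma component
    return mu, log_sigma

data = [1.0, 2.0, 3.0, 4.0]
mu, log_sigma = 0.0, 0.0
for _ in range(200):
    mu, log_sigma = natural_gradient_step(data, mu, log_sigma)
# converges to the MLE: mu = 2.5, sigma = sqrt(1.25)
```

Note the invariance at work: the sigma^2 factor from the inverse Fisher metric cancels the 1/sigma^2 in the Euclidean gradient of mu, so the effective step for the mean does not depend on how sharply the likelihood is scaled in that direction.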
This model considers the KKS 2-form (Kostant-Kirillov-Souriau) defined on the coadjoint orbits of the Lie group in the non-null cohomology case, with the introduction of a symplectic cocycle, called "Souriau's cocycle", characterizing the non-equivariance of the coadjoint action (the action of the Lie group on the moment map). We will introduce the link between Souriau's "Lie groups thermodynamics", information geometry and Kirillov representation theory to define probability densities as Souriau covariant Gibbs densities (maximum entropy densities). We will illustrate this for the matrix Lie group SU(1,1) (the null cohomology case) and for the matrix Lie group SE(3) (the non-null cohomology case), through the computation of Souriau's moment map and Kirillov's orbit method.
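For orientation, the covariant Gibbs density mentioned above can be sketched in the notation commonly used in Barbaresco's expositions of Souriau's model (our choice of symbols, not taken from the talk itself): with moment map $J$ and geometric (Planck) temperature $\beta$ in the Lie algebra,

```latex
p_\beta(\xi) = e^{\Phi(\beta) - \langle \beta,\, J(\xi) \rangle},
\qquad
\Phi(\beta) = -\log \int_M e^{-\langle \beta,\, J(\xi) \rangle}\, d\lambda(\xi),
```

with heat $Q = \partial\Phi/\partial\beta$, entropy $S(Q) = \langle \beta, Q\rangle - \Phi(\beta)$ obtained as the Legendre transform of the Massieu potential $\Phi$, and the Souriau-Fisher metric $I(\beta) = -\partial^2\Phi/\partial\beta^2$.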
This talk is part of the Signal Processing and Communications Lab Seminars series.