BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Gradient flows and randomised thresholding: sparse inversion and c
 lassification - Jonas Latz (Heriot-Watt University)
DTSTART:20230616T100000Z
DTEND:20230616T110000Z
UID:TALK202288@talks.cam.ac.uk
DESCRIPTION:Sparse inversion and classification problems are ubiquitous in
  modern data science and imaging. They are often formulated as non-smooth
  minimisation problems. In sparse inversion\, we minimise\, e.g.\, the sum
  of a data fidelity term and an L1/LASSO regulariser. In classification\,
  we consider\, e.g.\, the sum of a data fidelity term and a non-smooth
  Ginzburg–Landau energy. Standard (sub)gradient descent methods have been
  shown to be inefficient when approaching such problems. Splitting
  techniques are much more useful: here\, the target function is
  partitioned into a sum of two subtarget functions\, each of which can be
  efficiently optimised. Splitting proceeds by performing optimisation
  steps alternately with respect to each of the two subtarget functions.
  In this work\, we study splitting from a stochastic continuous-time
  perspective. Indeed\, we define a differential inclusion that follows the
  negative subdifferential of one of the two subtarget functions at each
  point in time. The choice of the subtarget function is controlled by a
  binary continuous-time Markov process. The resulting dynamical system is
  a stochastic approximation of the underlying subgradient flow. We
  investigate this stochastic approximation for an L1-regularised sparse
  inversion flow and for a discrete Allen–Cahn equation minimising a
  Ginzburg–Landau energy. In both cases\, we study the long-time behaviour
  of the stochastic dynamical system and its ability to approximate the
  underlying subgradient flow to any accuracy. We illustrate our
  theoretical findings in a simple sparse estimation problem and also in
  low- and high-dimensional classification problems.
LOCATION:Seminar Room 2\, Newton Institute
END:VEVENT
END:VCALENDAR
