BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Quantifying Sensitivity to Model Uncertainty - Jan Obłój (Univer
 sity of Oxford)
DTSTART:20251113T094000Z
DTEND:20251113T102000Z
UID:TALK238555@talks.cam.ac.uk
DESCRIPTION:Stochastic optimization is ubiquitous across applied sciences.
  It can refer to a portfolio allocation problem\, an optimised certainty e
 quivalent or a risk measure computation\, a standard regression or a deep 
 learning problem\, an image recognition task or a stochastic chemical kine
 tics model. At the heart of the optimisation is a probability measure\, or
  a model\, which describes the system. It could come from data\, simulatio
 n or a modelling effort but there is always a degree of uncertainty about 
 it. Wasserstein Distributionally Robust Optimization acknowledges this unc
 ertainty by considering a nonparametric Wasserstein ball around the postul
 ated model. However\, such problems\, whilst conceptually appealing\, are
  often very hard to solve. In this talk I will discuss a series of works in wh
 ich we develop sensitivity analysis with respect to the degree of model un
 certainty. This offers a non-parametric sensitivity (or a &ldquo\;Greek&rd
 quo\;) to model uncertainty. I will highlight applications from decision t
 heory\, through mathematical finance to machine learning and statistics. I
  will start with simple one-step models which use classical Wasserstein di
 stances and provide explicit formulae for the first order correction to bo
 th the value function and the optimizer\, and further extend the results t
 o optimization under linear constraints. I will then cover dynamic setting
 s in which model neighbourhoods are considered in causal Wasserstein sense
 . I will discuss both discrete and continuous time results. Talk based on j
 oint works with Daniel Bartl\, Samuel Drapeau\, Yifan Jiang and Johannes W
 iesel
LOCATION:Seminar Room 1\, Newton Institute
END:VEVENT
END:VCALENDAR
