Random feature neural networks learn Black-Scholes type PDEs without curse of dimensionality
- 👤 Speaker: Lukas Gonon (Ludwig-Maximilians-Universität München)
- 📅 Date & Time: Thursday 18 November 2021, 15:00 - 15:30
- 📍 Venue: Seminar Room 1, Newton Institute
Abstract
In this talk we consider a supervised learning problem in which the unknown target function is the solution to a Black-Scholes PDE or, more generally, an exponential-Lévy partial (integro-)differential equation. We analyze the learning performance of random feature neural networks in this context. Random feature neural networks are single-hidden-layer feedforward neural networks in which only the output weights are trainable. This makes training particularly simple, but (a priori) reduces expressivity. Interestingly, this is not the case for Black-Scholes type PDEs, as we show here. We derive bounds on the prediction error of random neural networks for learning sufficiently non-degenerate Black-Scholes type models. A full error analysis addressing all error components is provided, and the derived bounds are shown not to suffer from the curse of dimensionality. We apply these results to option pricing and validate the bounds numerically.
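To make the setup concrete, the following is a minimal sketch (not the speaker's implementation; the target parameters, feature count, and sampling distributions are illustrative assumptions) of a random feature neural network: the hidden-layer weights are sampled once and frozen, and only the output weights are fitted, here by regularized least squares, to a one-dimensional Black-Scholes call price.

```python
import numpy as np
from math import erf, exp, log, sqrt

def bs_call(S, K=1.0, T=1.0, r=0.0, sigma=0.2):
    """Black-Scholes call price used as the target function (parameters assumed)."""
    Phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))  # standard normal CDF
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * Phi(d1) - K * exp(-r * T) * Phi(d2)

rng = np.random.default_rng(0)
N = 500                                            # number of random features
S_train = rng.uniform(0.5, 1.5, size=(1000, 1))    # spot prices
y_train = np.array([bs_call(s) for s in S_train.ravel()])

# Hidden-layer weights and biases: sampled once, never trained
W = rng.standard_normal((1, N))
b = rng.uniform(-2.0, 2.0, size=N)
features = np.maximum(S_train @ W + b, 0.0)        # ReLU random features

# Only the output weights are trained: a ridge (regularized least-squares) fit
lam = 1e-8
beta = np.linalg.solve(features.T @ features + lam * np.eye(N),
                       features.T @ y_train)

mse = np.mean((features @ beta - y_train) ** 2)
```

Because the hidden layer is frozen, training reduces to a linear least-squares problem in the output weights `beta`, which is what makes this class of networks so cheap to train; the talk's point is that, despite this restriction, the resulting approximation of Black-Scholes type solutions does not degrade exponentially with dimension.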
Series: This talk is part of the Isaac Newton Institute Seminar Series.
Included in Lists
- All CMS events
- bld31
- dh539
- Featured lists
- INI info aggregator
- Isaac Newton Institute Seminar Series
- School of Physical Sciences
- Seminar Room 1, Newton Institute