# Random sections of ellipsoids and the power of random information

ASCW01 - Challenges in optimal recovery and hyperbolic cross approximation

We study the circumradius of the intersection of an $m$-dimensional ellipsoid $\mathcal E$ with half axes $\sigma_1\geq\dots\geq \sigma_m$ with random subspaces of codimension $n$. We find that, under certain assumptions on $\sigma$, this random radius $\mathcal{R}_n=\mathcal{R}_n(\sigma)$ is of the same order as the minimal such radius $\sigma_{n+1}$ with high probability. In other situations $\mathcal{R}_n$ is close to the maximum $\sigma_1$. The random variable $\mathcal{R}_n$ naturally corresponds to the worst-case error of the best algorithm based on random information for $L_2$-approximation of functions from a compactly embedded Hilbert space $H$ with unit ball $\mathcal E$.

In particular, $\sigma_k$ is the $k$th largest singular value of the embedding $H\hookrightarrow L_2$. In this formulation, one can also consider the case $m=\infty$, and we prove that random information behaves very differently depending on whether $\sigma \in \ell_2$ or not. For $\sigma \notin \ell_2$ random information is completely useless. For $\sigma \in \ell_2$ the expected radius of random information tends to zero at least at rate $o(1/\sqrt{n})$ as $n\to\infty$.
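The finite-dimensional setting above can be explored numerically. The sketch below (an illustration, not the authors' construction) draws a random subspace of codimension $n$ as the kernel of an $n\times m$ Gaussian matrix and computes the circumradius of its intersection with the ellipsoid: writing the subspace as the column span of an orthonormal $U$, the squared circumradius equals $1/\lambda_{\min}(U^\top D^{-2}U)$ with $D=\mathrm{diag}(\sigma)$. The decay exponents and sample sizes are arbitrary choices for illustration.

```python
import numpy as np

def random_radius(sigma, n, rng):
    """Circumradius of the ellipsoid with half axes `sigma` intersected
    with the kernel of a random n x m Gaussian matrix."""
    m = len(sigma)
    G = rng.standard_normal((n, m))
    # Orthonormal basis U of the codimension-n kernel of G (last rows of V^T).
    _, _, Vt = np.linalg.svd(G, full_matrices=True)
    U = Vt[n:].T                      # m x (m - n), orthonormal columns
    # For x = U y we have ||x|| = ||y||, and the ellipsoid constraint
    # x^T D^{-2} x <= 1 becomes y^T M y <= 1 with M = U^T D^{-2} U.
    # Maximizing ||y|| under that constraint gives 1 / sqrt(lambda_min(M)).
    M = U.T @ np.diag(sigma ** -2.0) @ U
    lam_min = np.linalg.eigvalsh(M)[0]
    return 1.0 / np.sqrt(lam_min)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n = 200, 20
    for decay in (-1.0, -0.5):        # sigma_k = k^decay; -1.0 is square-summable
        sigma = np.arange(1, m + 1, dtype=float) ** decay
        R = np.mean([random_radius(sigma, n, rng) for _ in range(20)])
        print(f"decay {decay}: R_n ~ {R:.3f}, "
              f"sigma_(n+1) = {sigma[n]:.3f}, sigma_1 = {sigma[0]:.3f}")
```

Since $\sigma_{n+1}$ is the minimal circumradius over all subspaces of codimension $n$, every draw satisfies $\sigma_{n+1}\le \mathcal{R}_n\le\sigma_1$; comparing the printed averages for the two decay rates gives a finite-dimensional impression of the dichotomy described above.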

In the proofs we use a comparison result for Gaussian processes à la Gordon, exponential estimates for sums of chi-squared random variables, and estimates for the extreme singular values of (structured) Gaussian random matrices.

This is joint work with David Krieg, Erich Novak, Joscha Prochno and Mario Ullrich.

This talk is part of the Isaac Newton Institute Seminar Series.