Quantitative Gaussian approximations and higher-order corrections for wide neural networks

Category
Other (unlisted category)
Unlisted category
DocTorV seminars
Event start date and time
Event end date and time
Venue

Dipartimento di Matematica, Università di Roma Tor Vergata

External room
Aula D'Antoni
Speaker
Lucia Celli
Abstract
This talk presents quantitative central limit theorems for fully connected neural networks with both Gaussian and non-Gaussian initializations, in the regime where the network width tends to infinity and the network is evaluated on a finite collection of inputs. I will discuss explicit convergence rates in several probabilistic metrics (including the total variation, Wasserstein, and convex distances), as well as extensions to regimes where depth and width grow simultaneously. In the final part, I will present multidimensional Edgeworth expansions of arbitrary order for the outputs of networks with Gaussian initialization. These expansions approximate the law of finite-width networks while explicitly capturing their non-Gaussian deviations through higher-order cumulants, yielding sharp upper bounds and matching lower bounds in total variation distance. I will also discuss applications to Bayesian neural networks, including quantitative bounds on the error introduced when replacing the true prior or posterior distributions with their Gaussian or Edgeworth approximations.
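As a rough numerical illustration of the Gaussian limit described in the abstract (a minimal sketch, not the speaker's construction): the snippet below samples the scalar output of a one-hidden-layer fully connected network with Gaussian initialization at a single fixed input and measures how close its empirical law is to a Gaussian as the width grows. The ReLU activation, the 1/sqrt(width) output scaling, the choice of input, and the use of NumPy/SciPy are assumptions made purely for this illustration.

# Illustrative sketch (assumed setup, not the speaker's code): empirical check that
# the output of a one-hidden-layer fully connected network with Gaussian initialization
# looks increasingly Gaussian as the width grows.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = np.array([0.3, -1.2, 0.7])   # a single fixed input, dimension d = 3
d = x.size

def network_output(width, n_samples):
    """Sample n_samples i.i.d. realizations of a width-`width` network evaluated at x."""
    # Hidden layer weights: W1 ~ N(0, 1/d), one independent network per sample
    W1 = rng.normal(0.0, 1.0 / np.sqrt(d), size=(n_samples, width, d))
    h = np.maximum(W1 @ x, 0.0)                      # ReLU hidden activations, shape (n_samples, width)
    # Output layer weights: W2 ~ N(0, 1/width)
    W2 = rng.normal(0.0, 1.0 / np.sqrt(width), size=(n_samples, width))
    return np.sum(W2 * h, axis=1)                    # one scalar output per sampled network

for width in (4, 16, 64, 256):
    y = network_output(width, n_samples=10_000)
    # Kolmogorov-Smirnov distance between the standardized outputs and a standard Gaussian
    ks = stats.kstest((y - y.mean()) / y.std(), "norm").statistic
    print(f"width={width:4d}  excess kurtosis={stats.kurtosis(y):+.3f}  KS distance={ks:.4f}")

In this toy setup the excess kurtosis and the Kolmogorov-Smirnov distance should both shrink as the width increases; the results discussed in the talk make such convergence rates, and their higher-order Edgeworth corrections, precise.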
Contacts/Organizers
doctorv.uniroma2@gmail.com