Calculus of Variations and Geometric Measure Theory

A. Basteri - D. Trevisan

Quantitative Gaussian Approximation of Randomly Initialized Deep Neural Networks

created by trevisan on 05 Jul 2024


preprint

Inserted: 5 Jul 2024

Year: 2022

ArXiv: 2203.07379

Abstract:

Given any deep fully connected neural network initialized with random Gaussian parameters, we bound from above the quadratic Wasserstein distance between its output distribution and a suitable Gaussian process. Our explicit inequalities indicate how the hidden and output layer sizes affect the Gaussian behaviour of the network, and they quantitatively recover the known distributional convergence results in the wide limit, i.e., when all the hidden layer sizes become large.
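A minimal numerical sketch of the phenomenon the abstract describes (not code from the paper; the width, depth, ReLU activation, and initialization scale below are our own illustrative assumptions): for a two-layer ReLU network with Gaussian weights, the output over random initializations is approximately a centered Gaussian whose variance is predicted by the infinite-width limit.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_relu_net_output(x, width, rng):
    """Scalar output of a 2-layer ReLU net with Gaussian init (hypothetical setup).

    Hidden weights ~ N(0, 1/d), output weights ~ N(0, 1), output scaled
    by 1/sqrt(width) so the variance stays O(1) as the width grows.
    """
    d = x.shape[0]
    W1 = rng.normal(0.0, 1.0 / np.sqrt(d), size=(width, d))  # hidden layer
    W2 = rng.normal(0.0, 1.0, size=width)                    # output layer
    h = np.maximum(W1 @ x, 0.0)                              # ReLU activations
    return W2 @ h / np.sqrt(width)

d, width, n_samples = 3, 512, 20000
x = np.ones(d)  # ||x||^2 = d, so pre-activations have unit variance

# Resample the parameters many times and record the output each time.
samples = np.array([random_relu_net_output(x, width, rng)
                    for _ in range(n_samples)])

# The wide limit predicts N(0, E[ReLU(z)^2]) = N(0, 1/2) for z ~ N(0, 1);
# the empirical mean and variance should be close to 0 and 0.5.
print("mean:", samples.mean(), "variance:", samples.var())
```

At finite width the output law is only close to Gaussian; the paper's contribution is to quantify that closeness in the quadratic Wasserstein distance, uniformly over the architecture's layer sizes.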