Calculus of Variations and Geometric Measure Theory

L. Chizat - M. Colombo - X. Fernández-Real - A. Figalli

Infinite-width limit of deep linear neural networks

created by figalli on 08 Aug 2024
modified on 19 Aug 2024


Accepted Paper


Journal: Comm. Pure Appl. Math.
Year: 2024

Abstract:

This paper studies the infinite-width limit of deep linear neural networks initialized with random parameters. We show that, as the number of neurons diverges, the training dynamics converge (in a precise sense) to the dynamics of gradient descent on an infinitely wide deterministic linear neural network. Moreover, even though the weights remain random along the training dynamics, we characterize their law precisely and prove a quantitative convergence result for the linear predictor in terms of the number of neurons.

Finally, we study the continuous-time limit obtained for infinitely wide linear networks and show that the linear predictors of the neural network converge at an exponential rate to the minimizer of the risk with minimal \( \ell^2 \)-norm.
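The limit point named in the abstract, the risk minimizer of minimal \( \ell^2 \)-norm, also arises in a much simpler classical setting: plain gradient descent on an underdetermined least-squares problem, started from zero, stays in the row space of the data matrix and converges at an exponential (linear) rate to that same minimal-norm minimizer. The sketch below illustrates only this classical one-layer analogue, not the paper's deep-network construction; the problem sizes and step size are arbitrary choices for the demonstration.

```python
import numpy as np

# Illustrative sketch (one-layer analogue, not the paper's setting):
# on an underdetermined least-squares problem, gradient descent from
# zero init converges to the minimal l2-norm minimizer of the risk.
rng = np.random.default_rng(0)
n, d = 3, 10                        # fewer samples than features: many exact minimizers
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

beta = np.zeros(d)                  # zero init keeps iterates in the row space of X
lr = 0.01
for _ in range(50_000):
    grad = X.T @ (X @ beta - y) / n # gradient of the empirical risk 0.5/n * ||X beta - y||^2
    beta -= lr * grad

beta_star = np.linalg.pinv(X) @ y   # minimal l2-norm interpolant via the pseudoinverse
print(np.allclose(beta, beta_star, atol=1e-6))
```

On the row space of \( X \) the error contracts by a fixed factor per step, which is the exponential rate of convergence echoed, in the infinite-width deep linear setting, by the abstract's final result.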
