
Séminaire Images Optimisation et Probabilités

Free Probability, Newton lilypads and hyperbolicity of Jacobians as a solution to the problem of tuning the architecture of neural networks

Reda Chhaibi

Conference Room (Salle de Conférences)

March 10, 2022, at 11:00

Gradient descent during the learning process of a neural network can be subject to many instabilities. The spectral density of the Jacobian is a key component for analyzing robustness. Following the works of Pennington et al., such Jacobians are modeled using free multiplicative convolutions from Free Probability Theory (FPT). We present a reliable and very fast method for computing the associated spectral densities, with controlled and proven convergence. Our technique is based on a homotopy method: an adaptive Newton-Raphson scheme that chains basins of attraction. We find contiguous lilypad-like basins and step from one to the next, heading towards the objective. To demonstrate the applicability of our method, we show that the relevant FPT metrics computed before training are highly correlated (up to 85%) with final test losses. We also give evidence that a very desirable feature for neural networks is the hyperbolicity of their Jacobian at initialization.
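The homotopy idea in the abstract can be illustrated with a minimal sketch. This is not the authors' actual scheme (which targets the subordination equations of free multiplicative convolution); it only shows the generic "lilypad" pattern on a scalar polynomial: deform an easy problem `g` into the target `f` along a parameter `t`, run Newton-Raphson at each step from the previous root, and adaptively shrink the step in `t` whenever Newton leaves the basin of attraction. All names (`newton`, `homotopy_newton`, `n_min`) are hypothetical.

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Plain Newton-Raphson; returns (approximate root, converged flag)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x, True
        d = df(x)
        if d == 0.0:
            return x, False
        x = x - fx / d
    return x, False

def homotopy_newton(f, df, g, dg, x_start, n_min=4):
    """Continuation from an easy map g (root known near x_start) to the
    target f via H_t = (1 - t) g + t f, chaining Newton basins of
    attraction as t goes from 0 to 1.  The t-step is adaptive: it is
    halved when Newton fails to converge, i.e. when the hop between
    consecutive 'lilypads' was too large."""
    x = x_start
    t, dt = 0.0, 1.0 / n_min
    while t < 1.0:
        t_next = min(1.0, t + dt)
        H = lambda y: (1.0 - t_next) * g(y) + t_next * f(y)
        dH = lambda y: (1.0 - t_next) * dg(y) + t_next * df(y)
        x_new, ok = newton(H, dH, x)
        if ok:
            x, t = x_new, t_next       # landed inside this basin
            dt = min(2.0 * dt, 0.25)   # cautiously enlarge the step
        else:
            dt /= 2.0                  # retreat: smaller hop between basins
            if dt < 1e-10:
                raise RuntimeError("continuation stalled")
    return x

# f(x) = x^3 - 2x + 2 is the classic example where plain Newton from
# x0 = 0 cycles between 0 and 1; continuation from g(x) = x + 2
# (root -2) reaches the real root near -1.7693 instead.
f = lambda x: x**3 - 2.0 * x + 2.0
df = lambda x: 3.0 * x**2 - 2.0
g = lambda x: x + 2.0
dg = lambda x: 1.0
root = homotopy_newton(f, df, g, dg, -2.0)  # root ≈ -1.7693
```

The same skeleton applies when the unknown is a Stieltjes transform evaluated at a point near the real axis rather than a scalar root; only the map `H` and its derivative change.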