
Séminaire Images Optimisation et Probabilités

(Maths-IA) Rescaling Symmetries in Neural Networks: a Path-lifting Perspective

Rémi Gribonval

(INRIA)

Conference room

January 30, 2025, at 10:15 am

Parameterizations of ReLU neural networks are well known to satisfy rescaling symmetries, which arise from the homogeneity of the ReLU activation function. Ignoring such symmetries can lead to inefficient algorithms, non-informative theoretical bounds, and irrelevant interpretations of parameters. Can such symmetries be harnessed, theoretically and practically? This is the goal of the path-lifting, a rescaling-invariant polynomial representation of the parameters of ReLU networks and their modern variants with max-pooling and skip connections.
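To make the rescaling symmetry concrete, here is a minimal numerical sketch (not from the announcement; the network and variable names are illustrative): since relu(λx) = λ·relu(x) for λ > 0, multiplying a hidden neuron's incoming weights by λ and dividing its outgoing weights by λ leaves the realized function unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# A toy bias-free two-layer ReLU network: 3 inputs, 5 hidden units, 2 outputs.
W1 = rng.standard_normal((5, 3))   # input -> hidden
W2 = rng.standard_normal((2, 5))   # hidden -> output

def net(W1, W2, x):
    return W2 @ relu(W1 @ x)

x = rng.standard_normal(3)
lam = rng.uniform(0.5, 2.0, size=5)  # one positive rescaling factor per hidden neuron

W1_r = lam[:, None] * W1   # scale each neuron's incoming weights by lam
W2_r = W2 / lam[None, :]   # inverse-scale its outgoing weights

# The two (distinct) parameterizations realize the same function.
print(np.allclose(net(W1, W2, x), net(W1_r, W2_r, x)))  # True
```

Any algorithm or bound that depends on the raw weights, rather than on such invariant quantities, will treat these equivalent parameterizations differently.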

Despite its combinatorial dimension, the path-lifting yields easily computable quantities that reveal useful properties of the corresponding functions, from Lipschitz regularity to convexity or statistical generalization bounds. Besides introducing the general concept of path-lifting through basic examples and highlighting its key mathematical and computational properties, the talk will quickly tour some of its applications, such as network pruning with guarantees.
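The "easily computable despite combinatorial dimension" point can be illustrated on the simplest invariant quantity, the ℓ1 path-norm. The sketch below (illustrative, not taken from the talk) checks, for a bias-free two-layer ReLU network, that the sum over all input-to-output paths of the absolute product of weights along each path collapses to a single product of absolute-valued weight matrices, i.e. one cheap forward pass instead of a path enumeration.

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.standard_normal((4, 3))   # input -> hidden
W2 = rng.standard_normal((2, 4))   # hidden -> output

# Brute force: enumerate every path i -> j -> k and sum |W2[k,j] * W1[j,i]|.
brute = sum(abs(W2[k, j] * W1[j, i])
            for i in range(3) for j in range(4) for k in range(2))

# One pass: feed the all-ones vector through the network with
# absolute-valued weights (nonnegative activations, so ReLU is identity).
fast = np.ones(2) @ np.abs(W2) @ np.abs(W1) @ np.ones(3)

print(np.isclose(brute, fast))  # True
```

The brute-force sum grows combinatorially with depth and width, while the matrix-product formula stays linear in the cost of a forward pass; this is the kind of computational shortcut the path-lifting viewpoint makes systematic.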


Primarily based on joint work with A. Gonon, N. Brisebarre, E. Riccietti (https://hal.science/hal-04225201v5, https://hal.science/hal-04584311v3)

and with A. Gagneux, M. Massias, E. Soubies (https://hal.science/hal-04877619v1)