
Séminaire Images Optimisation et Probabilités

(Maths-IA) Gradient Correlation allows for faster optimization

Julien Hermant

(IMB)

Conference room

December 5, 2024, at 11:15 a.m.

Many problems, especially in machine learning, can be formulated as optimization problems. Optimization algorithms such as stochastic gradient descent or ADAM have become a cornerstone for solving them. However, in many practical cases, theoretical proofs of their efficiency are lacking. In particular, it has been empirically observed that adding a momentum mechanism to stochastic gradient descent often solves these optimization problems more efficiently. In this talk, we introduce a condition, linked to a measure of gradient correlation, that theoretically characterizes when this acceleration can be observed.
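To fix ideas, the sketch below shows stochastic gradient descent with a heavy-ball momentum term on a toy least-squares problem. The specific gradient correlation condition studied in the talk is not reproduced here; as an informal proxy only, the code tracks the cosine similarity between successive mini-batch gradients. All names and parameter values are illustrative choices, not taken from the talk.

```python
import numpy as np

# Toy interpolation-style least-squares problem: b = A @ x_true, so the
# minimum of the loss 0.5 * ||A x - b||^2 is zero.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
x_true = rng.standard_normal(5)
b = A @ x_true

def stochastic_grad(x, batch):
    # Gradient of 0.5 * ||A x - b||^2 restricted to a mini-batch of rows.
    Ab = A[batch]
    return Ab.T @ (Ab @ x - b[batch])

x = np.zeros(5)
velocity = np.zeros(5)
lr, beta = 0.01, 0.9          # step size and momentum coefficient (illustrative)
prev_grad = None
correlations = []

for step in range(200):
    batch = rng.choice(100, size=10, replace=False)
    g = stochastic_grad(x, batch)
    velocity = beta * velocity + g    # heavy-ball momentum accumulation
    x = x - lr * velocity
    if prev_grad is not None:
        # Cosine similarity between successive stochastic gradients:
        # an informal proxy for "gradient correlation".
        correlations.append(
            g @ prev_grad / (np.linalg.norm(g) * np.linalg.norm(prev_grad))
        )
    prev_grad = g

loss = 0.5 * np.linalg.norm(A @ x - b) ** 2
print(f"final loss: {loss:.2e}, mean gradient correlation: {np.mean(correlations):.3f}")
```

Setting `beta = 0` recovers plain stochastic gradient descent, which makes it easy to compare the two runs on the same problem.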