
Séminaire Images Optimisation et Probabilités

On the SAGA algorithm with decreasing step

Thierry Emeric Gbaguidi

(IMB, Université de Bordeaux)

Conference room

21 November 2024 at 11:15

Stochastic optimization problems arise naturally in many application areas, including machine learning. Our goal is to go further in the analysis of the Stochastic Average Gradient Accelerated (SAGA) algorithm. To achieve this, we introduce a new λ-SAGA algorithm which interpolates between Stochastic Gradient Descent (λ=0) and the SAGA algorithm (λ=1). Firstly, we investigate the almost sure convergence of this new algorithm with decreasing step sizes, which allows us to avoid the restrictive strong convexity and Lipschitz gradient hypotheses on the objective function. Secondly, we establish a central limit theorem for the λ-SAGA algorithm. Finally, we provide non-asymptotic L^p rates of convergence.
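
To fix ideas, here is a minimal Python sketch of one plausible form of the λ-interpolated update with decreasing steps γ_k = c/(k+1)^a; the abstract does not spell out the exact iteration, so the interpolation rule, the step-size schedule, and the helper `grad_i` below are assumptions for illustration, not the speaker's definitive algorithm.

import numpy as np

def lambda_saga(grad_i, x0, n, lam=1.0, n_steps=10_000, c=0.1, a=0.75, rng=None):
    """Sketch of a lambda-SAGA run with decreasing step sizes.

    grad_i(x, i): gradient of the i-th component function at x (user-supplied).
    lam = 0 recovers plain SGD; lam = 1 recovers the usual SAGA direction.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    table = np.array([grad_i(x, i) for i in range(n)])  # stored past gradients (alpha_i)
    avg = table.mean(axis=0)                            # running average of the table
    for k in range(n_steps):
        gamma = c / (k + 1) ** a                        # decreasing step size (assumed schedule)
        i = rng.integers(n)
        g = grad_i(x, i)
        # lambda-interpolated direction: SGD gradient plus lam * SAGA variance-reduction correction
        direction = g - lam * table[i] + lam * avg
        x -= gamma * direction
        avg += (g - table[i]) / n                       # update the average before overwriting
        table[i] = g
    return x

With lam=0 the correction vanishes and the loop is ordinary SGD; with lam=1 it is the standard SAGA direction, matching the interpolation described in the abstract.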