
Séminaire Images Optimisation et Probabilités

Practical acceleration for some optimization methods using relaxation and inertia

Franck Iutzeler

(Université de Grenoble)

Salle de Conférences (Conference Room)

23 June 2016, 11:00

Optimization algorithms can often be seen as fixed-point iterations of some operator. To accelerate such iterations, two simple methods can be used: i) relaxation (a simple combination of the current and previous iterates) and ii) inertia (a slightly more involved modification made popular by Nesterov's acceleration). These methods have been celebrated for accelerating linearly and sub-linearly converging algorithms such as gradient methods, the proximal gradient method (FISTA), and ADMM (Fast ADMM). In this presentation, we build upon generic contraction properties and affine approximations to propose generic auto-tuned acceleration methods, and we illustrate their relative merits on various optimization algorithms.
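To make the two schemes concrete, here is a minimal sketch of relaxation and inertia applied to a gradient-step fixed-point operator on a toy quadratic. The quadratic, the step size, and the relaxation/inertia parameters are illustrative assumptions chosen for this example, not values taken from the talk.

```python
import numpy as np

# Toy strongly convex quadratic f(x) = 1/2 x^T A x - b^T x (assumed example).
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)   # minimizer, i.e. the fixed point of T
L, mu = 10.0, 1.0                # largest / smallest eigenvalues of A
gamma = 1.0 / L                  # step size for which T is a contraction

def T(x):
    """Fixed-point operator: one gradient step, so T(x_star) = x_star."""
    return x - gamma * (A @ x - b)

def plain(n):
    x = np.zeros(2)
    for _ in range(n):
        x = T(x)                 # vanilla fixed-point iteration
    return x

def relaxed(n, alpha=1.8):
    # Relaxation: combine the current iterate with T(x); alpha > 1 over-relaxes.
    x = np.zeros(2)
    for _ in range(n):
        x = (1 - alpha) * x + alpha * T(x)
    return x

def inertial(n, beta=(np.sqrt(L / mu) - 1) / (np.sqrt(L / mu) + 1)):
    # Inertia (Nesterov-style): apply T at a point extrapolated along the
    # direction of the last displacement.
    x_prev = x = np.zeros(2)
    for _ in range(n):
        y = x + beta * (x - x_prev)
        x_prev, x = x, T(y)
    return x

for name, xs in [("plain", plain(50)),
                 ("relaxed", relaxed(50)),
                 ("inertial", inertial(50))]:
    print(f"{name:8s} error: {np.linalg.norm(xs - x_star):.2e}")
```

On this example both accelerated variants reach a markedly smaller error than the plain iteration after the same number of steps; the auto-tuned methods of the talk aim to pick parameters like `alpha` and `beta` without knowing `L` and `mu`.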