
Séminaire de Calcul Scientifique et Modélisation

Swarm-Based Gradient Descent Method for Non-Convex Optimization

Eitan Tadmor

(Fondation Sciences Mathématiques de Paris, LJLL, Sorbonne University, and University of Maryland, College Park)

Room 1

May 13, 2024, at 15:30

We discuss a new swarm-based gradient descent (SBGD) method for non-convex optimization. The swarm consists of agents, each identified with a position x and a mass m. There are three key aspects to the SBGD dynamics: (i) persistent transition of mass from agents at higher ground to those at lower ground; (ii) a random marching direction, aligned with the steepest gradient descent; and (iii) a time-stepping protocol whose step size decreases with m.

The interplay between positions and masses leads to a dynamic distinction between 'heavier leaders' near local minima and 'lighter explorers', which search for improved positions with large(r) time steps. Convergence analysis and numerical simulations demonstrate the effectiveness of the SBGD method as a global optimizer.
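The three ingredients of the abstract can be sketched in a few lines of NumPy. This is only an illustrative toy on a 1-D non-convex objective: the mass-transfer rate q, the particular step-size rule h = eta/(1 + m/mean(m)), and the use of a plain (non-randomized) steepest-descent direction are simplifying assumptions, not the authors' exact protocol.

```python
import numpy as np

def F(x):
    """Test objective with several local minima (illustrative choice)."""
    return x**2 + 2.0 * np.sin(3.0 * x)

def dF(x):
    """Gradient of F."""
    return 2.0 * x + 6.0 * np.cos(3.0 * x)

def sbgd(n_agents=20, n_steps=300, eta=0.2, q=0.1, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-4.0, 4.0, n_agents)    # agent positions
    m = np.full(n_agents, 1.0 / n_agents)   # agent masses (sum to 1)
    for _ in range(n_steps):
        best = np.argmin(F(x))
        # (i) agents on higher ground cede a fraction q of their mass
        #     to the current best (lowest) agent
        lost = q * m
        lost[best] = 0.0
        m = m - lost
        m[best] += lost.sum()
        # (ii)+(iii) march along -grad F with a step size that shrinks
        #     as mass grows: heavy 'leaders' refine their position,
        #     light 'explorers' roam with large(r) steps
        h = eta / (1.0 + m / m.mean())
        x = x - h * dF(x)
    return x[np.argmin(F(x))]

x_star = sbgd()
```

Note how the mass transfer alone produces the leader/explorer split: the current best agent accumulates mass and hence takes ever smaller steps, while the agents that lose mass keep step sizes near eta and continue to explore.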