
Images Optimisation et Probabilités Seminar

On the differential properties of WGAN-like problems

Arthur Leclaire

( IMB )

Conference Room

April 15, 2021, 11:00 a.m.

The problem of WGAN (Wasserstein Generative Adversarial Network) learning is an instance of optimization problems where one wishes to find, among a parametric class of distributions, the one that is closest to a target distribution in terms of an optimal transport (OT) distance. Applying a gradient-based algorithm to this problem requires expressing the gradient of the OT distance with respect to one of its arguments, which is related to the solutions of the dual problem (the Kantorovich potentials). The first part of this talk aims at finding conditions that ensure the existence of such a gradient. After discussing regularity issues that may appear with discrete target measures, we will show that these regularity problems are avoided when using entropy-regularized OT. In the second part, we will see how these gradients can be exploited in a stable way to address imaging problems where the discrete target measure is reasonably large. In particular, using OT distances between multi-scale patch distributions, this allows one to estimate a generative convolutional network that can synthesize an exemplar texture faithfully and very efficiently. This is joint work with Antoine Houdard, Nicolas Papadakis and Julien Rabin.
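To make the connection between entropy-regularized OT and the dual potentials concrete, here is a minimal sketch of the Sinkhorn algorithm between two discrete measures, written in plain NumPy. All names (`sinkhorn`, `a`, `b`, `C`, `eps`) are illustrative assumptions, not code from the talk; the key point it illustrates is that the iterations return the dual (Kantorovich) potentials, which in the entropic setting exist and vary smoothly, and which are the objects entering the gradient of the OT cost.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iters=500):
    """Entropy-regularized OT between discrete measures a and b
    with cost matrix C, via Sinkhorn iterations (non-log-domain,
    fine for moderate eps)."""
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iters):
        u = a / (K @ v)               # enforce row marginal a
        v = b / (K.T @ u)             # enforce column marginal b
    P = u[:, None] * K * v[None, :]   # optimal entropic coupling
    f = eps * np.log(u)               # dual potential for measure a
    g = eps * np.log(v)               # dual potential for measure b
    cost = np.sum(P * C)              # transport cost of the coupling
    return cost, P, f, g

# Toy usage: uniform measures on 4 and 5 points with a random cost.
rng = np.random.default_rng(0)
a = np.full(4, 1 / 4)
b = np.full(5, 1 / 5)
C = rng.random((4, 5))
cost, P, f, g = sinkhorn(a, b, C, eps=0.5)
```

In this regularized setting the potential `f` plays the role of a gradient of the entropic OT cost with respect to the weights `a` (up to an additive constant), which is a standard way to obtain the stable gradients the abstract refers to.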