Séminaire Images Optimisation et Probabilités
Some statistical insights into entropy regularized Wasserstein estimators, through weights estimation in a mixture model
Salle de Conférences
May 12, 2022, 11:00
In 2013, Marco Cuturi introduced an entropic regularized version of the Wasserstein distance. Due to its computational advantages, this regularized version of the Wasserstein distance is now a popular tool in statistics to compare probability distributions, or point clouds. In 2017, Arjovsky et al. proposed, with Wasserstein-GANs, to minimize the Wasserstein distance between a class of parameterized distributions and an empirical probability distribution; this is an example of a Wasserstein estimation method. In this talk, I will discuss the use of the regularized Wasserstein distance to perform Wasserstein estimation. Motivated by a biostatistical application, we propose to find, among mixture distributions parameterized by their weights, the one closest to an empirical probability distribution with respect to the regularized Wasserstein distance. Through this example of a Wasserstein estimator, I will discuss the influence of the regularization parameter on the statistical properties of Wasserstein estimators. This is joint work with Jérémie Bigot, Boris Hejblum and Arthur Leclaire.
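As a rough illustration of the two ingredients in the abstract (not the estimator studied in the talk), the sketch below computes the entropic regularized transport cost between two discrete distributions via Sinkhorn iterations, then fits a mixture weight by minimizing that cost over a grid on the simplex. The support points, the two components, the regularization value `eps`, and the true weight 0.7 are all invented for the example.

```python
import numpy as np

def sinkhorn_cost(a, b, C, eps, n_iter=500):
    """Transport cost <P, C> of the entropic-regularized OT plan between
    discrete distributions a and b with cost matrix C (Sinkhorn iterations)."""
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):           # alternate marginal rescalings
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]   # regularized transport plan
    return float(np.sum(P * C))

# common support on the line and squared-distance cost (illustrative choice)
x = np.linspace(0.0, 3.0, 7)
C = (x[:, None] - x[None, :]) ** 2

# two fixed mixture components (hypothetical, for illustration only)
p1 = np.exp(-(x - 0.5) ** 2); p1 /= p1.sum()
p2 = np.exp(-(x - 2.5) ** 2); p2 /= p2.sum()

# "empirical" target: a mixture with true weight 0.7 on p1
b = 0.7 * p1 + 0.3 * p2

# estimate the weight by grid search over the one-dimensional simplex
grid = np.linspace(0.0, 1.0, 101)
costs = [sinkhorn_cost(w * p1 + (1 - w) * p2, b, C, eps=0.05) for w in grid]
w_hat = grid[int(np.argmin(costs))]
```

Note that even when the candidate mixture equals the target, the regularized cost is strictly positive (the entropic plan smears mass off the diagonal), which is one way the regularization parameter can bias such estimators; the small value of `eps` here keeps that bias mild.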