Séminaire Images Optimisation et Probabilités
Stochastic Optimization and texture synthesis
Valentin de Bortoli
(CMLA, ENS Paris-Saclay)
Conference Room, 28 March 2019, 11:00
In this talk, I will present theoretical tools to establish the convergence of a gradient-based scheme, the SOUK algorithm (Stochastic Optimization with Unadjusted Kernel). We assume that the gradient of the function to optimize at a point x can be written as the expectation of some function with respect to a probability distribution which may itself depend on x. Such functionals naturally appear in the Empirical Bayes framework, which aims at computing the hyperparameters of a statistical model using only the observations. I will present the key elements of our proof of convergence, which borrows from the literature on stochastic optimization and on Markov chain theory on general state spaces. Using recent work on unadjusted Langevin-based dynamics, we are able to provide convergence results in the context of parametric exemplar-based texture synthesis. I will present visual results and discuss how SOUK compares to state-of-the-art algorithms for texture synthesis.
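The core idea (a gradient expressed as an expectation under a parameter-dependent distribution, estimated with an unadjusted Langevin kernel) can be illustrated on a toy problem. The sketch below is an assumption of mine, not the algorithm from the talk: it takes pi_theta = N(theta, 1) and minimizes f(theta) = E_{x ~ pi_theta}[x^2]/2, whose gradient E_{x ~ pi_theta}[x] is approximated by a single unadjusted Langevin (ULA) sample per iteration, interleaved with the stochastic gradient step.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_pi(x, theta):
    # Score of the (assumed) target pi_theta = N(theta, 1): d/dx log pi_theta(x).
    return -(x - theta)

def souk_sketch(theta0=3.0, n_iter=2000, step_theta=0.05, step_x=0.1):
    """Hypothetical SOUK-style loop on a toy problem (minimizer is theta = 0)."""
    theta, x = theta0, theta0
    tail = []
    for k in range(n_iter):
        # One unadjusted Langevin move targeting pi_theta: the sample is
        # biased (no Metropolis correction), which is exactly the setting
        # the convergence analysis has to handle.
        x = x + step_x * grad_log_pi(x, theta) \
              + np.sqrt(2.0 * step_x) * rng.standard_normal()
        # Stochastic gradient step using the single Markov-chain sample
        # as a (biased, noisy) estimate of the gradient E[x] = theta.
        theta = theta - step_theta * x
        # Average the second half of the iterates to smooth the noise.
        if k >= n_iter // 2:
            tail.append(theta)
    return float(np.mean(tail))

print(souk_sketch())  # typically close to the minimizer theta = 0
```

Note that the sampler never equilibrates before the parameter moves: each iteration does one kernel step and one gradient step, which is what distinguishes this family of schemes from methods that run the Markov chain to (approximate) stationarity at every parameter value.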