Organizers: Luis Fredes and Camille Male
Neural style transfer (NST) is a deep learning technique that produces an unprecedentedly rich style transfer from a style image to a content image. It is particularly impressive when it comes to transferring style from a painting to an image. NST was originally achieved by solving an optimization problem that matches the global statistics of the style image while preserving the local geometric features of the content image. The two main drawbacks of this original approach are that it is computationally expensive and that the resolution of the output images is limited by high GPU memory requirements. Many solutions have been proposed both to accelerate NST and to produce larger images. However, our investigation shows that these accelerated methods all compromise the quality of the produced images in the context of painting style transfer. Indeed, transferring the style of a painting is a complex task involving features at different scales, from the color palette and compositional style to the fine brushstrokes and the texture of the canvas. This paper provides a solution to the original global optimization for ultra-high-resolution (UHR) images, enabling multiscale NST at unprecedented image sizes. This is achieved by spatially localizing the computation of each forward and backward pass through the VGG network. Extensive qualitative and quantitative comparisons, as well as a user study, show that our method produces style transfers of unmatched quality for such high-resolution painting styles. Through a careful comparison, we show that state-of-the-art fast methods are still prone to artifacts, suggesting that fast painting style transfer remains an open problem.
Joint work with Lara Raad, José Lezama and Jean-Michel Morel.
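For reference, the global optimization mentioned in the abstract is, in the classical formulation of Gatys et al., a joint content/style objective over VGG features. The sketch below uses our own illustrative notation ($F^{\ell}$, layer sets $C$ and $S$, weights $\alpha, \beta$) for that standard formulation; it is not the paper's localized variant.

```latex
% Classical global NST objective (Gatys et al.); notation is illustrative.
% F^l(x): VGG feature map of image x at layer l; G^l: Gram matrix of those features.
\min_{x}\;
\alpha \sum_{\ell \in C} \big\| F^{\ell}(x) - F^{\ell}(x_{\mathrm{content}}) \big\|_2^2
\;+\;
\beta \sum_{\ell \in S} \big\| G^{\ell}(x) - G^{\ell}(x_{\mathrm{style}}) \big\|_F^2,
\qquad
G^{\ell}(x) = F^{\ell}(x)\, F^{\ell}(x)^{\top}.
```

The Gram matrices $G^{\ell}$ are the "global statistics" of the style image, while the content term preserves the feature maps themselves, i.e. the local geometry of the content image.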
In this talk, I will present the results of a collaboration with Benjamin McKenna on the injective norm of large random Gaussian tensors and uniform random quantum states and, time allowing, describe some of the context underlying this work. The injective norm is a natural generalization to tensors of the operator norm of a matrix and appears in multiple fields. In quantum information, the injective norm is one important measure of genuine multipartite entanglement of quantum states, known as geometric entanglement. In our recent preprint, we provide high-probability upper bounds in different regimes on the injective norm of real and complex Gaussian random tensors, which corresponds to lower bounds on the geometric entanglement of random quantum states, and to bounds on the ground-state energy of a particular multispecies spherical spin glass model. Our result represents a first step towards solving an important question in quantum information that has been part of folklore.
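To fix ideas, the injective norm of an order-$d$ tensor and its link to geometric entanglement can be written as follows. Conventions on normalization vary; this is one common choice, not necessarily the one used in the preprint.

```latex
% Injective norm of an order-d tensor T (generalizes the matrix operator norm):
\| T \|_{\mathrm{inj}}
= \sup_{\|u_1\| = \cdots = \|u_d\| = 1}
  \big| \langle T,\, u_1 \otimes \cdots \otimes u_d \rangle \big|.

% Geometric entanglement of a unit-norm state \psi (one common convention):
% the farther \psi is from the product states, the larger E_G.
E_G(\psi) = - \log \max_{\phi \text{ product state}} |\langle \psi, \phi \rangle|^2
          = - 2 \log \| \psi \|_{\mathrm{inj}}.
```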
Optimal Transport is a useful metric for comparing probability distributions and for computing a pairing given a ground cost. Its entropic regularization variant (eOT) is crucial for obtaining fast algorithms and for reflecting fuzzy/noisy matchings. This work focuses on Inverse Optimal Transport (iOT), the problem of inferring the ground cost from samples drawn from a coupling that solves an eOT problem. It is a relevant problem that can be used to infer unobserved/missing links, and to obtain meaningful information about the structure of the ground cost yielding the pairing. On the one hand, iOT benefits from convexity; on the other hand, being ill-posed, it requires regularization to handle the sampling noise. This work presents an in-depth theoretical study of $\ell_1$ regularization to model, for instance, Euclidean costs with sparse interactions between features. Specifically, we derive a sufficient condition for the robust recovery of the sparsity of the ground cost, which can be seen as a far-reaching generalization of the Lasso's celebrated Irrepresentability Condition. To provide additional insight into this condition, we work out the Gaussian case in detail. We show that as the entropic penalty varies, the iOT problem interpolates between a graphical Lasso and a classical Lasso, thereby establishing a connection between iOT and graph estimation, an important problem in ML.
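Schematically, the forward and inverse problems discussed above can be sketched as follows; the notation is ours, and the penalized-likelihood form of iOT is one natural formulation rather than necessarily the exact one studied in the talk.

```latex
% Entropic OT between marginals \mu, \nu with ground cost c and penalty \varepsilon > 0;
% \Pi(\mu,\nu) is the set of couplings with these marginals.
\pi^{\varepsilon}_{c}
= \operatorname*{arg\,min}_{\pi \in \Pi(\mu,\nu)}
  \int c \, d\pi \;+\; \varepsilon\, \mathrm{KL}\!\left( \pi \,\middle\|\, \mu \otimes \nu \right).

% iOT with l1 regularization (schematic): given samples (x_i, y_i) from \pi^{\varepsilon}_{c_0},
% estimate c within a parametric family by penalized maximum likelihood.
\hat{c} \in \operatorname*{arg\,min}_{c}\;
 -\frac{1}{n} \sum_{i=1}^{n} \log \pi^{\varepsilon}_{c}(x_i, y_i)
 \;+\; \lambda \, \| c \|_1.
```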
Event page: https://indico.math.cnrs.fr/event/11353/overview
Wednesday 10/07
14h00 Jurgen Angst (Univ. Rennes)
Title : CLT in total variation for beta-ensembles
Abstract : In this talk, we study the fluctuations of linear statistics associated with beta-ensembles, which are statistical physics models generalizing random matrix spectra. In the context of random matrices (e.g. GOE, GUE), the "law of large numbers" is Wigner's theorem, which states that the empirical measure of eigenvalues converges to the semicircle law, and fluctuations around equilibrium can be shown to be Gaussian. We will describe how this result generalizes to beta-ensembles and how it is possible to quantify the speed of convergence to the normal distribution. We obtain optimal rates of convergence for the total variation distance and the Wasserstein distances. To do this, we introduce a variant of Stein's method for a generator $L$ that is not necessarily invertible, and which allows us to establish the asymptotic normality of observables that are not in the image of $L$. Time permitting, we will also look at the phenomenon of super-convergence, which ensures that convergence to the normal law takes place for very strong metrics, typically the $C^{\infty}$-convergence of densities. The talk is based on recent works with R. Herry, D. Malicet and G. Poly.
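Schematically, the fluctuation results above concern linear statistics of the eigenvalues; in our illustrative notation:

```latex
% Linear statistic of an N-point beta-ensemble (\lambda_1, ..., \lambda_N);
% \mu_eq is the equilibrium measure (the semicircle law in the GOE/GUE cases).
X_N(f) = \sum_{i=1}^{N} f(\lambda_i) - N \int f \, d\mu_{\mathrm{eq}},
\qquad
d_{\mathrm{TV}}\!\left( \frac{X_N(f)}{\sigma(f)},\, \mathcal{N}(0,1) \right)
\;\longrightarrow\; 0,
```

where $\sigma(f)^2$ denotes the limiting variance of $X_N(f)$; the talk quantifies the rate of this convergence in total variation and Wasserstein distances.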
15h00 Nicolas Juillet (Univ. Haute-Alsace)
Title : Exact interpolation of 1-marginals
Abstract : I shall present a new type of martingale that exactly interpolates any given family of 1-dimensional marginals on $\mathbb{R}$ (satisfying the suitable necessary assumption). The construction makes use of ideas from (martingale) optimal transportation theory and relies on different stochastic orders. I shall discuss related constructions and open questions (joint work with Brückerhoff and Huesmann).
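The "suitable necessary assumption" on a family of marginals $(\mu_t)$ is, classically, that it increases in convex order (such a family is sometimes called a peacock; by Kellerer's theorem this condition is also sufficient for the existence of a martingale with these marginals). Schematically:

```latex
% If (M_t) is a martingale with marginals (mu_t), then by Jensen's inequality
% the marginals must increase in convex order:
s \le t \;\Longrightarrow\;
\int \varphi \, d\mu_s \;\le\; \int \varphi \, d\mu_t
\quad \text{for every convex } \varphi : \mathbb{R} \to \mathbb{R}.
```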
16h00 Kolehe Coulibaly-Pasquier (Inst. Élie Cartan)
Title :
Abstract :
Thursday 11/07
10h00 Masha Gordina (Univ. of Connecticut)
11h00 Simon Coste (Univ. de Paris)
To be announced