Convergence rates for diffusions-based sampling and optimization methods
Murat A. Erdogdu · Nov 29, 2019
Date: 2019-11-29
Time: 15:30-16:30
Location: BURN 1205

Abstract: An Euler discretization of the Langevin diffusion is known to converge to the global minimizers of certain convex and non-convex optimization problems. We show that this property holds for any suitably smooth diffusion, and that different diffusions are suitable for optimizing different classes of convex and non-convex functions. This allows us to design diffusions suitable for globally optimizing convex and non-convex functions not covered by the existing Langevin theory.
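As a minimal sketch of the Euler discretization the abstract refers to (the unadjusted Langevin algorithm), the iterate takes a gradient step on the objective plus Gaussian noise scaled by an inverse-temperature parameter. The function, step size, and temperature below are illustrative choices, not from the talk:

```python
import numpy as np

def langevin_step(x, grad_f, step, beta, rng):
    """One Euler step of dX_t = -grad f(X_t) dt + sqrt(2/beta) dW_t."""
    noise = rng.standard_normal(x.shape)
    return x - step * grad_f(x) + np.sqrt(2.0 * step / beta) * noise

# Illustration on the convex objective f(x) = ||x||^2 / 2, whose
# gradient is x and whose global minimizer is the origin.
rng = np.random.default_rng(0)
x = np.full(5, 10.0)          # start far from the minimizer
for _ in range(2000):
    x = langevin_step(x, lambda z: z, step=0.01, beta=100.0, rng=rng)
# At large beta the stationary distribution exp(-beta * f) concentrates
# near the global minimizer, so x ends up close to the origin.
```

For small step sizes the iterates track the continuous-time diffusion, whose stationary distribution is proportional to exp(-beta f); taking beta large concentrates that distribution near the global minimizers, which is the mechanism behind Langevin-based global optimization.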