Date: 2019-11-29
Time: 15:30-16:30
Location: BURN 1205
Abstract:
An Euler discretization of the Langevin diffusion is known to converge to the global minimizers of certain convex and non-convex optimization problems. We show that this property holds for any suitably smooth diffusion and that different diffusions are suitable for optimizing different classes of convex and non-convex functions. This allows us to design diffusions suitable for globally optimizing convex and non-convex functions not covered by the existing Langevin theory. Our non-asymptotic analysis delivers computable optimization and integration error bounds based on easily accessed properties of the objective and chosen diffusion. Central to our approach are new explicit Stein factor bounds on the solutions of Poisson equations. We complement these results with improved optimization guarantees for targets other than the standard Gibbs measure.
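For readers unfamiliar with the method the abstract builds on, here is a minimal sketch of the Euler discretization of the (overdamped) Langevin diffusion dX_t = -∇f(X_t) dt + √(2/β) dW_t applied to a toy quadratic objective. This is only an illustration of the standard unadjusted Langevin algorithm, not the speaker's generalized diffusions; the function names and parameter values are chosen for this example.

```python
import math
import random

def grad_f(x):
    # Gradient of the toy objective f(x) = (x - 3)^2, minimized at x = 3.
    return 2.0 * (x - 3.0)

def langevin_minimize(x0, step=0.01, beta=50.0, n_steps=20000, seed=0):
    """Unadjusted Langevin algorithm: Euler discretization of
    dX_t = -grad f(X_t) dt + sqrt(2 / beta) dW_t.

    For large inverse temperature beta, the iterates concentrate near
    the global minimizer of f (the Gibbs measure exp(-beta * f)
    concentrates there).
    """
    rng = random.Random(seed)
    x = x0
    for _ in range(n_steps):
        noise = rng.gauss(0.0, 1.0)
        x = x - step * grad_f(x) + math.sqrt(2.0 * step / beta) * noise
    return x

x_final = langevin_minimize(x0=-5.0)
```

With beta = 50 the stationary distribution is a Gaussian of standard deviation 0.1 centered at the minimizer, so the final iterate lands close to 3 despite the injected noise.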
Speaker:
Murat A. Erdogdu is an assistant professor at the University of Toronto in the Departments of Computer Science and Statistical Sciences. He is a faculty member of the Machine Learning Group and the Vector Institute, and a CIFAR Chair in Artificial Intelligence. Previously, he was a postdoctoral researcher at Microsoft Research - New England. He received his Ph.D. from the Department of Statistics at Stanford University, where he was jointly advised by Mohsen Bayati and Andrea Montanari. He holds an M.S. degree in Computer Science from Stanford, and B.S. degrees in Electrical Engineering and Mathematics, both from Bogazici University.