Date: 2019-11-15

Time: 15:30-16:30

Location: BURN 1205

Abstract:

Divergences, such as the Kullback-Leibler divergence, are distance-like quantities that arise in many applications in probability, statistics and data science. We introduce a family of logarithmic divergences which is a non-linear extension of the celebrated Bregman divergence. It is defined for any exponentially concave function (a function whose exponential is concave). We motivate this divergence via mathematical finance and the large deviations of the Dirichlet process. It also arises naturally from the solution to an optimal transport problem. The logarithmic divergence enjoys remarkable mathematical properties, including a generalized Pythagorean theorem in the sense of information geometry, and it induces a generalized exponential family of probability densities. In the last part of the talk we present a new differential geometric framework which connects optimal transport and information geometry. This is based on joint work with Soumik Pal and Jiaowen Yang.
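
For readers less familiar with these objects, here is a minimal sketch of the two divergences mentioned above, in notation that does not appear in the announcement (a differentiable generating function \varphi and points p, q of its convex domain):

% Sketch only; \varphi, p, q are assumed notation, not taken from the announcement.
% Bregman divergence of a differentiable concave function \varphi:
\[
  D_\varphi[q : p] \;=\; \langle \nabla\varphi(p),\, q - p \rangle \;-\; \bigl(\varphi(q) - \varphi(p)\bigr).
\]
% If instead e^{\varphi} is concave (exponential concavity), the linear term is
% replaced by its logarithm, giving the logarithmic (L-) divergence:
\[
  L_\varphi[q : p] \;=\; \log\!\bigl(1 + \langle \nabla\varphi(p),\, q - p \rangle\bigr) \;-\; \bigl(\varphi(q) - \varphi(p)\bigr),
\]
% and a first-order expansion \log(1+x) \approx x recovers the Bregman expression above,
% which is the sense in which the logarithmic divergence extends the Bregman divergence.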

Speaker:

Ting-Kam Leonard Wong is an assistant professor in the Department of Computer and Mathematical Sciences, University of Toronto Scarborough, and the Department of Statistical Sciences, University of Toronto. He received his PhD in mathematics from the University of Washington in 2016 under the supervision of Soumik Pal. Before coming to Toronto in July 2018, he was a non-tenure-track assistant professor in financial mathematics in the Department of Mathematics, University of Southern California.