Logarithmic divergence: from finance to optimal transport and information geometry
Ting-Kam Leonard Wong
Date: 2019-11-15
Time: 15:30-16:30
Location: BURN 1205
Abstract:
Divergences, such as the Kullback-Leibler divergence, are distance-like quantities that arise in many applications in probability, statistics, and data science. We introduce a family of logarithmic divergences that is a non-linear extension of the celebrated Bregman divergence. It is defined for any exponentially concave function (a function whose exponential is concave). We motivate this divergence through mathematical finance and the large deviations of the Dirichlet process. It also arises naturally from the solution to an optimal transport problem. The logarithmic divergence enjoys remarkable mathematical properties, including a generalized Pythagorean theorem in the sense of information geometry, and it induces a generalized exponential family of probability densities. In the last part of the talk we present a new differential-geometric framework connecting optimal transport and information geometry. Joint work with Soumik Pal and Jiaowen Yang.
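For orientation, here is a hedged sketch of the central definition in LaTeX, following the speaker's joint papers with Pal; the notation (\(\varphi\), \(\xi\), \(\Omega\)) is ours, not the abstract's.

% A minimal sketch of the logarithmic (L-) divergence, assuming a
% differentiable function \varphi on a convex domain \Omega \subset \mathbb{R}^n
% that is exponentially concave, i.e. e^{\varphi} is concave.
\[
  \mathbf{L}_{\varphi}(\xi \,\|\, \xi')
    = \log\!\bigl(1 + \nabla\varphi(\xi') \cdot (\xi - \xi')\bigr)
      - \bigl(\varphi(\xi) - \varphi(\xi')\bigr),
  \qquad \xi, \xi' \in \Omega.
\]
% Concavity of e^{\varphi} gives
% e^{\varphi(\xi)} \le e^{\varphi(\xi')}\bigl(1 + \nabla\varphi(\xi')\cdot(\xi - \xi')\bigr),
% so \mathbf{L}_{\varphi} \ge 0. Linearizing \log(1+x) \approx x recovers
% the Bregman divergence of \varphi, which is the sense in which the
% logarithmic divergence is a non-linear extension.

A concrete example on the open unit simplex is \(\varphi(p) = \tfrac{1}{n}\sum_{i=1}^{n} \log p_i\): its exponential is the geometric mean \((p_1 \cdots p_n)^{1/n}\), which is concave, so \(\varphi\) is exponentially concave.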