Date: 2019-01-25
Time: 16:00-17:00
Location: BURN 920
Abstract:
We have witnessed many exciting developments in data science in recent years. From the perspective of optimization, many modern data-science problems involve basic "non"-properties that, for the sake of computational convenience, lack systematic treatment by current approaches. These non-properties include the coupling of non-convexity, non-differentiability, and non-determinism. In this talk, we present rigorous computational methods for solving two typical non-problems: piecewise linear regression and the feed-forward deep neural network. The algorithmic framework integrates a first-order non-convex majorization-minimization method with second-order non-smooth Newton methods. Numerical experiments demonstrate the effectiveness of the proposed approach. Contrary to existing methods for solving non-problems, which at best provide very weak guarantees on the solutions computed in practical implementations, our rigorous mathematical treatment aims to understand the properties of these computed solutions with reference to both the empirical and the population risk minimizations.
This is based on joint work with Jong-Shi Pang, Bodhisattva Sen and Ziyu He.
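The abstract does not spell out the algorithm, but the following minimal sketch illustrates the kind of non-smooth, non-convex fitting problem it refers to: piecewise linear (max-affine) regression, fitted here with a simple alternating partition/least-squares heuristic. The function name fit_max_affine, the number of pieces k, and the toy data are illustrative assumptions only; this is not the speaker's method, which integrates majorization-minimization with non-smooth Newton steps.

```python
import numpy as np

def fit_max_affine(X, y, k=3, iters=50, seed=0):
    """Fit y ~ max_j (X @ a_j + b_j) by alternating between assigning each
    sample to its active affine piece and refitting each piece by least squares.
    (Illustrative heuristic only, not the algorithm discussed in the talk.)"""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    assign = rng.integers(k, size=n)      # random initial piece assignment
    A = np.zeros((k, d))
    b = np.zeros(k)
    for _ in range(iters):
        # Refit each affine piece on the samples currently assigned to it.
        for j in range(k):
            idx = assign == j
            if idx.sum() <= d:            # skip nearly empty pieces
                continue
            Z = np.hstack([X[idx], np.ones((idx.sum(), 1))])
            coef, *_ = np.linalg.lstsq(Z, y[idx], rcond=None)
            A[j], b[j] = coef[:-1], coef[-1]
        # Reassign every sample to the piece attaining the maximum.
        vals = X @ A.T + b                # shape (n, k)
        new_assign = vals.argmax(axis=1)
        if np.array_equal(new_assign, assign):
            break
        assign = new_assign
    return A, b

# Toy usage: recover a convex piecewise linear function of one variable.
X = np.linspace(-2, 2, 200).reshape(-1, 1)
y = np.maximum.reduce([1.5 * X[:, 0] - 1.0, -0.5 * X[:, 0], 2.0 * X[:, 0] - 2.0])
A, b = fit_max_affine(X, y, k=3)
pred = (X @ A.T + b).max(axis=1)
print("max abs error:", np.abs(pred - y).max())
```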
Speaker:
Ying Cui is currently a postdoctoral research associate in the Department of Industrial and Systems Engineering at the University of Southern California, working with Professor Jong-Shi Pang. She completed her Ph.D. in the Department of Mathematics at the National University of Singapore in 2016, under the supervision of Professor Defeng Sun and Professor Chenlei Leng.