Date: 2020-10-09

Time: 15:30-16:30

Zoom Link

Meeting ID: 924 5390 4989

Passcode: 690084

Abstract:

Statistical learning theory is by now a mature branch of data science that hosts a vast variety of practical techniques for tackling data-related problems. In this talk, we present some fundamental concepts upon which statistical learning theory has been built. Different approaches to statistical inference will be discussed, and the main problem of learning from Vapnik's point of view will be explained. Further, we discuss the topic of function estimation as the heart of Vapnik-Chervonenkis theory. There exist several state-of-the-art methods for estimating functional dependencies, such as maximum margin estimators and artificial neural networks. While a profound theory has already been developed for some of these methods, e.g., the support vector machines, others require more investigation. Accordingly, we pay closer attention to the so-called mapping neural networks and try to shed some light on certain theoretical aspects of them. We highlight some of the fundamental challenges that have attracted the attention of researchers and are yet to be fully resolved. One of these challenges is the estimation of the intrinsic dimension of data, which will be discussed in detail. Another challenge is inferring causal direction when the training data set is not representative of the target population.

Speaker

Masoud Asgharian is a Professor of Statistics in the Department of Mathematics and Statistics at McGill University. His research interests include survival analysis, change-point problems, classification and clustering, causal inference, and optimization.

Damoon Robatian is a PhD student and Zedian Xiao is an undergraduate student, both working with Professor Masoud Asgharian.