Date: 2024-11-15
Time: 15:30-16:30 (Montreal time)
Location: In person, Burnside 1104
https://mcgill.zoom.us/j/89043936588
Meeting ID: 890 4393 6588
Passcode: None
Abstract:
Deep learning is having a profound impact on industry and scientific research. Yet, while this paradigm continues to show impressive performance in a wide variety of applications, its mathematical foundations are far from well understood. Motivated by deep learning methods for scientific computing, I will present new practical existence theorems that aim to bridge the gap between theory and practice in this area. Combining universal approximation results for deep neural networks with sparse high-dimensional polynomial approximation theory, these theorems identify sufficient conditions on the network architecture, the training strategy, and the size of the training set that guarantee a target accuracy. I will illustrate practical existence theorems in the contexts of high-dimensional function approximation via feedforward networks, reduced order modeling based on convolutional autoencoders, and physics-informed neural networks for high-dimensional PDEs.
Speaker
Simone Brugiapaglia is an Associate Professor of Mathematics and Statistics at Concordia University. He received his PhD in Mathematical Models and Methods in Engineering from Politecnico di Milano (MOX Laboratory) in 2016. He was a post-doctoral fellow at École polytechnique fédérale de Lausanne (EPFL) in Spring 2016 and at Simon Fraser University from 2016 to 2019, where he held a Postdoctoral Training Centre in Stochastics Fellowship from the Pacific Institute for the Mathematical Sciences (PIMS) from 2016 to 2018. He was awarded a Leslie Fox Prize for Numerical Analysis by the Institute of Mathematics and its Applications (IMA) in 2019 and obtained the title of Concordia Research Fellow in 2023.
Dr. Brugiapaglia is the author of more than 30 scientific publications, including two books, book chapters, peer-reviewed journal articles, and conference proceedings. His work has been published in venues such as SIAM Journal on Mathematics of Data Science, Foundations of Computational Mathematics, Mathematics of Computation, Neural Computation, Numerische Mathematik, and IEEE Transactions on Information Theory. He has supervised more than 20 trainees at the post-doctoral, graduate, and undergraduate levels. His research interests include mathematics of data science, machine learning, numerical analysis, and their applications.