2020 Fall - McGill Statistics Seminars
  • Large-scale Network Inference

    Date: 2020-09-25
    Time: 14:00-15:00
    Zoom Link (Meeting ID: 939 4707 7997, Passcode: no password)
    Abstract: Network data are prevalent in many contemporary big-data applications, in which a common interest is to unveil important latent links between pairs of nodes. Yet the fundamental question of how to precisely quantify the statistical uncertainty associated with identifying latent links remains largely unexplored. In this paper, we propose the method of statistical inference on membership profiles in large networks (SIMPLE) in the setting of the degree-corrected mixed membership model, where the null hypothesis assumes that a pair of nodes share the same profile of community memberships.
  • BdryGP: a boundary-integrated Gaussian process model for computer code emulation

    Date: 2020-09-18
    Time: 15:30-16:30
    Zoom Link (Meeting ID: 924 5390 4989, Passcode: 690084)
    Abstract: With advances in mathematical modeling and computational methods, complex phenomena (e.g., universe formation, rocket propulsion) can now be reliably simulated via computer code, which solves a complicated system of equations representing the underlying science of the problem. Such simulations can be very time-intensive, requiring months of computation for a single run. Gaussian processes (GPs) are therefore widely used as predictive models for “emulating” this expensive computer code.
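    The emulation idea described above can be sketched in a few lines: fit a GP to a small number of "runs" of the code, then predict the output (with uncertainty) at untried inputs. This is a minimal illustration with scikit-learn, in which a cheap test function stands in for the expensive simulator; the function and all names here are illustrative assumptions, not the BdryGP model from the talk.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    def simulator(x):
        # Stand-in for one expensive run of the computer code
        # (in practice each evaluation might take days or months).
        return np.sin(2 * np.pi * x).ravel()

    # A small design: run the "code" at 8 chosen input points.
    X_train = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
    y_train = simulator(X_train)

    # Fit a GP emulator with a squared-exponential (RBF) kernel.
    gp = GaussianProcessRegressor(
        kernel=ConstantKernel() * RBF(length_scale=0.2),
        normalize_y=True,
    )
    gp.fit(X_train, y_train)

    # Emulate: predict the code output at a new input, with a
    # standard deviation quantifying the emulator's uncertainty.
    X_new = np.array([[0.33]])
    mean, std = gp.predict(X_new, return_std=True)
    ```

    The predictive standard deviation is what makes GP emulators attractive here: it shrinks near the design points where the code was actually run and grows between them, so it can guide where to spend the next expensive simulation.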
  • Machine Learning for Causal Inference

    Date: 2020-09-11
    Time: 16:00-17:00
    Zoom Link (Meeting ID: 965 2536 7383, Passcode: 421254)
    Abstract: Given advances in machine learning over the past decades, it is now possible to solve difficult non-parametric prediction problems accurately, routinely, and reproducibly. In this talk, I’ll discuss how machine learning tools can be rigorously integrated into observational study analyses, and how they interact with classical statistical ideas such as randomization, semiparametric modeling, and double robustness.
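    The "double robustness" idea mentioned above can be illustrated with the standard augmented inverse-propensity-weighted (AIPW) estimator of an average treatment effect: fit two nuisance models (a propensity score and an outcome regression), then combine them so the estimate is consistent if either one is correct. This is a hedged sketch on simulated data with simple linear nuisance models; the data-generating process and all names are assumptions for illustration, not the speaker's method, and refinements such as cross-fitting are omitted.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression, LogisticRegression

    rng = np.random.default_rng(0)
    n = 5000
    x = rng.normal(size=(n, 2))

    # Treatment depends on covariates (confounding); true ATE is 2.
    propensity = 1.0 / (1.0 + np.exp(-x[:, 0]))
    t = rng.binomial(1, propensity)
    y = 2.0 * t + x[:, 0] + 0.5 * x[:, 1] + rng.normal(size=n)

    # Nuisance models: propensity score e(x) and outcome regressions
    # mu1(x), mu0(x) under treatment and control.
    e_hat = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]
    mu1 = LinearRegression().fit(x[t == 1], y[t == 1]).predict(x)
    mu0 = LinearRegression().fit(x[t == 0], y[t == 0]).predict(x)

    # AIPW: outcome-model difference plus inverse-propensity-weighted
    # residual corrections for each treatment arm.
    ate = np.mean(
        mu1 - mu0
        + t * (y - mu1) / e_hat
        - (1 - t) * (y - mu0) / (1 - e_hat)
    )
    ```

    In practice the linear nuisance fits would be replaced by flexible machine learning models, which is precisely where the interaction between ML and semiparametric theory that the talk describes comes in.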