Date: 2025-12-05

Time: 15:30-16:30 (Montreal time)

Location: In person, Burnside 1104

https://mcgill.zoom.us/j/83026954715

Meeting ID: 830 2695 4715

Passcode: None

Abstract:

Shannon’s entropy is a cornerstone of information theory, quantifying the uncertainty in a probability distribution. However, on a countably infinite alphabet the classical quantity can diverge for heavy-tailed distributions, leaving gaps in its theoretical foundation. This talk introduces a framework called Generalized Shannon’s Entropy (GSE), which extends the original concept so that it remains well-defined and robust under much broader conditions.
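For concreteness, here is a standard textbook example (well known in information theory, not specific to this talk) of a heavy-tailed distribution on a countable alphabet whose Shannon entropy is infinite:

```latex
% Heavy-tailed pmf on k = 2, 3, ... with finite total mass:
p_k = \frac{C}{k(\log k)^2}, \qquad
C^{-1} = \sum_{k \ge 2} \frac{1}{k(\log k)^2} < \infty .
% Since \log(1/p_k) = \log k + 2\log\log k - \log C, and the last two
% terms contribute only a bounded amount to the sum,
H(p) = \sum_{k \ge 2} p_k \log\frac{1}{p_k}
     = C \sum_{k \ge 2} \frac{1}{k \log k} + O(1) = \infty .
```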

The talk begins by revisiting Shannon’s entropy and its limitations, followed by the construction of GSE through escort distributions that temper tail behavior. Asymptotic properties of plug-in estimators of GSE are then discussed, including a central limit theorem that holds under minimal assumptions. The talk connects this generalization to mutual information, leading to asymptotically normal tests of independence on contingency tables.
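The abstract does not spell out the construction, but a natural reading of “entropy through escort distributions” is the Shannon entropy of the escort p_i^r / Σ_j p_j^r, which is finite for every distribution on a countable alphabet once r > 1. A minimal plug-in sketch under that assumption, with empirical frequencies standing in for the true probabilities (all function names are hypothetical):

```python
import numpy as np

def escort(p, r):
    """Escort distribution of order r: p_i**r / sum_j p_j**r."""
    w = p ** r
    return w / w.sum()

def shannon_entropy(p):
    """Shannon entropy in nats, with the convention 0 * log 0 = 0."""
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def gse_plugin(sample, r=2.0):
    """Plug-in estimate: empirical frequencies -> escort -> Shannon entropy."""
    _, counts = np.unique(sample, return_counts=True)
    p_hat = counts / counts.sum()
    return shannon_entropy(escort(p_hat, r))

rng = np.random.default_rng(0)
x = rng.zipf(2.5, size=10_000)   # heavy-tailed sample on {1, 2, ...}
print(gse_plugin(x, r=2.0))
```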

The second half explores the role of GSE in characterizing discrete probability distributions. Several recent results are reviewed, showing how finite or countable sets of entropic quantities can uniquely determine a distribution up to permutation. The talk concludes with open directions toward developing goodness-of-fit tests for discrete and disparate sample spaces using finite-order GSE characterization.
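On the characterization side, a toy instance of the “finitely many quantities determine the distribution up to permutation” phenomenon: for a distribution on K points, the power sums Σ_i p_i^r for r = 1, …, K (the same quantities that normalize escort distributions) recover the multiset {p_1, …, p_K} via Newton’s identities. A self-contained sketch, illustrative only and not the speaker’s construction:

```python
import numpy as np

def power_sums(p, K):
    """Power sums s_r = sum_i p_i**r for r = 1..K."""
    return [np.sum(p ** r) for r in range(1, K + 1)]

def recover_multiset(s):
    """Recover {p_1,...,p_K} from its first K power sums (Newton's identities)."""
    K = len(s)
    e = [1.0]  # elementary symmetric polynomials e_0, e_1, ..., e_K
    for r in range(1, K + 1):
        acc = sum((-1) ** (j - 1) * e[r - j] * s[j - 1] for j in range(1, r + 1))
        e.append(acc / r)
    # The p_i are the roots of x^K - e_1 x^(K-1) + e_2 x^(K-2) - ... +- e_K.
    coeffs = [(-1) ** j * e[j] for j in range(K + 1)]
    return np.sort(np.roots(coeffs).real)

p = np.array([0.5, 0.3, 0.2])
print(recover_multiset(power_sums(p, 3)))   # [0.2 0.3 0.5]
```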

Speaker:

Dr. Jialin Zhang is an Assistant Professor of Statistics at Mississippi State University. He received his Ph.D. in Statistics from the University of North Carolina at Charlotte in 2019. His research focuses on nonparametric estimation of information-theoretic quantities and on entropic approaches to statistical inference.