Date: 2014-02-14

Time: 15:30-16:30

Location: BURN 1205

Abstract:

Hellinger distance and its variants have long been used in the theory of robust statistics to develop inferential tools that are more robust than the maximum likelihood estimator (MLE) but as efficient as the MLE when the posited model holds. A key aspect of this alternative approach requires specification of a parametric family, which is usually not feasible in problems involving complex data structures, where estimating equations are typically used for inference. In this presentation, we describe how to extend the scope of divergence theory to inferential problems involving estimating equations and describe useful algorithms for their computation. Additionally, we theoretically study the robustness properties of the methods and establish the semi-parametric efficiency of the new divergence-based estimators under suitable technical conditions. Finally, we use the proposed methods to develop robust sure screening methods for ultra-high-dimensional problems. The theory of large deviations, convexity theory, and concentration inequalities play an essential role in the theoretical analysis and numerical development. Applications from equine parasitology, stochastic optimization, and antimicrobial resistance will be used to illustrate various aspects of the proposed methods.
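For reference, the Hellinger distance between two densities f and g (the quantity underlying the divergences discussed above) is standardly defined, up to a normalization convention that varies across texts, as

$$
H^2(f, g) \;=\; \frac{1}{2} \int \left( \sqrt{f(x)} - \sqrt{g(x)} \right)^2 \, dx ,
$$

so that $0 \le H(f,g) \le 1$, with equality to zero exactly when $f = g$ almost everywhere.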

Speaker

Anand N. Vidyashankar is an Associate Professor in the Department of Statistics at George Mason University.