Date: 2013-11-15

Time: 15:30-16:30

Location: BURN 1205

Abstract:

In this talk, we consider estimation in generalized linear models when there are many potential predictors, some of which may have no influence on the response of interest. In the context of two competing models, where one model includes all predictors and the other restricts the coefficients to a candidate linear subspace based on subject-matter or prior knowledge, we investigate the relative performance of Stein-type shrinkage, pretest, and penalty estimators (L1GLM, adaptive L1GLM, and SCAD) with respect to the full model estimator. The asymptotic properties of the pretest and shrinkage estimators, including the derivation of asymptotic distributional biases and risks, are established. A Monte Carlo simulation study shows that the mean squared error (MSE) of an adaptive shrinkage estimator is comparable to the MSE of the penalty estimators in many situations, and in particular that it performs better than the penalty estimators when the model is sparse. A real data set analysis is also presented to compare the suggested methods.
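To give a flavor of the comparison described above, here is a minimal Monte Carlo sketch, assuming a Gaussian linear model (the identity-link special case of a GLM) and a positive-part James-Stein form of shrinkage toward a sparse candidate submodel. The constants and estimator forms are illustrative assumptions, not the exact estimators studied in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 100, 12, 3          # sample size, predictors, active predictors
beta = np.zeros(p)
beta[:k] = [2.0, 1.5, 1.0]    # sparse truth: the submodel restriction holds

reps, mse_full, mse_shrink = 500, 0.0, 0.0
for _ in range(reps):
    X = rng.standard_normal((n, p))
    y = X @ beta + rng.standard_normal(n)

    # Full-model least-squares estimator (uses all p predictors)
    b_full, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Restricted estimator: fit only the first k predictors, zero the rest
    b_sub, *_ = np.linalg.lstsq(X[:, :k], y, rcond=None)
    b_restr = np.concatenate([b_sub, np.zeros(p - k)])

    # Wald-type statistic for the distance between the two fits, then a
    # positive-part Stein-type shrinkage of the full estimator toward
    # the submodel (illustrative form)
    resid = y - X @ b_full
    s2 = resid @ resid / (n - p)
    d = b_full - b_restr
    T = d @ (X.T @ X) @ d / s2
    w = max(0.0, 1.0 - (p - k - 2) / T)   # shrinkage weight in [0, 1]
    b_shrink = b_restr + w * d

    mse_full += np.sum((b_full - beta) ** 2)
    mse_shrink += np.sum((b_shrink - beta) ** 2)

print(f"MSE full:      {mse_full / reps:.4f}")
print(f"MSE shrinkage: {mse_shrink / reps:.4f}")
```

When the submodel restriction is true (as here, since the trailing coefficients are exactly zero), the shrinkage estimator's MSE is smaller than that of the full-model estimator, which mirrors the sparse-model advantage discussed in the abstract.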

Speaker:

Syed Ejaz Ahmed is a Professor of Statistics and Dean of the Faculty of Mathematics and Science at Brock University, St. Catharines, ON.