Date: 2013-04-05

Time: 14:30-15:30

Location: BURN 1205

Abstract:

We consider the problem of predictive density estimation under Kullback-Leibler loss when the parameter space is restricted to a convex subset. The principal situation analyzed concerns the estimation of an unknown predictive p-variate normal density based on an observation generated by another p-variate normal density. The means of the two densities are assumed to coincide, and the covariance matrices are known multiples of the identity matrix. We obtain sharp results concerning plug-in estimators; we show that the best unrestricted invariant predictive density estimator is dominated by the Bayes estimator associated with a uniform prior on the restricted parameter space; and we obtain minimax results for cases where the parameter space is (i) a cone, and (ii) a ball. A key feature, which we will describe, is a correspondence between the predictive density estimation problem and a collection of point estimation problems. Finally, if time permits, we describe recent work concerning: (i) non-normal models, and (ii) analysis relative to other loss functions such as reverse Kullback-Leibler and integrated L2.
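For readers unfamiliar with the setting, the standard formulation can be sketched as follows (the symbols v_x, v_y and the constraint set C are our notation for illustration, not necessarily the speaker's):

```latex
% Observe X and predict the density of a future Y, both centered at the same
% unknown mean mu, which is restricted to a convex subset C of R^p:
X \mid \mu \sim N_p(\mu,\, v_x I_p), \qquad
Y \mid \mu \sim N_p(\mu,\, v_y I_p), \qquad
\mu \in C \subseteq \mathbb{R}^p,
% with v_x, v_y > 0 known. An estimator \hat{q}(\cdot \mid x) of the density
% of Y is evaluated under Kullback-Leibler loss:
L(\mu, \hat{q}) \;=\; \int_{\mathbb{R}^p} p(y \mid \mu)\,
  \log \frac{p(y \mid \mu)}{\hat{q}(y \mid x)} \, dy .
```

Under this loss, the best unrestricted invariant predictive density mentioned in the abstract is the N_p(x, (v_x + v_y) I_p) density, i.e., the Bayes predictive density under the (improper) uniform prior on all of R^p; the talk compares it with the Bayes estimator under the uniform prior restricted to C.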

References:

  1. Dominique Fourdrinier, Éric Marchand, Ali Righi, William E. Strawderman. On improved predictive density estimation with parametric constraints. Electronic Journal of Statistics, 2011, Vol. 5, 172-191.
  2. Tatsuya Kubokawa, Éric Marchand, William E. Strawderman, Jean-Philippe Turcotte. Minimaxity in predictive density estimation with parametric constraints. Journal of Multivariate Analysis, 2013, Vol. 116, 382-397.

Speaker

Éric Marchand is a Professor of Statistics at the Université de Sherbrooke.