When variables or models are selected via information criteria or other formal methods, a single model emerges as the winner. This winning model is often wrongly treated as if it had been known before the start of the analysis that precisely this model would be selected. Yet randomness is involved in the selection: with a different sample of data, another model could have been selected. For the popular Akaike information criterion (AIC), we study the asymptotic distribution of parameter estimators after model selection. The overselection property of this criterion is exploited to construct a selection region and to obtain the asymptotic distribution of parameter estimators, and of linear combinations thereof, in the selected model. The proposed method does not require the true model to be in the model set. We investigate the method in linear and generalized linear models. Confidence curves provide a broader picture of the selection methods post-selection.
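As a minimal sketch of the point that selection itself is random (this is an illustration only, not the post-selection method of the abstract), the following simulation repeatedly generates data from a fixed linear model with one relevant and one irrelevant predictor and records which candidate model AIC picks each time; the model names, sample size, and coefficients are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 50, 500
selections = {"x1": 0, "x1+x2": 0}

def aic(y, X):
    # Gaussian AIC for OLS up to an additive constant:
    # n*log(RSS/n) + 2*(number of mean parameters + 1 for the variance)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + 2 * (X.shape[1] + 1)

for _ in range(reps):
    x1 = rng.normal(size=n)
    x2 = rng.normal(size=n)                  # irrelevant predictor
    y = 1.0 + 0.5 * x1 + rng.normal(size=n)  # true model uses x1 only
    X_small = np.column_stack([np.ones(n), x1])
    X_big = np.column_stack([np.ones(n), x1, x2])
    winner = "x1" if aic(y, X_small) <= aic(y, X_big) else "x1+x2"
    selections[winner] += 1

print(selections)
```

Across replications the larger model is selected in a nontrivial fraction of samples even though the extra predictor is pure noise, which is the overselection tendency of AIC that the proposed selection region exploits.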
This is joint work with A. Charkhi and A. Garcia Angulo.