Variable selection in a class of single-index models

  • Li Ping Zhu*
  • Lin Yi Qian
  • Jin Guan Lin

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

24 Scopus citations

Abstract

In this paper we discuss variable selection in a class of single-index models in which the error term is not assumed to be additive. Following the idea of sufficient dimension reduction, we first propose a unified method to recover the direction, and then reformulate it under the least-squares framework. Unlike many existing results that rely on nonparametric smoothing of the density function, the bandwidth selection in our proposed kernel function has essentially no impact on its root-n consistency or asymptotic normality. To select the important predictors, we suggest the adaptive lasso method, which is computationally efficient. Under some regularity conditions, the adaptive lasso enjoys the oracle property in a general class of single-index models. In addition, the resulting estimator is shown to be asymptotically normal, which enables us to construct a confidence region for the estimated direction. The asymptotic results are supported by comprehensive simulations, and illustrated by an analysis of air pollution data.
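The adaptive lasso cited in the abstract penalizes each coefficient by a data-driven weight, typically the inverse of a pilot estimate, so that truly zero coefficients are penalized heavily while large ones are nearly unbiased. The sketch below illustrates the idea on a plain linear model via the standard column-rescaling trick and coordinate descent; it is a toy illustration of the penalty only, not the authors' sufficient-dimension-reduction estimator, and all data and tuning values (`gamma`, `lam`) are invented for the example.

```python
# Illustrative sketch of the adaptive lasso penalty (Zou, 2006) on a
# synthetic linear model. NOT the paper's single-index procedure; the
# design, gamma, and lam below are arbitrary choices for demonstration.
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 6
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0])
y = X @ beta_true + rng.normal(size=n)

# Step 1: pilot OLS fit gives adaptive weights w_j = 1 / |beta_ols_j|^gamma,
# so coefficients that look small get a heavier penalty.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
gamma = 1.0
w = np.abs(beta_ols) ** (-gamma)

# Step 2: a weighted lasso equals an ordinary lasso on the rescaled
# design X_j / w_j; solve it by cyclic coordinate descent, then undo
# the scaling to recover the adaptive-lasso coefficients.
X_tilde = X / w
lam = 0.1

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

b = np.zeros(p)
for _ in range(500):
    for j in range(p):
        # partial residual excluding coordinate j
        r = y - X_tilde @ b + X_tilde[:, j] * b[j]
        zj = X_tilde[:, j] @ r / n
        b[j] = soft_threshold(zj, lam) / (X_tilde[:, j] @ X_tilde[:, j] / n)

beta_adaptive = b / w  # map back to the original scale
print(np.round(beta_adaptive, 2))
```

With the weights in place, the three truly zero coefficients are shrunk exactly to (or very near) zero while the large coefficients keep close to their OLS values, which is the oracle-type behavior the abstract refers to.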

Original language: English
Pages (from-to): 1277-1293
Number of pages: 17
Journal: Annals of the Institute of Statistical Mathematics
Volume: 63
Issue number: 6
DOIs
State: Published - Dec 2011

Keywords

  • Adaptive lasso
  • Dimension reduction
  • Oracle
  • Sliced inverse regression
  • Sparsity
