On marginal sliced inverse regression for ultrahigh dimensional model-free feature selection

Zhou Yu, Yuexiao Dong, Jun Shao

Research output: Contribution to journal › Article › peer-review

21 Scopus citations

Abstract

Model-free variable selection has been implemented under the sufficient dimension reduction framework since the seminal paper of Cook [Ann. Statist. 32 (2004) 1062-1092]. In this paper, we extend the marginal coordinate test for sliced inverse regression (SIR) in Cook (2004) and propose a novel marginal SIR utility for the purpose of ultrahigh dimensional feature selection. Two distinct procedures, the Dantzig selector and sparse precision matrix estimation, are incorporated to obtain two versions of the sample-level marginal SIR utility. Both procedures lead to model-free variable selection consistency with the predictor dimensionality p diverging at an exponential rate of the sample size n. As a special case of marginal SIR, we ignore the correlation among the predictors and propose marginal independence SIR. Marginal independence SIR is closely related to many existing independence screening procedures in the literature, and it achieves model-free screening consistency in the ultrahigh dimensional setting. The finite-sample performance of the proposed procedures is studied through synthetic examples and an application to the small round blue cell tumors data.
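
To illustrate the screening idea summarized in the abstract, the following is a minimal sketch of a marginal independence SIR-type utility: each predictor is standardized marginally, the response is sliced, and predictors are ranked by the between-slice variance of their slice-wise means, a sample version of Var(E[X_j | Y]) that ignores correlation among predictors. The function name, the slice count, and the n/log(n) screening size are illustrative assumptions; this sketch is not the authors' implementation, which additionally uses the Dantzig selector or sparse precision matrix estimation to account for predictor correlation.

import numpy as np

def marginal_independence_sir_screening(X, y, n_slices=10, n_keep=None):
    """Rank predictors by a marginal independence SIR-type utility (sketch)."""
    n, p = X.shape
    # Standardize each predictor marginally (cross-correlations are ignored).
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

    # Slice the response into roughly equal-sized slices by its order statistics.
    order = np.argsort(y)
    slice_ids = np.empty(n, dtype=int)
    slice_ids[order] = np.minimum(np.arange(n) * n_slices // n, n_slices - 1)

    # Utility for predictor j: sum_h p_h * (mean of Z_j in slice h)^2,
    # i.e. the between-slice variance of Z_j, since Z_j has sample mean zero.
    utility = np.zeros(p)
    for h in range(n_slices):
        mask = slice_ids == h
        p_h = mask.mean()
        slice_mean = Z[mask].mean(axis=0)
        utility += p_h * slice_mean ** 2

    if n_keep is None:
        n_keep = int(n / np.log(n))  # common screening-size heuristic (assumption)
    keep = np.argsort(utility)[::-1][:n_keep]
    return keep, utility

For example, with a predictor matrix X of shape (n, p) and a continuous response y, calling keep, util = marginal_independence_sir_screening(X, y) returns the indices of the retained predictors and the marginal utilities used to rank them.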

Original language: English
Pages (from-to): 2594-2623
Number of pages: 30
Journal: Annals of Statistics
Volume: 44
Issue number: 6
DOIs
State: Published - Dec 2016

Keywords

  • Marginal coordinate test
  • Sliced inverse regression
  • Sufficient dimension reduction
  • Sure independence screening
