Variable-dependent partial dimension reduction

Lu Li, Kai Tan, Xuerong Meggie Wen, Zhou Yu

Research output: Contribution to journal › Article › peer-review

Abstract

Sufficient dimension reduction reduces the dimension of a regression model without loss of information by replacing the original predictor with its lower-dimensional linear combinations. Partial (sufficient) dimension reduction arises when the predictors naturally fall into two sets, X and W, and pursues a dimension reduction of X only. Although partial dimension reduction is a very general problem, only a few research results are available when W is continuous, and to the best of our knowledge, none can deal with the situation where the reduced lower-dimensional subspace of X varies with W. To address this issue, in this paper we propose a novel variable-dependent partial dimension reduction framework and adapt classical sufficient dimension reduction methods to this general paradigm. The asymptotic consistency of our method is investigated. Extensive numerical studies and real data analysis show that our variable-dependent partial dimension reduction method has superior performance compared with existing methods.
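
For orientation, here is a minimal sketch of the conditional-independence statements behind these ideas; the notation B and B(W) and the exact formulation are assumptions for illustration, not taken verbatim from the paper.

% Classical sufficient dimension reduction: Y depends on X only through B^T X.
\[ Y \perp\!\!\!\perp X \mid B^{\top} X \]

% Partial dimension reduction with predictors split into (X, W): reduce X while conditioning on W.
\[ Y \perp\!\!\!\perp X \mid \left( B^{\top} X, \, W \right) \]

% Variable-dependent partial dimension reduction (assumed form): the reducing matrix may change with W.
\[ Y \perp\!\!\!\perp X \mid \left( B(W)^{\top} X, \, W \right) \]

In the classical and partial settings the matrix B is fixed; the variable-dependent framework replaces it with a W-dependent matrix, which is what allows the reduced subspace of X to vary with W.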

Original language: English
Pages (from-to): 521-541
Number of pages: 21
Journal: Test
Volume: 32
Issue number: 2
DOIs
State: Published - Jun 2023

Keywords

  • Directional regression
  • Order determination
  • Sliced average variance estimation
  • Sliced inverse regression
  • Sufficient dimension reduction
