Abstract
Sufficient dimension reduction reduces the dimension of a regression model without loss of information by replacing the original predictor with a set of its lower-dimensional linear combinations. Partial (sufficient) dimension reduction arises when the predictors naturally fall into two sets, X and W, and pursues a dimension reduction of X alone. Although partial dimension reduction is a very general problem, only a few results are available when W is continuous, and to the best of our knowledge none can handle the situation where the reduced lower-dimensional subspace of X varies with W. To address this issue, we propose in this paper a novel variable-dependent partial dimension reduction framework and adapt classical sufficient dimension reduction methods to this general paradigm. The asymptotic consistency of our method is established. Extensive numerical studies and real data analysis show that our variable-dependent partial dimension reduction method outperforms existing methods.
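To illustrate the classical sufficient dimension reduction idea the abstract builds on (not the paper's variable-dependent method), here is a minimal sketch of sliced inverse regression (SIR), one of the keyword methods, in Python with NumPy; the function name `sir_directions` and all parameter choices are our own illustrative assumptions:

```python
import numpy as np

def sir_directions(X, y, n_slices=5, d=1):
    """Estimate d dimension-reduction directions via sliced inverse
    regression (SIR): slice y, average the standardized predictors
    within slices, and take top eigenvectors of the slice-mean
    covariance."""
    n, p = X.shape
    # Standardize the predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(Sigma)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice the sorted response and average Z within each slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Top eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    B = inv_sqrt @ v[:, ::-1][:, :d]
    return B / np.linalg.norm(B, axis=0)

# Toy check: y depends on X only through the single index X @ beta
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
beta = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2)
y = (X @ beta) ** 3 + 0.1 * rng.normal(size=500)
B_hat = sir_directions(X, y, n_slices=10, d=1)
print(np.abs(B_hat[:, 0] @ beta))  # near 1 when the direction is recovered
```

A large absolute inner product between the estimated direction and the true `beta` indicates the single-index structure was recovered; classical SIR treats all predictors symmetrically, whereas the paper's partial framework reduces X while letting the subspace depend on W.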
| Original language | English |
|---|---|
| Pages (from-to) | 521-541 |
| Number of pages | 21 |
| Journal | TEST |
| Volume | 32 |
| Issue number | 2 |
| DOIs | |
| State | Published - Jun 2023 |
Keywords
- Directional regression
- Order determination
- Sliced average variance estimation
- Sliced inverse regression
- Sufficient dimension reduction