Partial Dynamic Dimension Reduction for Conditional Mean in Regression

  • Shengjin Gan
  • , Zhou Yu*
  • *Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

In many regression analyses, interest centers on the mean of the response given the predictors rather than on the full conditional distribution. This paper concerns dimension reduction of the predictors with respect to the conditional mean function of the response. The authors introduce the notion of a partial dynamic central mean dimension reduction subspace. Unlike the central mean dimension reduction subspace, it varies over the domain of the predictors, and its structural dimension may differ from point to point. The authors study the properties of the partial dynamic central mean dimension reduction subspace and develop estimation methods, called dynamic ordinary least squares and dynamic principal Hessian directions, which extend ordinary least squares and principal Hessian directions based on the central mean dimension reduction subspace. Kernel estimation methods for dynamic ordinary least squares and dynamic principal Hessian directions are employed, and large-sample properties of the estimators are established under regularity conditions. Simulations and real data analysis demonstrate their effectiveness.
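The abstract describes estimating a direction for the conditional mean locally, with kernel weights centered at each point of the predictor domain. The following Python sketch illustrates the general idea of a kernel-weighted ("dynamic") OLS direction; it is not the authors' exact estimator, and the Gaussian kernel, bandwidth `h`, and ridge term are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(u, h):
    """Gaussian kernel weight for distance u with bandwidth h (assumed choice)."""
    return np.exp(-0.5 * (u / h) ** 2)

def dynamic_ols_direction(X, y, x0, h=0.5):
    """Kernel-weighted OLS direction at the point x0.

    A sketch of the 'dynamic' idea: the OLS direction is recomputed at each
    x0 using weights that localize the sample around x0, so the estimated
    subspace can vary over the predictor domain.
    """
    # Weights localized around x0
    w = gaussian_kernel(np.linalg.norm(X - x0, axis=1), h)
    w = w / w.sum()
    # Locally weighted centering
    Xc = X - np.average(X, axis=0, weights=w)
    yc = y - np.average(y, weights=w)
    # Weighted Cov(X) and Cov(X, y)
    Sigma = (Xc * w[:, None]).T @ Xc
    cov_xy = (Xc * w[:, None]).T @ yc
    # Small ridge term for numerical stability (illustrative)
    beta = np.linalg.solve(Sigma + 1e-8 * np.eye(X.shape[1]), cov_xy)
    return beta / np.linalg.norm(beta)

# Toy model: the conditional mean depends on X only through its first coordinate
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = X[:, 0] ** 2 + 0.1 * rng.normal(size=500)
b = dynamic_ols_direction(X, y, x0=np.array([1.0, 0.0, 0.0]))
```

In this toy example the local direction `b` at `x0` should be dominated by the first coordinate, since the mean function varies only along it there.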

Original language: English
Pages (from-to): 1585-1601
Number of pages: 17
Journal: Journal of Systems Science and Complexity
Volume: 33
Issue number: 5
State: Published - 1 Oct 2020

Keywords

  • Dynamic ordinary least square estimate
  • dynamic principal Hessian directions
  • kernel estimate
  • partial dimension reduction
