TY - JOUR
T1 - Partial Dynamic Dimension Reduction for Conditional Mean in Regression
AU - Gan, Shengjin
AU - Yu, Zhou
N1 - Publisher Copyright:
© 2020, The Editorial Office of JSSC & Springer-Verlag GmbH Germany.
PY - 2020/10/1
Y1 - 2020/10/1
N2 - In many regression analyses, interest centers on the conditional mean of the response given the predictors, rather than the full conditional distribution. This paper is concerned with dimension reduction of the predictors with respect to the mean function of the response conditional on the predictors. The authors introduce the notion of a partial dynamic central mean dimension reduction subspace; unlike the central mean dimension reduction subspace, it varies over the domain of the predictors, and its structural dimension may differ from point to point. The authors study the properties of the partial dynamic central mean dimension reduction subspace and develop estimation methods called dynamic ordinary least squares and dynamic principal Hessian directions, which extend ordinary least squares and principal Hessian directions based on the central mean dimension reduction subspace. Kernel estimation methods for dynamic ordinary least squares and dynamic principal Hessian directions are employed, and large-sample properties of the estimators are established under regularity conditions. Simulations and real data analysis demonstrate that the methods are effective.
AB - In many regression analyses, interest centers on the conditional mean of the response given the predictors, rather than the full conditional distribution. This paper is concerned with dimension reduction of the predictors with respect to the mean function of the response conditional on the predictors. The authors introduce the notion of a partial dynamic central mean dimension reduction subspace; unlike the central mean dimension reduction subspace, it varies over the domain of the predictors, and its structural dimension may differ from point to point. The authors study the properties of the partial dynamic central mean dimension reduction subspace and develop estimation methods called dynamic ordinary least squares and dynamic principal Hessian directions, which extend ordinary least squares and principal Hessian directions based on the central mean dimension reduction subspace. Kernel estimation methods for dynamic ordinary least squares and dynamic principal Hessian directions are employed, and large-sample properties of the estimators are established under regularity conditions. Simulations and real data analysis demonstrate that the methods are effective.
KW - Dynamic ordinary least squares estimate
KW - dynamic principal Hessian directions
KW - kernel estimate
KW - partial dimension reduction
UR - https://www.scopus.com/pages/publications/85088877055
U2 - 10.1007/s11424-020-8329-3
DO - 10.1007/s11424-020-8329-3
M3 - Article
AN - SCOPUS:85088877055
SN - 1009-6124
VL - 33
SP - 1585
EP - 1601
JO - Journal of Systems Science and Complexity
JF - Journal of Systems Science and Complexity
IS - 5
ER -