TY - JOUR
T1 - Model averaging assisted sufficient dimension reduction
AU - Fang, Fang
AU - Yu, Zhou
N1 - Publisher Copyright:
© 2020 Elsevier B.V.
PY - 2020/12
Y1 - 2020/12
N2 - Sufficient dimension reduction, which replaces the original predictors with their low-dimensional linear combinations without loss of information, is a critical tool in modern statistics and has gained considerable research momentum over the past decades since the two pioneering methods, sliced inverse regression and principal Hessian directions. Classical sufficient dimension reduction methods do not handle the sparse case well, since the estimated linear reductions involve all of the original predictors. Sparse sufficient dimension reduction methods rely on a sparsity assumption that may not hold in practice. Motivated by the least squares formulations of classical sliced inverse regression and principal Hessian directions, several model averaging assisted sufficient dimension reduction methods are proposed. They are applicable to both dense and sparse cases, even with weak signals, since model averaging adaptively assigns weights to different candidate models. Based on the model averaging assisted sufficient dimension reduction methods, estimation of the structural dimension is further studied. Theoretical justifications are given, and empirical results show that the proposed methods compare favorably with classical sufficient dimension reduction methods and popular sparse sufficient dimension reduction methods.
AB - Sufficient dimension reduction, which replaces the original predictors with their low-dimensional linear combinations without loss of information, is a critical tool in modern statistics and has gained considerable research momentum over the past decades since the two pioneering methods, sliced inverse regression and principal Hessian directions. Classical sufficient dimension reduction methods do not handle the sparse case well, since the estimated linear reductions involve all of the original predictors. Sparse sufficient dimension reduction methods rely on a sparsity assumption that may not hold in practice. Motivated by the least squares formulations of classical sliced inverse regression and principal Hessian directions, several model averaging assisted sufficient dimension reduction methods are proposed. They are applicable to both dense and sparse cases, even with weak signals, since model averaging adaptively assigns weights to different candidate models. Based on the model averaging assisted sufficient dimension reduction methods, estimation of the structural dimension is further studied. Theoretical justifications are given, and empirical results show that the proposed methods compare favorably with classical sufficient dimension reduction methods and popular sparse sufficient dimension reduction methods.
KW - Jackknife model averaging
KW - Ladle estimator
KW - Mallows model averaging
KW - Principal Hessian directions
KW - Sliced inverse regression
KW - Sufficient dimension reduction
UR - https://www.scopus.com/pages/publications/85087202695
U2 - 10.1016/j.csda.2020.106993
DO - 10.1016/j.csda.2020.106993
M3 - Article
AN - SCOPUS:85087202695
SN - 0167-9473
VL - 152
JO - Computational Statistics and Data Analysis
JF - Computational Statistics and Data Analysis
M1 - 106993
ER -