TY - JOUR
T1 - Fitting jump additive models
AU - Kang, Yicheng
AU - Shi, Yueyong
AU - Jiao, Yuling
AU - Li, Wendong
AU - Xiang, Dongdong
N1 - Publisher Copyright:
© 2021 Elsevier B.V.
PY - 2021/10
Y1 - 2021/10
N2 - Jump regression analysis (JRA) provides a useful tool for estimating discontinuous functional relationships between a response and predictors. Most existing JRA methods consider problems with only one or two predictors, and it is unclear whether these methods can be directly extended to cases with multiple predictors. A jump additive model and a jump-preserving backfitting procedure are proposed. Jump additive models have the appeal that they make no restrictive parametric assumptions and allow possible discontinuities in the functional relationships, as with univariate JRA methods; but unlike those methods, jump additive models easily accommodate multiple predictors, and the effects of individual predictors on the response can still be visually interpreted, regardless of the number of predictors. The proposed fitting procedure achieves the jump-preserving property by adaptively choosing, in each iteration of the backfitting algorithm, among two one-sided local linear estimates and a two-sided local linear estimate. Theoretical justifications and numerical studies show that it works well in applications. The procedure is also illustrated in the analysis of a real data set.
AB - Jump regression analysis (JRA) provides a useful tool for estimating discontinuous functional relationships between a response and predictors. Most existing JRA methods consider problems with only one or two predictors, and it is unclear whether these methods can be directly extended to cases with multiple predictors. A jump additive model and a jump-preserving backfitting procedure are proposed. Jump additive models have the appeal that they make no restrictive parametric assumptions and allow possible discontinuities in the functional relationships, as with univariate JRA methods; but unlike those methods, jump additive models easily accommodate multiple predictors, and the effects of individual predictors on the response can still be visually interpreted, regardless of the number of predictors. The proposed fitting procedure achieves the jump-preserving property by adaptively choosing, in each iteration of the backfitting algorithm, among two one-sided local linear estimates and a two-sided local linear estimate. Theoretical justifications and numerical studies show that it works well in applications. The procedure is also illustrated in the analysis of a real data set.
KW - Backfitting
KW - Jump-preserving estimation
KW - Nonparametric regression
KW - Smoothing
KW - Weighted residual mean squares
UR - https://www.scopus.com/pages/publications/85106211745
U2 - 10.1016/j.csda.2021.107266
DO - 10.1016/j.csda.2021.107266
M3 - Article
AN - SCOPUS:85106211745
SN - 0167-9473
VL - 162
JO - Computational Statistics and Data Analysis
JF - Computational Statistics and Data Analysis
M1 - 107266
ER -