Fitting jump additive models

Yicheng Kang, Yueyong Shi, Yuling Jiao, Wendong Li, Dongdong Xiang

Research output: Contribution to journal › Article › peer-review

Abstract

Jump regression analysis (JRA) provides a useful tool for estimating discontinuous functional relationships between a response and predictors. Most existing JRA methods consider problems with only one or two predictors, and it is unclear whether these methods can be directly extended to cases with multiple predictors. A jump additive model and a jump-preserving backfitting procedure are proposed. Like univariate JRA methods, jump additive models make no restrictive parametric assumptions and allow possible discontinuities in the functional relationships; unlike them, jump additive models easily accommodate multiple predictors, and the effect of each individual predictor on the response can still be visually interpreted regardless of the number of predictors. The proposed fitting procedure achieves the jump-preserving property by adaptively choosing, in each iteration of the backfitting algorithm, among two one-sided local linear estimates and a two-sided local linear estimate. Theoretical justifications and numerical studies show that it works well in applications. The procedure is also illustrated by analyzing a real data set.
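The adaptive, jump-preserving backfitting idea described in the abstract can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' exact estimator: the uniform-kernel windows, the single bandwidth h, the fixed iteration count, and the function names local_linear_fit, jump_preserving_smooth, and backfit_jump_additive are all choices made for the example. At each design point the component smoother keeps whichever of a left-sided, right-sided, or two-sided local linear fit has the smallest residual mean square, which mimics the adaptive choice described above.

```python
import numpy as np

def local_linear_fit(x0, x, y, h, side="both"):
    """Local linear estimate at x0 using a one-sided or two-sided
    uniform window of half-width h. Returns (fitted value, residual
    mean square of the local fit)."""
    d = x - x0
    if side == "left":
        mask = (d >= -h) & (d <= 0)
    elif side == "right":
        mask = (d >= 0) & (d <= h)
    else:
        mask = np.abs(d) <= h
    if mask.sum() < 3:                      # too few points for a stable fit
        return np.nan, np.inf
    X = np.column_stack([np.ones(mask.sum()), d[mask]])
    beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
    resid = y[mask] - X @ beta
    return beta[0], np.mean(resid ** 2)

def jump_preserving_smooth(x, y, h):
    """At each design point, keep whichever of the left-, right-, or
    two-sided local linear estimates has the smallest residual mean square."""
    fits = np.empty_like(y, dtype=float)
    for i, x0 in enumerate(x):
        cands = [local_linear_fit(x0, x, y, h, s) for s in ("left", "right", "both")]
        fits[i] = min(cands, key=lambda c: c[1])[0]
    return fits

def backfit_jump_additive(X, y, h, n_iter=20):
    """Backfitting for an additive model whose component smoothers are
    jump-preserving: each f_j is refit to its partial residuals in turn."""
    n, p = X.shape
    alpha = y.mean()
    F = np.zeros((n, p))
    for _ in range(n_iter):
        for j in range(p):
            partial = y - alpha - F.sum(axis=1) + F[:, j]
            fj = jump_preserving_smooth(X[:, j], partial, h)
            F[:, j] = fj - fj.mean()        # center each fitted component
    return alpha, F

# Toy usage: one smooth component and one component with a jump at x2 = 0.5.
rng = np.random.default_rng(0)
n = 300
X = rng.uniform(size=(n, 2))
y = np.sin(2 * np.pi * X[:, 0]) + 2.0 * (X[:, 1] > 0.5) + rng.normal(scale=0.2, size=n)
alpha, F = backfit_jump_additive(X, y, h=0.1)
```

In this sketch the one-sided fits dominate near a discontinuity (their windows do not straddle the jump, so their residual mean squares stay small), while the two-sided fit wins in smooth regions, which is what gives the procedure its jump-preserving behavior.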

Original language: English
Article number: 107266
Journal: Computational Statistics and Data Analysis
Volume: 162
DOIs
State: Published - Oct 2021

Keywords

  • Backfitting
  • Jump-preserving estimation
  • Nonparametric regression
  • Smoothing
  • Weighted residual mean squares
