DEEP NONLINEAR SUFFICIENT DIMENSION REDUCTION

  • Yinfeng Chen
  • Yuling Jiao
  • Rui Qiu
  • Zhou Yu

Research output: Contribution to journal › Article › peer-review


Abstract

Linear sufficient dimension reduction, as exemplified by sliced inverse regression, has seen substantial development over the past thirty years. However, with the advent of more complex scenarios, nonlinear dimension reduction has recently gained considerable interest. This paper introduces a novel method for nonlinear sufficient dimension reduction that combines the generalized martingale difference divergence measure with deep neural networks. The optimal solution of the proposed objective function is shown to be unbiased at the general level of σ-fields, and two optimization schemes based on deep neural networks offer greater efficiency and flexibility than the classical eigendecomposition of linear operators. Moreover, we systematically investigate the slow and fast rates for the estimation error using advanced U-process theory. Remarkably, the fast rate nearly coincides with the minimax rate of nonparametric regression. The validity of our deep nonlinear sufficient dimension reduction methods is demonstrated through simulations and real data analysis.
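To give a concrete sense of the dependence measure underlying the method, the sketch below computes a plug-in estimate of the (scalar-response, non-generalized) martingale difference divergence, MDD²(Y|X) = −E[(Y − EY)(Y′ − EY′)‖X − X′‖], which quantifies how strongly the conditional mean of Y depends on X. This is an illustrative precursor of the paper's generalized measure, not the authors' implementation; the function name and setup are assumptions.

```python
import numpy as np

def empirical_mdd_sq(X, Y):
    """Illustrative plug-in estimate of the squared martingale difference
    divergence MDD^2(Y|X) = -E[(Y-EY)(Y'-EY')||X-X'||].
    Larger values indicate stronger conditional-mean dependence of Y on X."""
    X = np.atleast_2d(X)                  # shape (n, p)
    Yc = Y - Y.mean()                     # center the response
    # Pairwise Euclidean distances ||X_i - X_j||
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    n = len(Y)
    # V-statistic plug-in; nonnegative because the centered negative
    # Euclidean distance matrix is positive semidefinite.
    return -(Yc[:, None] * Yc[None, :] * D).sum() / n**2

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
Y_dep = X[:, 0] ** 2 + 0.1 * rng.normal(size=200)  # mean depends on X
Y_indep = rng.normal(size=200)                     # independent of X
# empirical_mdd_sq(X, Y_dep) should be clearly larger than
# empirical_mdd_sq(X, Y_indep), which is near zero.
```

In the paper's setting, the reduction itself is a neural network f, and an objective built from a generalized version of this divergence is optimized over f rather than computed once for fixed data.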

Original language: English
Pages (from-to): 1201-1226
Number of pages: 26
Journal: Annals of Statistics
Volume: 52
Issue number: 3
DOIs
State: Published - 1 Jun 2024

Keywords

  • Sufficient dimension reduction
  • U-process
  • deep neural networks
  • generalized martingale difference divergence
