Robust and computationally efficient gradient-based estimation

Yibo Yan, Xiaozhou Wang, Riquan Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we propose a class of estimators based on robust and computationally efficient gradient estimation for both low- and high-dimensional risk minimization frameworks. The gradient estimates in this work are constructed from a series of newly proposed univariate robust and efficient mean estimators. Our proposed estimators are obtained iteratively via a variant of gradient descent in which the update direction is a robust and computationally efficient gradient estimate. These estimators not only have explicit expressions and can be computed through elementary arithmetic operations, but are also robust to arbitrary outliers in common statistical models. Theoretically, we establish the convergence of the algorithms and derive non-asymptotic error bounds for the iterative estimators. Specifically, we apply our methods to linear and logistic regression models, obtaining robust parameter estimates and corresponding excess risk bounds. Unlike previous work, our theoretical results depend on a magnitude function of the outliers, which captures the extent of their deviation from the inliers. Finally, we present extensive simulation experiments on both low- and high-dimensional linear models to demonstrate the superior performance of our proposed estimators compared to several baseline methods.
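The iterative scheme described above — gradient descent whose update direction is a robust, coordinate-wise aggregate of the per-sample gradients — can be sketched for a linear regression model. The coordinate-wise trimmed mean below is a hypothetical stand-in for the paper's univariate robust mean estimators, not the authors' actual construction; function names, the learning rate, and the trimming fraction are illustrative choices.

```python
import numpy as np

def trimmed_mean(x, trim=0.1):
    """Coordinate-wise trimmed mean: drop the most extreme fraction
    `trim` of values at each tail of each column, then average.
    (Illustrative robust mean; the paper proposes its own estimators.)"""
    k = int(len(x) * trim)
    x_sorted = np.sort(x, axis=0)
    return x_sorted[k:len(x) - k].mean(axis=0)

def robust_gd_linear(X, y, steps=500, lr=0.1, trim=0.1):
    """Gradient descent for least-squares linear regression where the
    update direction is a robust (trimmed-mean) aggregate of the
    per-sample gradients instead of the plain sample mean, so a small
    fraction of gross outliers cannot dominate the update."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(steps):
        residuals = X @ beta - y        # shape (n,)
        grads = residuals[:, None] * X  # per-sample gradients, shape (n, d)
        beta -= lr * trimmed_mean(grads, trim)
    return beta

# Simulated check: contaminate 5% of responses with gross outliers.
rng = np.random.default_rng(0)
n, d = 500, 3
X = rng.normal(size=(n, d))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=n)
y[:25] += 50.0  # arbitrary outliers
beta_hat = robust_gd_linear(X, y)
```

In this sketch the outlier gradients land in the trimmed tails of each coordinate, so the update direction stays close to the clean-data gradient and the iterates converge near the true parameter, whereas ordinary least squares on the same contaminated data would be badly biased.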

Original language: English
Article number: 106351
Journal: Journal of Statistical Planning and Inference
Volume: 242
DOIs
State: Published - May 2026

Keywords

  • Gradient descent
  • Hard thresholding
  • Non-asymptotic error bound
  • Outlier
  • Robustness
