TY - JOUR
T1 - Graph-based Square-Root Estimation for Sparse Linear Regression
AU - Li, Peili
AU - Li, Zhuomei
AU - Xiao, Yunhai
AU - Ying, Chao
AU - Yu, Zhou
N1 - Publisher Copyright:
© 2025 American Statistical Association and Institute of Mathematical Statistics.
PY - 2025
Y1 - 2025
N2 - Sparse linear regression is one of the classic problems in statistics, with deep connections to optimization, computation, and machine learning. To address the effective handling of high-dimensional data, the diversity of real noise, and the challenges of estimating the standard deviation of the noise, we propose a novel and general graph-based square-root estimation (GSRE) model for sparse linear regression. Specifically, we use the square-root loss function to make the estimators independent of the unknown standard deviation of the error terms, and we design a sparse regularization term that exploits the graphical structure among predictors in a node-by-node form. Based on predictor graphs with special structure, we highlight the generality of the model by showing that it is equivalent to several classic regression models. Theoretically, we analyze the finite-sample bounds, asymptotic normality, and model selection consistency of the GSRE method without relying on the standard deviation of the error terms. In terms of computation, we employ the fast and efficient alternating direction method of multipliers. Finally, on a large number of simulated and real datasets with various types of noise, we demonstrate the performance advantages of the proposed method in estimation, prediction, and model selection. Supplementary materials for this article are available online.
AB - Sparse linear regression is one of the classic problems in statistics, with deep connections to optimization, computation, and machine learning. To address the effective handling of high-dimensional data, the diversity of real noise, and the challenges of estimating the standard deviation of the noise, we propose a novel and general graph-based square-root estimation (GSRE) model for sparse linear regression. Specifically, we use the square-root loss function to make the estimators independent of the unknown standard deviation of the error terms, and we design a sparse regularization term that exploits the graphical structure among predictors in a node-by-node form. Based on predictor graphs with special structure, we highlight the generality of the model by showing that it is equivalent to several classic regression models. Theoretically, we analyze the finite-sample bounds, asymptotic normality, and model selection consistency of the GSRE method without relying on the standard deviation of the error terms. In terms of computation, we employ the fast and efficient alternating direction method of multipliers. Finally, on a large number of simulated and real datasets with various types of noise, we demonstrate the performance advantages of the proposed method in estimation, prediction, and model selection. Supplementary materials for this article are available online.
KW - Alternating direction method of multipliers
KW - Graphical structure among predictors
KW - Oracle property
KW - Sparse linear regression
KW - Square-root-loss function
UR - https://www.scopus.com/pages/publications/105023899156
U2 - 10.1080/10618600.2025.2571164
DO - 10.1080/10618600.2025.2571164
M3 - Article
AN - SCOPUS:105023899156
SN - 1061-8600
JO - Journal of Computational and Graphical Statistics
JF - Journal of Computational and Graphical Statistics
ER -