TY - JOUR
T1 - Transfer learning for high-dimensional data with heavy-tailed noise
T2 - A sparse convoluted rank regression method
AU - Yan, Yibo
AU - Ma, Qianli
AU - Zhang, Riquan
AU - Wang, Xiaozhou
N1 - Publisher Copyright:
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2025.
PY - 2026/2
Y1 - 2026/2
N2 - Transfer learning can leverage information from source domains to improve the estimation or prediction accuracy of a target task. For the high-dimensional linear regression model with sub-Gaussian noise, the so-called Trans-Lasso algorithm has been proposed to boost learning performance on the target domain. However, such an algorithm may not yield efficient estimates when the errors are heavy-tailed. In this paper, we investigate penalized convoluted rank regression (CRR) under the transfer learning framework, aiming to provide robust estimators in the presence of heavy-tailed noise. The convolution smoothing technique improves the smoothness of the loss function without introducing any bias. In the high-dimensional setting, we first propose a transfer learning algorithm for penalized CRR models with known transferable sources and establish ℓ2/ℓ1-estimation error bounds for the corresponding estimators. In addition, we propose a transferable-source detection method to select informative sources and verify its consistency. Finally, we demonstrate the validity and effectiveness of the proposed methods on simulated data and a real-world dataset concerning associations among gene expressions.
AB - Transfer learning can leverage information from source domains to improve the estimation or prediction accuracy of a target task. For the high-dimensional linear regression model with sub-Gaussian noise, the so-called Trans-Lasso algorithm has been proposed to boost learning performance on the target domain. However, such an algorithm may not yield efficient estimates when the errors are heavy-tailed. In this paper, we investigate penalized convoluted rank regression (CRR) under the transfer learning framework, aiming to provide robust estimators in the presence of heavy-tailed noise. The convolution smoothing technique improves the smoothness of the loss function without introducing any bias. In the high-dimensional setting, we first propose a transfer learning algorithm for penalized CRR models with known transferable sources and establish ℓ2/ℓ1-estimation error bounds for the corresponding estimators. In addition, we propose a transferable-source detection method to select informative sources and verify its consistency. Finally, we demonstrate the validity and effectiveness of the proposed methods on simulated data and a real-world dataset concerning associations among gene expressions.
KW - Convolution-based smoothing
KW - Detection consistency
KW - Heavy-tailed noise
KW - High-dimensional dataset
KW - Transfer learning
KW - ℓ1-penalized rank regression
UR - https://www.scopus.com/pages/publications/105025424425
U2 - 10.1007/s11222-025-10803-7
DO - 10.1007/s11222-025-10803-7
M3 - Article
AN - SCOPUS:105025424425
SN - 0960-3174
VL - 36
JO - Statistics and Computing
JF - Statistics and Computing
IS - 1
M1 - 45
ER -