TY - JOUR
T1 - Towards a General Time Series Forecasting Model with Unified Representation and Adaptive Transfer
AU - Wang, Yihang
AU - Qiu, Yuying
AU - Chen, Peng
AU - Zhao, Kai
AU - Shu, Yang
AU - Rao, Zhongwen
AU - Pan, Lujia
AU - Yang, Bin
AU - Guo, Chenjuan
N1 - Publisher Copyright:
© 2025 by the author(s).
PY - 2025
Y1 - 2025
N2 - With the growing availability of multi-domain time series data, there is an increasing demand for general forecasting models pre-trained on multi-source datasets to support diverse downstream prediction scenarios. Existing time series foundation models primarily focus on scaling up pre-training datasets and model sizes to enhance generalization performance. In this paper, we take a different approach by addressing two critical aspects of general forecasting models: (1) how to derive unified representations from heterogeneous multi-domain time series data, and (2) how to effectively capture domain-specific features to enable adaptive transfer across various downstream scenarios. To address the first aspect, we propose Decomposed Frequency Learning as the pre-training task, which leverages frequency-based masking and reconstruction to decompose coupled semantic information in time series, resulting in unified representations across domains. For the second aspect, we introduce the Time Series Register, which captures domain-specific representations during pre-training and enhances adaptive transferability to downstream tasks. Our model achieves state-of-the-art forecasting performance on seven real-world benchmarks, demonstrating remarkable few-shot and zero-shot capabilities.
AB - With the growing availability of multi-domain time series data, there is an increasing demand for general forecasting models pre-trained on multi-source datasets to support diverse downstream prediction scenarios. Existing time series foundation models primarily focus on scaling up pre-training datasets and model sizes to enhance generalization performance. In this paper, we take a different approach by addressing two critical aspects of general forecasting models: (1) how to derive unified representations from heterogeneous multi-domain time series data, and (2) how to effectively capture domain-specific features to enable adaptive transfer across various downstream scenarios. To address the first aspect, we propose Decomposed Frequency Learning as the pre-training task, which leverages frequency-based masking and reconstruction to decompose coupled semantic information in time series, resulting in unified representations across domains. For the second aspect, we introduce the Time Series Register, which captures domain-specific representations during pre-training and enhances adaptive transferability to downstream tasks. Our model achieves state-of-the-art forecasting performance on seven real-world benchmarks, demonstrating remarkable few-shot and zero-shot capabilities.
UR - https://www.scopus.com/pages/publications/105023504780
M3 - Conference article
AN - SCOPUS:105023504780
SN - 2640-3498
VL - 267
SP - 64127
EP - 64151
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
T2 - 42nd International Conference on Machine Learning, ICML 2025
Y2 - 13 July 2025 through 19 July 2025
ER -