
LightGTS: A Lightweight General Time Series Forecasting Model

  • Yihang Wang
  • Yuying Qiu
  • Peng Chen
  • Yang Shu
  • Zhongwen Rao
  • Lujia Pan
  • Bin Yang
  • Chenjuan Guo*
  • *Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

Existing works on general time series forecasting build foundation models with heavy model parameters through large-scale multi-source pre-training. These models achieve superior generalization ability across various datasets at the cost of significant computational burdens and limitations in resource-constrained scenarios. This paper introduces LightGTS, a lightweight general time series forecasting model designed from the perspective of consistent periodical modeling. To handle diverse scales and intrinsic periods in multi-source pre-training, we introduce Periodical Tokenization, which extracts consistent periodic patterns across different datasets with varying scales. To better utilize the periodicity in the decoding process, we further introduce Periodical Parallel Decoding, which leverages historical tokens to improve forecasting. Based on these two techniques, which fully leverage the inductive bias of periods inherent in time series, LightGTS uses a lightweight model to achieve outstanding performance on general time series forecasting. It achieves state-of-the-art forecasting performance on 9 real-world benchmarks in both zero-shot and full-shot settings, with much better efficiency than existing time series foundation models.
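The abstract does not specify how Periodical Tokenization works internally. As a purely illustrative assumption (not the paper's actual method), the general idea of period-aligned patching — segmenting a series into tokens of one intrinsic period each, so tokens stay comparable across datasets with different sampling scales — could be sketched like this:

```python
import numpy as np

def estimate_period(x: np.ndarray) -> int:
    """Estimate the dominant period via the FFT amplitude spectrum.
    (Illustrative heuristic; the paper's own estimator may differ.)"""
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(len(x))
    k = spectrum[1:].argmax() + 1          # skip the DC bin
    return max(1, round(1.0 / freqs[k]))

def periodic_tokenize(x: np.ndarray) -> np.ndarray:
    """Split a series into non-overlapping patches of one period each,
    so each token covers one full cycle regardless of sampling rate."""
    p = estimate_period(x)
    n = (len(x) // p) * p                  # drop the incomplete tail
    return x[:n].reshape(-1, p)

# Example: two weeks of hourly data with a daily (24-step) cycle
t = np.arange(24 * 14)
series = np.sin(2 * np.pi * t / 24) \
    + 0.05 * np.random.default_rng(0).standard_normal(len(t))
tokens = periodic_tokenize(series)
print(tokens.shape)   # → (14, 24): one token per daily cycle
```

Under this framing, a series sampled every 15 minutes (period 96) and one sampled hourly (period 24) would both yield one-cycle tokens, which is one plausible way to obtain consistent periodic patterns across scales.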

Original language: English
Pages (from-to): 64109-64126
Number of pages: 18
Journal: Proceedings of Machine Learning Research
Volume: 267
Publication status: Published - 2025
Event: 42nd International Conference on Machine Learning, ICML 2025 - Vancouver, Canada
Duration: 13 Jul 2025 → 19 Jul 2025

