TY - JOUR
T1 - A Memory Guided Transformer for Time Series Forecasting
AU - Cheng, Yunyao
AU - Guo, Chenjuan
AU - Yang, Bin
AU - Yu, Haomin
AU - Zhao, Kai
AU - Jensen, Christian S.
N1 - Publisher Copyright:
© 2025, VLDB Endowment. All rights reserved.
PY - 2025
Y1 - 2025
N2 - Accurate long-term forecasting from multivariate time series has important real-world applications. However, achieving it is challenging: analyses reveal that time series spanning long durations often exhibit dynamic and disrupted correlations. State-of-the-art methods employ attention mechanisms to capture dynamic correlations, but they often do not contend well with disrupted correlations, which reduces prediction accuracy. We introduce the concepts of local and global information and leverage them in a Memory Guided Transformer, called the Memformer. By integrating patch-wise recurrent graph learning and global attention, the Memformer aims to capture dynamic correlations while taking disrupted correlations into account. We also integrate a so-called Alternating Memory Enhancer into the Memformer to capture correlations between local and global information. We report on experiments that offer insight into the effectiveness of the Memformer at capturing dynamic correlations and its robustness to disrupted correlations. The experiments offer evidence that the new method advances the state of the art in forecasting accuracy on real-world datasets.
AB - Accurate long-term forecasting from multivariate time series has important real-world applications. However, achieving it is challenging: analyses reveal that time series spanning long durations often exhibit dynamic and disrupted correlations. State-of-the-art methods employ attention mechanisms to capture dynamic correlations, but they often do not contend well with disrupted correlations, which reduces prediction accuracy. We introduce the concepts of local and global information and leverage them in a Memory Guided Transformer, called the Memformer. By integrating patch-wise recurrent graph learning and global attention, the Memformer aims to capture dynamic correlations while taking disrupted correlations into account. We also integrate a so-called Alternating Memory Enhancer into the Memformer to capture correlations between local and global information. We report on experiments that offer insight into the effectiveness of the Memformer at capturing dynamic correlations and its robustness to disrupted correlations. The experiments offer evidence that the new method advances the state of the art in forecasting accuracy on real-world datasets.
UR - https://www.scopus.com/pages/publications/86000005823
U2 - 10.14778/3705829.3705842
DO - 10.14778/3705829.3705842
M3 - Conference article
AN - SCOPUS:86000005823
SN - 2150-8097
VL - 18
SP - 239
EP - 252
JO - Proceedings of the VLDB Endowment
JF - Proceedings of the VLDB Endowment
IS - 2
T2 - 51st International Conference on Very Large Data Bases, VLDB 2025
Y2 - 1 September 2025 through 5 September 2025
ER -