A Memory Guided Transformer for Time Series Forecasting

Yunyao Cheng, Chenjuan Guo*, Bin Yang, Haomin Yu, Kai Zhao, Christian S. Jensen

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

3 Scopus citations

Abstract

Accurate long-term forecasting from multivariate time series has important real-world applications. However, achieving this is challenging: analyses reveal that time series spanning long durations often exhibit dynamic and disrupted correlations. State-of-the-art methods employ attention mechanisms to capture dynamic correlations, but they often do not contend well with disrupted correlations, which reduces prediction accuracy. We introduce local and global information concepts and then leverage these in a Memory Guided Transformer, called the Memformer. By integrating patch-wise recurrent graph learning and global attention, the Memformer aims to capture dynamic correlations while taking disrupted correlations into account. We also integrate a so-called Alternating Memory Enhancer into the Memformer to capture correlations between local and global information. We report on experiments that offer insight into the effectiveness of the Memformer at capturing dynamic correlations and its robustness to disrupted correlations. The experiments offer evidence that the new method advances the state of the art in forecasting accuracy on real-world datasets.
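The abstract names three ingredients (patch-wise recurrent modeling, global attention, and a memory component) without detailing how they fit together. Below is a minimal, hypothetical PyTorch sketch of one plausible combination: each patch is encoded recurrently, a learned memory bank is concatenated with the resulting patch tokens, and global attention lets every patch consult both other patches and the memory. All names (MemoryGuidedBlock, n_mem, etc.) are assumptions for illustration; this is not the paper's actual Memformer implementation.

    # Hypothetical sketch: local recurrent patch encoding + memory-guided
    # global attention, loosely following the abstract's description.
    import torch
    import torch.nn as nn

    class MemoryGuidedBlock(nn.Module):
        """Toy block: per-patch recurrent encoder, a learned memory bank,
        and global attention over patch tokens plus memory slots."""
        def __init__(self, d_model: int, n_heads: int, n_mem: int):
            super().__init__()
            self.local = nn.GRU(d_model, d_model, batch_first=True)   # local encoder per patch
            self.mem = nn.Parameter(torch.randn(n_mem, d_model))      # learned global memory slots
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            self.norm = nn.LayerNorm(d_model)

        def forward(self, x):
            # x: (batch, n_patches, patch_len, d_model)
            b, p, l, d = x.shape
            local, _ = self.local(x.reshape(b * p, l, d))   # encode each patch recurrently
            tokens = local[:, -1].reshape(b, p, d)          # one summary token per patch
            mem = self.mem.unsqueeze(0).expand(b, -1, -1)   # broadcast memory to the batch
            ctx = torch.cat([tokens, mem], dim=1)           # patches can attend to memory too
            out, _ = self.attn(tokens, ctx, ctx)            # global attention across patches + memory
            return self.norm(tokens + out)

    # Smoke test with random data.
    if __name__ == "__main__":
        block = MemoryGuidedBlock(d_model=32, n_heads=4, n_mem=8)
        y = block(torch.randn(2, 10, 16, 32))
        print(y.shape)  # torch.Size([2, 10, 32])

Concatenating memory slots with patch tokens before attention is one simple way to let global state guide the attention computation; the paper's Alternating Memory Enhancer and patch-wise recurrent graph learning may realize this quite differently.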

Original language: English
Pages (from-to): 239-252
Number of pages: 14
Journal: Proceedings of the VLDB Endowment
Volume: 18
Issue number: 2
DOIs
State: Published - 2025
Event: 51st International Conference on Very Large Data Bases, VLDB 2025, London, United Kingdom
Duration: 1 Sep 2025 – 5 Sep 2025
