TY - GEN
T1 - Paeformer
T2 - 28th European Conference on Artificial Intelligence, ECAI 2025, including 14th Conference on Prestigious Applications of Intelligent Systems, PAIS 2025
AU - Liu, Kun
AU - Duan, Zhongjie
AU - Chen, Cen
N1 - Publisher Copyright:
© 2025 The Authors.
PY - 2025/10/21
Y1 - 2025/10/21
N2 - Time series forecasting plays a critical role in various real-world applications, such as finance, climate science, and transportation. However, most existing studies adopt a channel-independent strategy, which, while avoiding the ambiguity of projecting multiple variates into indistinguishable channels, often neglects the cross-variate dependencies inherent in multivariate time series. This oversight limits the upper bound of forecasting accuracy. Therefore, effectively leveraging cross-variate relationships to obtain more expressive representations is a crucial yet underexplored challenge in time series forecasting. In this paper, we propose Paeformer, a novel model that captures generalized representations of time series patches by exploiting local cross-variate dependencies and applying implicit regularization via an overcomplete autoencoder framework. Specifically, we introduce a patch-based autoencoder composed of a Transformer-based encoder and an MLP-based decoder. The encoder captures local dependencies across variates, while the reconstruction loss computed on each patch is integrated into the overall loss function. This promotes consistent training between the encoder and decoder, and serves as an implicit regularization to constrain the high-dimensional representations of patches. Moreover, we replace the traditional feedforward decoding process with a novel patch-wise decoding mechanism, establishing a new paradigm of recurrent encoding and decoding based on patch-wise sequences. Experimental results on eight benchmark multivariate time series datasets demonstrate that Paeformer consistently outperforms all baseline methods, achieving state-of-the-art performance. Our code is publicly available at: https://github.com/iuaku/Paeformer.
AB - Time series forecasting plays a critical role in various real-world applications, such as finance, climate science, and transportation. However, most existing studies adopt a channel-independent strategy, which, while avoiding the ambiguity of projecting multiple variates into indistinguishable channels, often neglects the cross-variate dependencies inherent in multivariate time series. This oversight limits the upper bound of forecasting accuracy. Therefore, effectively leveraging cross-variate relationships to obtain more expressive representations is a crucial yet underexplored challenge in time series forecasting. In this paper, we propose Paeformer, a novel model that captures generalized representations of time series patches by exploiting local cross-variate dependencies and applying implicit regularization via an overcomplete autoencoder framework. Specifically, we introduce a patch-based autoencoder composed of a Transformer-based encoder and an MLP-based decoder. The encoder captures local dependencies across variates, while the reconstruction loss computed on each patch is integrated into the overall loss function. This promotes consistent training between the encoder and decoder, and serves as an implicit regularization to constrain the high-dimensional representations of patches. Moreover, we replace the traditional feedforward decoding process with a novel patch-wise decoding mechanism, establishing a new paradigm of recurrent encoding and decoding based on patch-wise sequences. Experimental results on eight benchmark multivariate time series datasets demonstrate that Paeformer consistently outperforms all baseline methods, achieving state-of-the-art performance. Our code is publicly available at: https://github.com/iuaku/Paeformer.
UR - https://www.scopus.com/pages/publications/105024440467
U2 - 10.3233/FAIA251135
DO - 10.3233/FAIA251135
M3 - Conference contribution
AN - SCOPUS:105024440467
T3 - Frontiers in Artificial Intelligence and Applications
SP - 2794
EP - 2801
BT - ECAI 2025 - 28th European Conference on Artificial Intelligence, including 14th Conference on Prestigious Applications of Intelligent Systems, PAIS 2025 - Proceedings
A2 - Lynce, Ines
A2 - Murano, Nello
A2 - Vallati, Mauro
A2 - Villata, Serena
A2 - Chesani, Federico
A2 - Milano, Michela
A2 - Omicini, Andrea
A2 - Dastani, Mehdi
PB - IOS Press BV
Y2 - 25 October 2025 through 30 October 2025
ER -