TY - GEN
T1 - HWSformer
T2 - 2024 International Joint Conference on Neural Networks, IJCNN 2024
AU - Hu, Yisheng
AU - Cao, Guitao
AU - Cheng, Dawei
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - After the Transformer model demonstrated excellent performance in natural language processing (NLP) and computer vision tasks, researchers began to explore the use of Transformer models for time series forecasting. Because of the stock market's significant role in the global economy, stock market prediction is of paramount importance for investors. Stock index forecasting is one such field, and researchers have likewise turned to the Transformer. However, because of the limited semantic information available in time series data and the characteristics of the self-attention mechanism, the Transformer model has not gained widespread adoption in stock index forecasting. In this paper, we propose a history window serialization based Transformer model (HWSformer) specifically designed for predicting stock price indices. Our innovation is to introduce a historical window serialization layer that addresses the limited semantic richness of time series data, which otherwise undermines the effectiveness of self-attention. Additionally, to capture the original distribution accurately and retain valuable non-stationary information, we incorporate the Reversible Instance Normalization (RevIN) method. Experiments on 12 stock price index datasets collected from multiple countries demonstrate that HWSformer outperforms traditional Transformer models by approximately 20% and achieves varying degrees of improvement over other recent Transformer variants.
AB - After the Transformer model demonstrated excellent performance in natural language processing (NLP) and computer vision tasks, researchers began to explore the use of Transformer models for time series forecasting. Because of the stock market's significant role in the global economy, stock market prediction is of paramount importance for investors. Stock index forecasting is one such field, and researchers have likewise turned to the Transformer. However, because of the limited semantic information available in time series data and the characteristics of the self-attention mechanism, the Transformer model has not gained widespread adoption in stock index forecasting. In this paper, we propose a history window serialization based Transformer model (HWSformer) specifically designed for predicting stock price indices. Our innovation is to introduce a historical window serialization layer that addresses the limited semantic richness of time series data, which otherwise undermines the effectiveness of self-attention. Additionally, to capture the original distribution accurately and retain valuable non-stationary information, we incorporate the Reversible Instance Normalization (RevIN) method. Experiments on 12 stock price index datasets collected from multiple countries demonstrate that HWSformer outperforms traditional Transformer models by approximately 20% and achieves varying degrees of improvement over other recent Transformer variants.
KW - Time series data
KW - Transformer-based
KW - stock price indices forecasting
UR - https://www.scopus.com/pages/publications/85204963416
U2 - 10.1109/IJCNN60899.2024.10650506
DO - 10.1109/IJCNN60899.2024.10650506
M3 - Conference contribution
AN - SCOPUS:85204963416
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2024 International Joint Conference on Neural Networks, IJCNN 2024 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 30 June 2024 through 5 July 2024
ER -