HWSformer: History Window Serialization Based Transformer for Semantic Enrichment Driven Stock Market Prediction

Yisheng Hu, Guitao Cao, Dawei Cheng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

After the Transformer model demonstrated excellent performance in natural language processing (NLP) and computer vision tasks, researchers began exploring its use in time series prediction. Because of the stock market's significant role in the global economy, stock market prediction is of paramount importance for investors. Stock index forecasting is one branch of stock market forecasting, and researchers have likewise turned to the Transformer for this task. However, because time series data carry limited semantic information and because of the particular characteristics of the self-attention mechanism, the Transformer model has not gained widespread adoption in stock index forecasting. In this paper, we propose a history window serialization based Transformer model (HWSformer) specifically designed for predicting stock price indices. Our innovation is to introduce a history window serialization layer that addresses the limited semantic richness of time series data, which otherwise weakens the effectiveness of self-attention. Additionally, to capture the original data distribution accurately and retain valuable non-stationary information, we incorporate the Reversible Instance Normalization (RevIN) method. We conducted experiments on 12 stock price index datasets collected from multiple countries and demonstrated that HWSformer outperforms the traditional Transformer model by approximately 20% and achieves varying degrees of improvement over other recent Transformer variants.
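The sketch below illustrates, in broad strokes, the pipeline the abstract describes: RevIN normalizes each input series and later denormalizes the forecast, and a serialization step turns the raw history into richer tokens before a Transformer encoder. It is a minimal, hypothetical sketch, not the authors' implementation: the `HistoryWindowTokens` module assumes the serialization layer forms overlapping history-window tokens (a plausible reading of "history window serialization"), and all class names and hyperparameters (window=16, stride=8, d_model=64, etc.) are illustrative choices of ours. RevIN follows the published method of Kim et al. (2022).

```python
# Hypothetical sketch of the HWSformer pipeline described in the abstract.
# Assumptions (not from the paper): the serialization layer is modeled here as
# overlapping history-window tokenization; hyperparameters are illustrative.
import torch
import torch.nn as nn


class RevIN(nn.Module):
    """Reversible Instance Normalization (Kim et al., 2022)."""

    def __init__(self, num_features: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        self.gamma = nn.Parameter(torch.ones(num_features))
        self.beta = nn.Parameter(torch.zeros(num_features))

    def normalize(self, x):  # x: (batch, seq_len, num_features)
        self.mean = x.mean(dim=1, keepdim=True).detach()
        self.std = torch.sqrt(x.var(dim=1, keepdim=True, unbiased=False) + self.eps).detach()
        return (x - self.mean) / self.std * self.gamma + self.beta

    def denormalize(self, y):  # y: (batch, pred_len, num_features)
        return (y - self.beta) / (self.gamma + self.eps) * self.std + self.mean


class HistoryWindowTokens(nn.Module):
    """Hypothetical serialization: overlapping history windows -> token embeddings."""

    def __init__(self, num_features: int, window: int, stride: int, d_model: int):
        super().__init__()
        self.window, self.stride = window, stride
        self.proj = nn.Linear(window * num_features, d_model)

    def forward(self, x):  # x: (batch, seq_len, num_features)
        # Unfold the time axis into overlapping windows, then embed each window.
        patches = x.unfold(dimension=1, size=self.window, step=self.stride)
        patches = patches.permute(0, 1, 3, 2).flatten(start_dim=2)
        return self.proj(patches)  # (batch, num_windows, d_model)


class HWSformerSketch(nn.Module):
    def __init__(self, num_features=1, seq_len=96, pred_len=24,
                 window=16, stride=8, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.revin = RevIN(num_features)
        self.tokens = HistoryWindowTokens(num_features, window, stride, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        num_windows = (seq_len - window) // stride + 1
        self.head = nn.Linear(num_windows * d_model, pred_len * num_features)
        self.pred_len, self.num_features = pred_len, num_features

    def forward(self, x):  # x: (batch, seq_len, num_features)
        z = self.revin.normalize(x)                       # normalize per instance
        h = self.encoder(self.tokens(z)).flatten(start_dim=1)
        y = self.head(h).view(-1, self.pred_len, self.num_features)
        return self.revin.denormalize(y)                  # restore original scale


if __name__ == "__main__":
    model = HWSformerSketch()
    dummy = torch.randn(8, 96, 1)   # batch of 8 univariate index histories
    print(model(dummy).shape)       # torch.Size([8, 24, 1])
```

The design point this sketch tries to convey is the one the abstract argues: attention over single time steps sees little semantic content, so grouping the history into window-level tokens gives self-attention richer units to relate, while RevIN lets the model train on normalized data without discarding the non-stationary level information needed for the final forecast.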

Original language: English
Title of host publication: 2024 International Joint Conference on Neural Networks, IJCNN 2024 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798350359312
DOIs
State: Published - 2024
Event: 2024 International Joint Conference on Neural Networks, IJCNN 2024 - Yokohama, Japan
Duration: 30 Jun 2024 - 5 Jul 2024

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks

Conference

Conference: 2024 International Joint Conference on Neural Networks, IJCNN 2024
Country/Territory: Japan
City: Yokohama
Period: 30/06/24 - 5/07/24

Keywords

  • Time series data
  • Transformer-based
  • stock price indices forecasting
