RecLGB: Enhancing LightGBM using Recursive VAE with Mixed Attention for Time-Series Forecasting

  • Yuxin Mei
  • Xu Han
  • Zhongming Han
  • Li Han
  • Jing Liu*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Time-series forecasting demands efficient modeling of long-range dependencies while maintaining computational practicality. Current methods, particularly deep learning approaches, often struggle with complexity and scalability when handling extensive historical sequences. We propose RecLGB, a hybrid framework that combines LightGBM's efficiency with a memory-augmented deep architecture. At its core, RecLGB integrates a recursive Variational Autoencoder (VAE) enhanced by a mixed attention mechanism, which preserves temporal order through linear inductive biases while dynamically capturing dependencies via self-attention. The recursive VAE compresses lengthy historical sequences into compact hierarchical representations, which serve as an external memory that LightGBM can leverage without computational overload. Evaluated on five real-world datasets, RecLGB achieves higher accuracy and faster inference than Transformer-based baselines. This work bridges deep sequential modeling with gradient-boosted trees, offering a scalable, interpretable solution for resource-constrained forecasting. Code is available at: https://github.com/Mayer-myx/RecLGB.
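The abstract's core idea — an encoder compressing long history windows into compact latent codes that a gradient-boosted model then consumes as features — can be illustrated with a minimal, hypothetical sketch. This is not RecLGB's actual architecture (see the paper and repository for that); it uses untrained random weights and a single linear encoder purely to show the data flow, with the VAE reparameterization trick as the compression step:

```python
import numpy as np

rng = np.random.default_rng(0)

def vae_encode(windows, W_mu, W_logvar, rng):
    """Map history windows to compact latent codes via the
    reparameterization trick: z = mu + sigma * eps."""
    mu = windows @ W_mu
    logvar = windows @ W_logvar
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

# Toy long series split into non-overlapping history windows.
series = rng.standard_normal(1024)
win_len, latent_dim = 64, 8
windows = series.reshape(-1, win_len)        # (16, 64)

# Stand-in for learned encoder weights (untrained, illustrative only).
W_mu = rng.standard_normal((win_len, latent_dim)) * 0.1
W_logvar = rng.standard_normal((win_len, latent_dim)) * 0.01

# Each 64-step window is compressed to an 8-dim latent code; in the
# paper's framing, these codes act as an "external memory" of features
# that LightGBM would take as tabular input.
codes = vae_encode(windows, W_mu, W_logvar, rng)
print(codes.shape)                           # (16, 8)
```

In the full method these codes would come from the trained recursive VAE with mixed attention, and LightGBM would be fit on them alongside any other tabular features; the sketch only shows the compression-to-features handoff.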

Original language: English
Title of host publication: International Joint Conference on Neural Networks, IJCNN 2025 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798331510428
DOIs
State: Published - 2025
Event: 2025 International Joint Conference on Neural Networks, IJCNN 2025 - Rome, Italy
Duration: 30 Jun 2025 – 5 Jul 2025

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
ISSN (Print): 2161-4393
ISSN (Electronic): 2161-4407

Conference

Conference: 2025 International Joint Conference on Neural Networks, IJCNN 2025
Country/Territory: Italy
City: Rome
Period: 30/06/25 – 5/07/25

Keywords

  • attention mechanism
  • LightGBM
  • time series forecasting
  • Transformer
  • VAE
