MixRecLGB: Language-Enhanced Mixed Attention for Temporal Context Modeling in Time Series Forecasting

Yuxin Mei, Luxi Zhang, Li Han, Jing Liu*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Accurate time series forecasting demands effective integration of temporal dynamics and contextual semantics. While existing attention mechanisms capture numerical patterns effectively, they often neglect domain-specific temporal knowledge. We propose MixRecLGB, a novel framework that synergizes LightGBM with a language-enhanced mixed attention mechanism. Our key contributions include: 1) A recursive VAE architecture (RecLGB) that compresses long historical sequences into hierarchical memory features through progressive latent space learning; 2) A temporal-semantic fusion mechanism that injects frozen language model embeddings into both static linear attention and dynamic self-attention components, preserving temporal order while incorporating contextual knowledge; 3) A parameter-efficient integration strategy that enhances attention computation through adaptive bias injection and feature fusion, requiring minimal architectural modifications. Evaluated on five real-world datasets, MixRecLGB reduces forecasting errors while maintaining computational efficiency. This work establishes a new paradigm for combining deep temporal modeling with efficient gradient boosting, particularly effective for long-term forecasting scenarios.
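The abstract describes injecting frozen language-model embeddings into attention as an additive bias on the attention logits. The paper's exact formulation is not reproduced here; the following is only a minimal NumPy sketch of the general idea, where `E` stands in for precomputed (frozen) embeddings and `Wb` is a hypothetical trainable projection that turns semantic affinity into a bias added to the temporal attention scores.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_semantic_bias(X, E, Wq, Wk, Wv, Wb):
    """Scaled dot-product self-attention whose logits receive an
    additive bias derived from frozen language-model embeddings E."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    logits = Q @ K.T / np.sqrt(d)        # (T, T) temporal attention scores
    B = E @ Wb                           # project frozen embeddings
    bias = B @ B.T                       # (T, T) semantic affinity bias
    weights = softmax(logits + bias)     # fuse temporal and semantic signals
    return weights @ V

rng = np.random.default_rng(0)
T, d_x, d_e, d_h = 6, 8, 16, 8           # sequence length and feature dims
X = rng.normal(size=(T, d_x))            # numeric time-series features
E = rng.normal(size=(T, d_e))            # frozen LM embeddings (not updated)
Wq, Wk, Wv = (rng.normal(size=(d_x, d_h)) for _ in range(3))
Wb = rng.normal(size=(d_e, d_h)) * 0.1   # small bias projection (assumed)
out = attention_with_semantic_bias(X, E, Wq, Wk, Wv, Wb)
print(out.shape)
```

Because the bias enters only through the logits, the temporal ordering encoded in `Q @ K.T` is preserved, which matches the abstract's claim of "preserving temporal order while incorporating contextual knowledge"; the actual MixRecLGB fusion may differ in detail.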

Original language: English
Title of host publication: Engineering of Complex Computer Systems - 29th International Conference, ICECCS 2025, Proceedings
Editors: Yuan Zhou, Zuohua Ding, Sin G. Teo, Xiaofei Xie, Yang Liu
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 79-97
Number of pages: 19
ISBN (Print): 9783032008275
DOIs
State: Published - 2026
Event: 29th International Conference on Engineering of Complex Computer Systems, ICECCS 2025 - Hangzhou, China
Duration: 2 Jul 2025 - 4 Jul 2025

Publication series

Name: Lecture Notes in Computer Science
Volume: 15746 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 29th International Conference on Engineering of Complex Computer Systems, ICECCS 2025
Country/Territory: China
City: Hangzhou
Period: 2/07/25 - 4/07/25

Keywords

  • LightGBM
  • VAE
  • time series forecasting
