Improving Sequential Recommendations via Bidirectional Temporal Data Augmentation With Pre-Training

  • Juyong Jiang
  • Peiyan Zhang
  • Yingtao Luo
  • Chaozhuo Li*
  • Jae Boum Kim
  • Kai Zhang*
  • Senzhang Wang
  • Sunghun Kim
  • Philip S. Yu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Sequential recommendation systems are integral to discerning temporal user preferences. Yet, learning from short user interaction sequences poses a notable challenge. Data augmentation has been identified as a potent strategy to enhance the informational richness of these sequences. Traditional augmentation techniques, such as item randomization, may disrupt the inherent temporal dynamics. Although recent advances in reverse chronological pseudo-item generation have shown promise, they can introduce temporal discrepancies when the augmented sequences are assessed in their natural chronological order. In response, we introduce Bidirectional temporal data Augmentation with pre-training (BARec). Our approach leverages bidirectional temporal augmentation and knowledge-enhanced fine-tuning to synthesize authentic pseudo-prior items that retain user preferences and capture deeper item semantic correlations, thereby boosting the model's expressive power. A comprehensive experimental analysis on five benchmark datasets confirms the superiority of BARec for both short and long sequences. Moreover, a theoretical examination and case studies offer further insight into the model's reasoning and interpretability.
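The core augmentation idea in the abstract can be illustrated with a minimal sketch: a reverse-direction predictor is queried on the sequence in reverse chronological order, and its outputs are prepended as pseudo-prior items, extending short sequences backwards in time. The `reverse_model` interface and `augment_bidirectional` helper below are hypothetical stand-ins for the pre-trained reverse predictor described in the paper, not its actual implementation.

```python
def augment_bidirectional(sequence, reverse_model, k=2):
    """Prepend k pseudo-prior items predicted by a reverse-direction model.

    `reverse_model` is any callable mapping a reversed prefix to the item
    that most plausibly preceded it (hypothetical interface standing in for
    the pre-trained reverse predictor).
    """
    augmented = list(sequence)
    for _ in range(k):
        # Feed the current sequence in reverse chronological order and ask
        # the model which item likely came before the earliest one.
        pseudo_prior = reverse_model(list(reversed(augmented)))
        augmented.insert(0, pseudo_prior)
    return augmented

# Toy reverse model for demonstration: predicts the item whose ID is one
# less than the earliest item seen so far.
toy_reverse_model = lambda rev_seq: rev_seq[-1] - 1

print(augment_bidirectional([3, 4, 5], toy_reverse_model, k=2))  # [1, 2, 3, 4, 5]
```

Because the synthesized items are placed before the original interactions, the original chronological order is preserved, which is what distinguishes this scheme from in-place item randomization.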

Original language: English
Pages (from-to): 2652-2664
Number of pages: 13
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 37
Issue number: 5
DOIs
State: Published - 2025

Keywords

  • Sequential recommendation
  • data augmentation
  • model pre-training

