Sequential citation counts prediction enhanced by dynamic contents

  • Guoxiu He*
  • Sichen Gu
  • Zhikai Xue
  • Yufeng Duan
  • Xiaomin Zhu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

The assessment of the impact of scholarly publications has garnered significant attention among researchers, particularly the prediction of the future sequence of citation counts. However, current studies predominantly regard academic papers as static entities, overlooking that, although a paper's content is fixed, the focus it conveys can shift in emphasis over time. To this end, we build dynamic representations of the content that mirror chronological changes within a given paper, facilitating the sequential prediction of citation counts. Specifically, we propose a novel deep neural network called DynamIc Content-aware TrAnsformer (DICTA). The proposed model incorporates a dynamic content module that leverages a sequential module to capture the evolving focus of each paper. To account for dependencies between historical and future citation counts, our model adopts a transformer-based framework as its backbone: with an encoder-decoder structure, it encodes previous citation accumulations and then predicts future citation potential. Extensive experiments on two scientific datasets demonstrate that DICTA achieves impressive performance and outperforms all baseline approaches. Further analyses underscore the significance of the dynamic content module. The code is available at https://github.com/ECNU-Text-Computing/DICTA
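The abstract does not specify DICTA's implementation, but the general idea — an encoder that fuses per-year content embeddings with the observed citation history, and a decoder that attends to that encoding to emit future counts — can be sketched minimally. The NumPy code below is an illustrative toy, not the authors' model: the dimensions, the random "Sentence-BERT-like" content embeddings, and the single attention layer per side are all assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # scaled dot-product attention over row vectors
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

d = 16       # embedding dimension (assumed)
T_hist = 5   # years of observed citation counts
T_pred = 3   # future years to forecast

# stand-ins for yearly dynamic content embeddings (e.g. Sentence-BERT vectors)
content = rng.normal(size=(T_hist, d))
# observed yearly citation counts, projected into the same space
counts = rng.poisson(4, size=(T_hist, 1)).astype(float)
W_in = rng.normal(size=(1, d)) * 0.1

# "encoder": self-attention over content-aware citation history
x = counts @ W_in + content
enc = attention(x, x, x)

# "decoder": learned queries (one per future year) attend to the encoding,
# and a linear head produces non-negative predicted counts
queries = rng.normal(size=(T_pred, d))
dec = attention(queries, enc, enc)
W_out = rng.normal(size=(d, 1)) * 0.1
pred = np.maximum(dec @ W_out, 0.0).ravel()

print(pred.shape)  # one predicted citation count per future year
```

In a trained model the projections and queries would be learned end-to-end and the encoder/decoder would stack multiple transformer layers; the sketch only shows how content and citation signals can share one attention-based encoder-decoder pipeline.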

Original language: English
Article number: 101645
Journal: Journal of Informetrics
Volume: 19
Issue number: 2
DOIs
State: Published - May 2025

Keywords

  • Deep learning
  • Dynamic content
  • Sentence-BERT
  • Sequential citation prediction
