Abstract
Knowledge tracing (KT) models learners’ evolving knowledge states to predict future performance, serving as a fundamental component in personalized education systems. However, existing methods suffer from data sparsity, resulting in inadequate representation quality for low-frequency knowledge concepts and inconsistent modeling of students’ actual knowledge states. To address these challenges, we propose Dual-Encoder Contrastive Knowledge Tracing (DECKT), a contrastive learning framework that improves knowledge state representation under sparse data conditions. DECKT employs a momentum-updated dual-encoder architecture in which the primary encoder processes current input data while the momentum encoder maintains stable historical representations through exponential moving average updates. These encoders naturally form contrastive pairs through temporal evolution, effectively enhancing representation capabilities for low-frequency knowledge concepts without requiring destructive data augmentation operations that may compromise knowledge structure integrity. To preserve semantic consistency in learned representations, DECKT incorporates a graph structure constraint loss that leverages concept–question relationships to maintain appropriate similarities between related concepts in the embedding space. Furthermore, an adversarial training mechanism applies perturbations to embedding vectors, enhancing model robustness and generalization. Extensive experiments on benchmark datasets demonstrate that DECKT significantly outperforms existing state-of-the-art methods, validating the effectiveness of the proposed approach in alleviating representation challenges in sparse educational data.
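The two core mechanisms named in the abstract, the exponential-moving-average update that keeps the momentum encoder stable and the contrastive pairing of primary-encoder and momentum-encoder representations, can be sketched generically. This is a minimal MoCo-style illustration, not the paper's actual implementation: the momentum coefficient, temperature, and function names below are illustrative assumptions.

```python
import numpy as np

def ema_update(primary, momentum, m=0.99):
    """Exponential moving average: the momentum encoder's parameters slowly
    track the primary encoder's, yielding stable historical representations.
    (m=0.99 is an assumed coefficient, not taken from the paper.)"""
    return {k: m * momentum[k] + (1.0 - m) * primary[k] for k in primary}

def info_nce(q, k, tau=0.1):
    """Generic InfoNCE-style contrastive loss: row i of q (primary encoder)
    is paired with row i of k (momentum encoder) as its positive; all other
    rows of k serve as negatives. tau is an assumed temperature."""
    q = q / np.linalg.norm(q, axis=1, keepdims=True)
    k = k / np.linalg.norm(k, axis=1, keepdims=True)
    logits = q @ k.T / tau                        # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))           # positives on the diagonal
```

Because the pairs come from the two encoders' temporal divergence rather than from augmented views, no destructive augmentation of the interaction sequence is needed, which matches the motivation stated above.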
| Original language | English |
|---|---|
| Article number | 685 |
| Journal | Entropy |
| Volume | 27 |
| Issue number | 7 |
| DOIs | |
| State | Published - Jul 2025 |
Keywords
- contrastive learning
- data mining
- deep learning
- graph neural network
- knowledge tracing