TY - GEN
T1 - GSCL-KT
T2 - 2025 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2025
AU - Li, Changlong
AU - Wang, Su
AU - Hu, Wenxin
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Knowledge tracing models have long grappled with the dual challenges of data sparsity and the limited ability to capture group learning patterns. Current contrastive learning paradigms in knowledge tracing (e.g., CL4KT framework) primarily employ sequence augmentation strategies to alleviate data scarcity constraints. However, such approaches frequently compromise semantic coherence during the augmentation process while failing to account for inherent similarity patterns within learner cohorts. To overcome these limitations, this paper introduces the GSCL-KT model (Group Similarity Contrastive Learning for Knowledge Tracing), which, for the first time, incorporates a group-similarity-aware contrastive learning mechanism into the knowledge tracing domain. Unlike traditional approaches that rely on manual data augmentation, GSCL-KT dynamically identifies positive and negative sample pairs from educationally homogeneous groups, enabling the discovery of group-level cognitive patterns while maintaining semantic coherence. The proposed model incorporates several advanced optimization strategies, including the Talking-Heads attention mechanism for fine-grained interaction modeling, the ContraNorm method for feature distribution regularization, and a correlation network enhanced by label dependencies. Experimental results on four real-world educational datasets demonstrate that GSCL-KT consistently outperforms existing baseline models, achieving the highest AUC and competitive performance across metrics.
AB - Knowledge tracing models have long grappled with the dual challenges of data sparsity and the limited ability to capture group learning patterns. Current contrastive learning paradigms in knowledge tracing (e.g., CL4KT framework) primarily employ sequence augmentation strategies to alleviate data scarcity constraints. However, such approaches frequently compromise semantic coherence during the augmentation process while failing to account for inherent similarity patterns within learner cohorts. To overcome these limitations, this paper introduces the GSCL-KT model (Group Similarity Contrastive Learning for Knowledge Tracing), which, for the first time, incorporates a group-similarity-aware contrastive learning mechanism into the knowledge tracing domain. Unlike traditional approaches that rely on manual data augmentation, GSCL-KT dynamically identifies positive and negative sample pairs from educationally homogeneous groups, enabling the discovery of group-level cognitive patterns while maintaining semantic coherence. The proposed model incorporates several advanced optimization strategies, including the Talking-Heads attention mechanism for fine-grained interaction modeling, the ContraNorm method for feature distribution regularization, and a correlation network enhanced by label dependencies. Experimental results on four real-world educational datasets demonstrate that GSCL-KT consistently outperforms existing baseline models, achieving the highest AUC and competitive performance across metrics.
UR - https://www.scopus.com/pages/publications/105033156842
U2 - 10.1109/SMC58881.2025.11342864
DO - 10.1109/SMC58881.2025.11342864
M3 - Conference contribution
AN - SCOPUS:105033156842
T3 - Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics
SP - 5927
EP - 5932
BT - 2025 IEEE International Conference on Systems, Man, and Cybernetics
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 5 October 2025 through 8 October 2025
ER -