TY - GEN
T1 - GASKT
T2 - 14th International Conference on Knowledge Science, Engineering and Management, KSEM 2021
AU - Wang, Mengdan
AU - Peng, Chao
AU - Yang, Rui
AU - Wang, Chenchao
AU - Chen, Yao
AU - Yu, Xiaohua
N1 - Publisher Copyright:
© 2021, Springer Nature Switzerland AG.
PY - 2021
Y1 - 2021
N2 - Knowledge tracing (KT) is a fundamental tool for customizing personalized learning paths for students so that they can take charge of their own learning pace. The main task of KT is to model the learning state of students; however, this process is quite involved. First, due to the sparsity of real-world educational data, previous KT models ignore the high-order information in question-skill relations; second, the long sequences of student interactions pose a demanding challenge for KT models in handling long-term dependencies; third, the forgetting mechanism is complex and difficult to model. To address these issues, in this paper we propose a Graph-based Attentive Knowledge-Search Model for Knowledge Tracing (GASKT). The model divides problems and skills into two types of nodes and utilizes R-GCN to thoroughly incorporate problem-skill relevance through embedding propagation, which reduces the impact of sparse data. In addition, it employs a modified attention mechanism to address the long-term dependency issue: the attention weight scores between questions build on the scaled dot-product while fully accounting for the forgetting mechanism. We conduct extensive experiments on several real-world benchmark datasets, and GASKT outperforms state-of-the-art KT models with at least a 1% AUC improvement.
AB - Knowledge tracing (KT) is a fundamental tool for customizing personalized learning paths for students so that they can take charge of their own learning pace. The main task of KT is to model the learning state of students; however, this process is quite involved. First, due to the sparsity of real-world educational data, previous KT models ignore the high-order information in question-skill relations; second, the long sequences of student interactions pose a demanding challenge for KT models in handling long-term dependencies; third, the forgetting mechanism is complex and difficult to model. To address these issues, in this paper we propose a Graph-based Attentive Knowledge-Search Model for Knowledge Tracing (GASKT). The model divides problems and skills into two types of nodes and utilizes R-GCN to thoroughly incorporate problem-skill relevance through embedding propagation, which reduces the impact of sparse data. In addition, it employs a modified attention mechanism to address the long-term dependency issue: the attention weight scores between questions build on the scaled dot-product while fully accounting for the forgetting mechanism. We conduct extensive experiments on several real-world benchmark datasets, and GASKT outperforms state-of-the-art KT models with at least a 1% AUC improvement.
KW - Attention
KW - Forgetting mechanism
KW - Knowledge tracing
KW - LSTM
KW - Relational graph convolutional networks
UR - https://www.scopus.com/pages/publications/85113764956
U2 - 10.1007/978-3-030-82136-4_22
DO - 10.1007/978-3-030-82136-4_22
M3 - Conference contribution
AN - SCOPUS:85113764956
SN - 9783030821357
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 268
EP - 279
BT - Knowledge Science, Engineering and Management - 14th International Conference, KSEM 2021, Proceedings
A2 - Qiu, Han
A2 - Zhang, Cheng
A2 - Fei, Zongming
A2 - Qiu, Meikang
A2 - Kung, Sun-Yuan
PB - Springer Science and Business Media Deutschland GmbH
Y2 - 14 August 2021 through 16 August 2021
ER -