Peer reviewed
ERIC Number: EJ1460231
Record Type: Journal
Publication Date: 2025-Feb
Pages: 14
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-0018-9359
EISSN: EISSN-1557-9638
Available Date: N/A
Knowledge Tracing through Enhanced Questions and Directed Learning Interaction Based on Multigraph Embeddings in Intelligent Tutoring Systems
IEEE Transactions on Education, v68 n1 p43-56 2025
In recent years, knowledge tracing (KT) within intelligent tutoring systems (ITSs) has developed rapidly. KT aims to assess a student's knowledge state from past performance and to predict whether the student will answer the next question correctly. Traditional KT methods often assign identical representations to questions that share a concept but differ in difficulty, limiting the effectiveness of question embeddings, and they overlook higher-order semantic relationships between questions. Graph models have been employed in KT to enrich question embeddings, but they rarely consider the directed relationships between learning interactions. To address these limitations, this article introduces a novel two-channel approach: KT through Enhanced Questions and Directed Learning Interaction Based on Multigraph Embeddings in ITSs (MGEKT). One channel enhances question embeddings by capturing relationships among students, concepts, and questions; it defines two meta-paths that facilitate learning higher-order semantic relationships between questions. The other channel constructs a directed graph of learning interactions and applies graph attention convolution to model their intricate relationships. A new gating mechanism captures long-term dependencies and emphasizes critical information when tracing students' knowledge states. Notably, MGEKT employs reverse knowledge distillation, transferring knowledge from two small models (the students) to a large model (the teacher), which improves generalization and sharpens the model's focus on crucial information. In comparative evaluations on four datasets, MGEKT outperformed baseline methods, demonstrating its effectiveness for KT.
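The abstract names, but does not specify, the reverse knowledge distillation objective. As a rough illustration of the general idea only (two small student models guiding one large teacher, the reverse of standard distillation), a minimal PyTorch sketch might look like the following; the function name, the temperature T, the mixing weight alpha, and the binary-correctness setup are illustrative assumptions, not the authors' formulation.

import torch
import torch.nn.functional as F

def reverse_distillation_loss(teacher_logits, student_logits_list, labels,
                              T=2.0, alpha=0.5):
    """Sketch of a reverse-distillation objective for KT: a large 'teacher'
    model is trained both on ground-truth correctness labels and on the
    temperature-softened predictions of two smaller 'student' models.
    All names and hyperparameters here are illustrative assumptions."""
    # Supervised loss: the teacher predicts next-question correctness (0/1).
    ce = F.binary_cross_entropy_with_logits(teacher_logits, labels)

    # Distillation loss: the teacher matches each small model's softened
    # probabilities; the small models are frozen here (detach).
    p_teacher = torch.sigmoid(teacher_logits / T)
    kd = sum(
        F.binary_cross_entropy(p_teacher, torch.sigmoid(s.detach() / T))
        for s in student_logits_list
    ) / len(student_logits_list)

    return alpha * ce + (1.0 - alpha) * kd

# Toy usage with random logits for a batch of 8 question attempts.
if __name__ == "__main__":
    labels = torch.randint(0, 2, (8,)).float()
    teacher = torch.randn(8, requires_grad=True)
    students = [torch.randn(8), torch.randn(8)]
    loss = reverse_distillation_loss(teacher, students, labels)
    loss.backward()  # gradients flow only into the teacher's logits
    print(float(loss))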
Institute of Electrical and Electronics Engineers, Inc. 445 Hoes Lane, Piscataway, NJ 08854. Tel: 732-981-0060; Web site: http://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=13
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A