- Title
GELT: A graph embeddings based lite-transformer for knowledge tracing.
- Authors
Liang, Zhijie; Wu, Ruixia; Liang, Zhao; Yang, Juan; Wang, Ling; Su, Jianyu
- Abstract
The development of intelligent education has led to the emergence of knowledge tracing as a fundamental task in the learning process. Traditionally, each student's knowledge state has been determined by assessing their performance in previous learning activities. In recent years, deep learning approaches have shown promising results in capturing complex representations of human learning activities. However, the interpretability of these models is often compromised by the end-to-end training strategy they employ. To address this challenge, we draw inspiration from advances in graph neural networks and propose a novel model called GELT (Graph Embeddings based Lite-Transformer), which aims to uncover and explain the relationships between skills and questions. Additionally, we introduce a simple yet effective energy-saving attention mechanism for predicting knowledge states. This approach maintains high prediction accuracy while significantly reducing computational cost compared to conventional attention mechanisms. Extensive experimental results demonstrate the superior performance of our proposed model over other state-of-the-art baselines on three publicly available real-world knowledge tracing datasets.
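The abstract contrasts an "energy-saving" attention mechanism with conventional attention but gives no formula. As a rough illustration only (the paper's actual mechanism is not specified here), the sketch below shows one well-known way efficient attention variants cut cost: normalizing queries and keys separately and aggregating keys with values first, which replaces the O(n²·d) score matrix of standard attention with an O(n·d²) computation. All function names and shapes are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along a chosen axis
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def standard_attention(Q, K, V):
    # Conventional attention: O(n^2 * d) due to the full n x n score matrix
    d = Q.shape[-1]
    scores = softmax(Q @ K.T / np.sqrt(d), axis=-1)
    return scores @ V

def efficient_attention(Q, K, V):
    # Illustrative efficient variant (NOT necessarily GELT's mechanism):
    # normalize Q over features and K over positions, then compute
    # (K^T V) first -- a d x d matrix -- so the cost is O(n * d^2).
    return softmax(Q, axis=-1) @ (softmax(K, axis=0).T @ V)

# Tiny demo with a sequence of n=6 interactions and d=4 feature dims
rng = np.random.default_rng(0)
n, d = 6, 4
Q, K, V = rng.normal(size=(3, n, d))
out = efficient_attention(Q, K, V)
print(out.shape)  # (6, 4)
```

Both functions map an (n, d) query matrix to an (n, d) output; the efficient variant avoids ever materializing the n × n attention map, which is where the claimed computational savings of lite-transformer designs typically come from.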
- Subjects
GRAPH neural networks; LEARNING; RETINAL blood vessels; DEEP learning; PRIOR learning; FETAL monitoring
- Publication
PLoS ONE, 2024, Vol. 19, Issue 5, p. 1
- ISSN
1932-6203
- Publication type
Article
- DOI
10.1371/journal.pone.0301714