ERIC Number: EJ1379480
Record Type: Journal
Publication Date: 2023-Jun
Pages: 32
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-0735-6331
EISSN: EISSN-1541-4140
Available Date: N/A
Recurrent Neural Network-Fitnets: Improving Early Prediction of Student Performance by Time-Series Knowledge Distillation
Murata, Ryusuke; Okubo, Fumiya; Minematsu, Tsubasa; Taniguchi, Yuta; Shimada, Atsushi
Journal of Educational Computing Research, v61 n3 p639-670 Jun 2023
This study improves the early prediction of student performance with RNN-FitNets, which applies knowledge distillation (KD) along the time-series direction of a recurrent neural network (RNN) model. RNN-FitNets replaces the teacher model in KD with "an RNN model with a long-term time-series in which the features during the entire course are inputted" and the student model in KD with "an RNN model with a short-term time-series in which only the features during the early stages are inputted." As a result, the early-stage RNN model is trained to output the same results as the more accurate RNN model of the later stages. The experiment compared RNN-FitNets with a normal RNN model on a dataset of 296 university students. The results showed that RNN-FitNets can improve early prediction. Moreover, SHAP values were employed to explain the contribution of the input features to the prediction results of RNN-FitNets. It was shown that RNN-FitNets can consider the future effects of the input features from the early stages of the course.
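The abstract describes a teacher-student setup in which a teacher RNN sees the whole course while a student RNN sees only the early weeks and is trained to imitate the teacher. The sketch below illustrates that general idea in PyTorch; it is not the authors' implementation, and all names, hyperparameters, and the loss weighting (GradePredictorRNN, distillation_step, early_weeks, temperature, alpha) are illustrative assumptions.

# Minimal sketch (not the authors' code) of time-series knowledge distillation:
# a "teacher" RNN fed the full course and a "student" RNN fed only the early weeks.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GradePredictorRNN(nn.Module):
    """Simple GRU classifier over weekly activity features (hypothetical)."""
    def __init__(self, n_features=10, hidden=64, n_classes=2):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                 # x: (batch, weeks, n_features)
        _, h = self.rnn(x)                # final hidden state
        return self.head(h[-1])           # class logits

def distillation_step(teacher, student, x_full, y, early_weeks=4,
                      temperature=2.0, alpha=0.5):
    """One training step: the student sees only the first `early_weeks`
    of features but is trained to match the teacher, which sees the full course."""
    with torch.no_grad():                          # teacher is fixed
        t_logits = teacher(x_full)
    s_logits = student(x_full[:, :early_weeks])    # early-stage input only

    hard = F.cross_entropy(s_logits, y)            # true final-outcome labels
    soft = F.kl_div(F.log_softmax(s_logits / temperature, dim=1),
                    F.softmax(t_logits / temperature, dim=1),
                    reduction="batchmean") * temperature ** 2
    return alpha * hard + (1 - alpha) * soft       # assumed weighting scheme

# Usage sketch: train the teacher on full sequences first, then optimize the
# student with distillation_step so its early-week predictions imitate the teacher.
teacher = GradePredictorRNN()
student = GradePredictorRNN()
x = torch.randn(32, 15, 10)                        # 32 students, 15 weeks, 10 features
y = torch.randint(0, 2, (32,))
loss = distillation_step(teacher, student, x, y)
loss.backward()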
Descriptors: College Students, Academic Achievement, Prediction, Neurology, Models, Knowledge Level, At Risk Students
SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665; e-mail: journals@sagepub.com; Web site: https://www.sagepub.com
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A