Peer reviewed
ERIC Number: EJ1486096
Record Type: Journal
Publication Date: 2025-Oct
Pages: 28
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-0265-5322
EISSN: EISSN-1477-0946
Available Date: N/A
Comparison of Traditional Machine Learning and Neural Network Approaches for Automated Scoring of Second Language English Essays
Language Testing, v42 n4 p369-396 2025
An increasing number of language testing companies are developing and deploying deep learning-based automated essay scoring (AES) systems to replace traditional approaches that rely on handcrafted feature extraction. However, neural network approaches to AES have met with hesitation because their features are extracted automatically, making the models less transparent and score interpretation opaque. To compare the two approaches systematically, this paper investigated the performance of five approaches to automated essay scoring using traditional machine learning models and neural network (i.e., deep learning) models. The models were developed to assign scores to responses in the TOEFL11 learner corpus. Since the dataset and metrics were held constant, the results depend on model selection, training, and hyperparameter tuning to find the best fit for each model. Results indicate that the models performed similarly in accuracy but differed in precision and in agreement as measured with the quadratic weighted kappa metric. The performance of traditional models can improve as specific features are added that align with the scoring criteria. The findings are relevant to the discussion of transparency in artificial intelligence (AI) scoring models.
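The agreement metric named in the abstract, quadratic weighted kappa (QWK), can be sketched as below. This is an illustrative from-scratch implementation under generic assumptions (the function name and score range are hypothetical), not code from the study:

```python
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, min_rating, max_rating):
    """Agreement between two sets of integer ratings, penalizing
    disagreements by the squared distance between score levels.
    Returns 1.0 for perfect agreement, 0.0 for chance-level agreement."""
    n = max_rating - min_rating + 1
    # Observed confusion matrix O[i][j]: count of items rated i by one
    # rater (e.g., human) and j by the other (e.g., the AES model).
    O = [[0] * n for _ in range(n)]
    for a, b in zip(y_true, y_pred):
        O[a - min_rating][b - min_rating] += 1
    num_items = len(y_true)
    # Marginal rating histograms, used to build the expected
    # (chance-agreement) matrix as an outer product.
    hist_a = Counter(y_true)
    hist_b = Counter(y_pred)
    numerator = 0.0
    denominator = 0.0
    for i in range(n):
        for j in range(n):
            # Quadratic disagreement weight, 0 on the diagonal.
            w = ((i - j) ** 2) / ((n - 1) ** 2)
            expected = (hist_a[i + min_rating] * hist_b[j + min_rating]) / num_items
            numerator += w * O[i][j]
            denominator += w * expected
    return 1.0 - numerator / denominator
```

For example, identical rating vectors yield a QWK of 1.0, while systematically reversed ratings yield a negative value; equivalent results can be obtained with scikit-learn's `cohen_kappa_score(..., weights="quadratic")`.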
SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665; e-mail: journals@sagepub.com; Web site: https://sagepub-com.bibliotheek.ehb.be
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Assessments and Surveys: Test of English as a Foreign Language
Grant or Contract Numbers: N/A
Author Affiliations: 1Columbia University, USA