ERIC Number: EJ1333447
Record Type: Journal
Publication Date: 2021-Dec
Pages: 14
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: EISSN-1939-1382
Available Date: N/A
Learning Automated Essay Scoring Models Using Item-Response-Theory-Based Scores to Decrease Effects of Rater Biases
IEEE Transactions on Learning Technologies, v14 n6 p763-776 Dec 2021
In automated essay scoring (AES), scores are automatically assigned to essays as an alternative to grading by humans. Traditional AES typically relies on handcrafted features, whereas recent studies have proposed AES models based on deep neural networks to obviate the need for feature engineering. Such AES models generally require training on a large dataset of graded essays. However, the grades in such a training dataset are known to be biased by rater characteristics when grading is conducted by assigning a few raters from a larger rater pool to each essay. The performance of AES models drops when such biased data are used for model training. Researchers in the fields of educational and psychological measurement have recently proposed item response theory (IRT) models that can estimate essay scores while accounting for rater biases. This study therefore proposes a new method that trains AES models using IRT-based scores to deal with rater bias in the training data.
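A minimal illustrative sketch of the idea summarized in the abstract, under stated assumptions: the article's IRT-based score estimation is stood in for by a simple rater-severity correction (each rater's deviation from the grand mean), and the neural AES model is stood in for by a least-squares regression on toy features. All names, numbers, and data below are hypothetical and are not taken from the article.

# Sketch (not the article's exact model): remove rater-severity bias from the
# training targets before fitting an AES regressor, then compare against
# training on raw averaged rater scores.
import numpy as np

rng = np.random.default_rng(0)

n_essays, n_raters = 200, 5
true_quality = rng.normal(0.0, 1.0, n_essays)            # latent essay quality
rater_severity = np.array([-0.8, -0.2, 0.0, 0.4, 0.9])   # per-rater bias (toy values)

# Each essay is graded by a small random subset of raters (sparse rating design).
observed = np.full((n_essays, n_raters), np.nan)
for i in range(n_essays):
    for r in rng.choice(n_raters, size=2, replace=False):
        observed[i, r] = true_quality[i] - rater_severity[r] + rng.normal(0, 0.3)

# Naive targets: per-essay mean of raw scores (inherits rater bias).
naive_targets = np.nanmean(observed, axis=1)

# Bias-adjusted targets: estimate each rater's severity as the deviation of
# their mean awarded score from the grand mean, remove it, then average.
grand_mean = np.nanmean(observed)
est_severity = grand_mean - np.nanmean(observed, axis=0)
adjusted_targets = np.nanmean(observed + est_severity, axis=1)

# Stand-in AES model: linear regression on noisy "essay features".
features = np.c_[np.ones(n_essays), true_quality + rng.normal(0, 0.5, n_essays)]
for name, y in [("naive targets", naive_targets), ("adjusted targets", adjusted_targets)]:
    w, *_ = np.linalg.lstsq(features, y, rcond=None)
    corr = np.corrcoef(features @ w, true_quality)[0, 1]
    print(f"{name}: correlation of predictions with true quality = {corr:.3f}")

In this toy setting, the adjusted targets typically track the latent essay quality more closely than the raw averages, which is the motivation the abstract gives for training AES models on bias-corrected (in the article, IRT-estimated) scores rather than raw rater scores.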
Descriptors: Essays, Scoring, Writing Evaluation, Item Response Theory, Scores, Models, Bias, Computer Software, Computer Assisted Testing, Grading
Institute of Electrical and Electronics Engineers, Inc. 445 Hoes Lane, Piscataway, NJ 08854. Tel: 732-981-0060; Web site: http://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=4620076
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A