Peer reviewed
ERIC Number: EJ1322446
Record Type: Journal
Publication Date: 2021
Pages: 26
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-0969-594X
EISSN: N/A
Available Date: N/A
Assessing L2 English Speaking Using Automated Scoring Technology: Examining Automarker Reliability
Assessment in Education: Principles, Policy & Practice, v28 n4 p411-436 2021
Recent advances in machine learning have made automated scoring of learner speech widespread, and yet validation research that provides support for applying automated scoring technology to assessment is still in its infancy. Both the educational measurement and language assessment communities have called for greater transparency in describing scoring algorithms and research evidence about the reliability of automated scoring. This paper reports on a study that investigated the reliability of an automarker using candidate responses produced in an online oral English test. Based on 'limits of agreement' and multi-faceted Rasch analyses on automarker scores and individual examiner scores, the study found that the automarker, while exhibiting excellent internal consistency, was slightly more lenient than examiner fair average scores, particularly for low-proficiency speakers. Additionally, it was found that an automarker uncertainty measure termed Language Quality, which indicates the confidence of speech recognition, was useful for predicting automarker reliability and flagging abnormal speech.
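The 'limits of agreement' approach mentioned in the abstract is the Bland-Altman method: the mean of the score differences gives the bias between two raters, and roughly 95% of differences are expected to fall within that mean plus or minus 1.96 standard deviations. A minimal sketch, using made-up illustrative scores (not data from the study), might look like:

```python
# Sketch of a Bland-Altman 'limits of agreement' calculation between
# automarker scores and examiner scores. All data here are illustrative.
import statistics

def limits_of_agreement(auto_scores, examiner_scores):
    """Return (bias, lower limit, upper limit).

    bias is the mean automarker-minus-examiner difference; the limits
    are bias +/- 1.96 * SD of the differences, within which roughly
    95% of score differences are expected to fall.
    """
    diffs = [a - e for a, e in zip(auto_scores, examiner_scores)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical scores for six candidates on a shared scale
auto = [4.2, 3.8, 5.1, 2.9, 4.5, 3.3]
examiner = [4.0, 3.5, 5.0, 2.5, 4.4, 3.0]
bias, lower, upper = limits_of_agreement(auto, examiner)
print(f"bias={bias:.2f}, LoA=[{lower:.2f}, {upper:.2f}]")
```

A positive bias in this framing would indicate the automarker scoring more leniently than examiners on average, which is the pattern the study reports for low-proficiency speakers.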
Routledge. Available from: Taylor & Francis, Ltd. 530 Walnut Street Suite 850, Philadelphia, PA 19106. Tel: 800-354-1420; Tel: 215-625-8900; Fax: 215-207-0050; Web site: http://www.tandf.co.uk/journals
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: Europe
Grant or Contract Numbers: N/A
Author Affiliations: N/A