Peer reviewed
ERIC Number: EJ976426
Record Type: Journal
Publication Date: 2012-Jul
Pages: 24
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-0265-5322
EISSN: N/A
Available Date: N/A
A Comparison of Two Scoring Methods for an Automated Speech Scoring System
Xi, Xiaoming; Higgins, Derrick; Zechner, Klaus; Williamson, David
Language Testing, v29 n3 p371-394 Jul 2012
This paper compares two alternative scoring methods--multiple regression and classification trees--for an automated speech scoring system used in a practice environment. The two methods were evaluated on two criteria: construct representation and empirical performance in predicting human scores. The empirical performance of the two scoring models is reported in Zechner, Higgins, Xi, & Williamson (2009), which discusses the development of the entire automated speech scoring system; the current paper shifts the focus to the comparison of the two scoring methods, elaborating both technical and substantive considerations and providing a reasoned argument for the trade-off between them. We concluded that a multiple regression model with expert weights was superior to the classification tree model. In addition to comparing the relative performance of the two models, we also evaluated the adequacy of the regression model for the intended use. In particular, the construct representation of the model was sufficiently broad to justify its use in a low-stakes application. The correlation of the model-predicted total test scores with human scores (r = 0.7) was also deemed acceptable for practice purposes. (Contains 4 notes, 1 figure and 12 tables.)
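The following is an illustrative sketch, not the authors' SpeechRater scoring models: it shows, on synthetic data, the kind of comparison the abstract describes, fitting a multiple regression model and a classification tree to predict human scores and evaluating each by its correlation with those scores. The features, score scale, and model settings below are hypothetical, and the sketch omits the expert-assigned regression weights the paper favored.

```python
# Hypothetical comparison of the two scoring methods discussed in the abstract:
# multiple regression vs. a classification tree, judged by correlation with
# human scores. All data here are synthetic stand-ins, not the study's data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeClassifier
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Synthetic stand-ins for automated speech features (e.g., fluency measures).
n = 500
X = rng.normal(size=(n, 4))
# Synthetic human holistic scores on a 1-4 scale, loosely tied to the features.
human = np.clip(
    np.round(2.5 + X @ np.array([0.5, 0.3, 0.2, 0.1]) + rng.normal(scale=0.7, size=n)),
    1, 4,
)

# Simple train/evaluation split.
train, test = slice(0, 400), slice(400, None)

# Scoring method 1: multiple regression (continuous predicted score).
reg = LinearRegression().fit(X[train], human[train])
reg_pred = reg.predict(X[test])

# Scoring method 2: classification tree (predicts a discrete score level).
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X[train], human[train])
tree_pred = tree.predict(X[test])

# Empirical criterion analogous to the paper's: correlation with human scores.
print("regression r =", round(pearsonr(reg_pred, human[test])[0], 2))
print("tree       r =", round(pearsonr(tree_pred, human[test])[0], 2))
```

In this toy setup the regression model typically tracks the human scores more closely because the underlying relationship is linear; the paper's argument for regression also rested on construct representation and the interpretability of expert weights, which a sketch like this cannot capture.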
SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665; e-mail: journals@sagepub.com; Web site: http://sagepub.com.bibliotheek.ehb.be
Publication Type: Journal Articles; Reports - Evaluative
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Assessments and Surveys: Test of English as a Foreign Language
Grant or Contract Numbers: N/A
Author Affiliations: N/A