ERIC Number: EJ1238506
Record Type: Journal
Publication Date: 2019-Dec
Pages: 16
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: EISSN-2330-8516
Available Date: N/A
Distractor Analysis for Multiple-Choice Tests: An Empirical Study with International Language Assessment Data. Research Report. ETS RR-19-39
Haberman, Shelby J.; Liu, Yang; Lee, Yi-Hsuan
ETS Research Report Series, Dec 2019
Distractor analyses are routinely conducted in educational assessments with multiple-choice items. In this research report, we focus on three item response models for distractors: (a) the traditional nominal response (NR) model; (b) a combination of a two-parameter logistic model for item scores and an NR model for selections of incorrect distractors; and (c) a model in which the item score satisfies a two-parameter logistic model and distractor selection and proficiency are conditionally independent, given that an incorrect response is selected. Model comparisons involve generalized residuals, information measures, scale scores, and reliability estimates. To illustrate the methodology, the models are compared in a study of an international, high-stakes assessment of the proficiency of nonnative speakers of a single target language.
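As a minimal sketch of the two building blocks named in the abstract, the snippet below computes category probabilities under a nominal response (NR) model and a correct-response probability under a two-parameter logistic (2PL) model. The parameter values (`slopes`, `intercepts`, `a`, `b`) are illustrative assumptions, not estimates from the report, and the function names are hypothetical:

```python
import math

def nominal_response_probs(theta, slopes, intercepts):
    """Nominal response (NR) model: probability of selecting each of the m
    response options of an item, given proficiency theta.

    P_k(theta) = exp(a_k * theta + c_k) / sum_j exp(a_j * theta + c_j)

    `slopes` (a_k) and `intercepts` (c_k) are illustrative values only.
    """
    logits = [a * theta + c for a, c in zip(slopes, intercepts)]
    shift = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - shift) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def two_pl_prob(theta, a, b):
    """Two-parameter logistic (2PL) model for the dichotomous item score:
    probability of a correct response given proficiency theta, with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Example: a four-option item (one key, three distractors) at theta = 0.5.
probs = nominal_response_probs(0.5,
                               slopes=[1.2, -0.3, -0.5, -0.4],
                               intercepts=[0.8, 0.1, -0.2, -0.7])
print(probs)                      # option probabilities sum to 1
print(two_pl_prob(0.5, a=1.2, b=0.0))
```

Model (b) of the report combines these pieces: the 2PL governs whether the response is correct, while an NR model governs which distractor is chosen given an incorrect response; model (c) instead makes distractor choice independent of proficiency given an incorrect response.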
Descriptors: Multiple Choice Tests, Scores, Test Reliability, High Stakes Tests, Language Usage, Decision Making, Educational Assessment, Models, Comparative Analysis, Language Proficiency, Second Language Learning, Item Response Theory, Goodness of Fit, International Assessment
Educational Testing Service. Rosedale Road, MS19-R Princeton, NJ 08541. Tel: 609-921-9000; Fax: 609-734-5410; e-mail: RDweb@ets.org; Web site: https://www.ets.org/research/policy_research_reports/ets
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A