ERIC Number: ED398237
Record Type: Non-Journal
Publication Date: 1995-Oct
Pages: 4
Abstractor: N/A
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
Basic Item Analysis for Multiple-Choice Tests. ERIC/AE Digest.
Kehoe, Jerard
This digest presents a list of recommendations for writing multiple-choice test items, based on psychometric statistics. These statistics are typically provided by a measurement, or test scoring, service where tests are machine-scored, or by testing software packages. Test makers can capitalize on the fact that "bad" items can be differentiated from "good" items psychometrically, and tests can be improved by maintaining a pool of "good" items from which future tests can be drawn. Once test items are identified as being appropriately written, the extent to which they discriminate among students must be determined. The degree to which they do discriminate is the basic measure of item quality for almost all multiple-choice items. Statistics usually provided by a test scoring service supply the information needed to identify poorly discriminating items and overlapping distracters. Specific suggestions are given for revising such items to develop an item pool for specific content areas that can be used to construct homogeneous tests for a unified content area. (Contains 4 references.) (SLD)
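The item-discrimination statistic the abstract refers to is conventionally computed as a point-biserial correlation between each item's 0/1 score and the examinee's score on the rest of the test. The sketch below illustrates that standard computation; it is not taken from the digest itself, and the function names and sample data are hypothetical.

```python
# Illustrative sketch (not from the digest): item discrimination as the
# point-biserial (Pearson) correlation between an item's 0/1 score and
# the examinee's total score on the remaining items.

def discrimination_indices(responses):
    """responses: list of examinee rows, each a list of 0/1 item scores.
    Returns one discrimination index (correlation) per item."""
    n_items = len(responses[0])
    indices = []
    for i in range(n_items):
        item = [row[i] for row in responses]
        # "Rest score": total score with the item itself excluded, so the
        # item is not correlated with itself.
        rest = [sum(row) - row[i] for row in responses]
        indices.append(_pearson(item, rest))
    return indices

def _pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5 if vx and vy else 0.0

# Hypothetical data: 4 examinees x 3 items. Item 0 separates high from
# low scorers (a "good" item); item 2 does not (a "bad" item).
data = [
    [1, 1, 0],
    [1, 1, 1],
    [0, 0, 1],
    [0, 0, 0],
]
```

Items with indices near zero (or negative) are the ones a test maker would drop or revise before adding items to the pool.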
ERIC Clearinghouse on Assessment and Evaluation, The Catholic University of America, Department of Education, O'Boyle Hall, Washington, DC 20064 (free).
Publication Type: ERIC Publications; ERIC Digests in Full Text
Education Level: N/A
Audience: N/A
Language: English
Sponsor: Office of Educational Research and Improvement (ED), Washington, DC.
Authoring Institution: ERIC Clearinghouse on Assessment and Evaluation, Washington, DC.
Grant or Contract Numbers: N/A
Author Affiliations: N/A