ERIC Number: EJ1045887
Record Type: Journal
Publication Date: 2014
Pages: 11
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: EISSN-1531-7714
Available Date: N/A
Editorial Changes and Item Performance: Implications for Calibration and Pretesting
Stoffel, Heather; Raymond, Mark R.; Bucak, S. Deniz; Haist, Steven A.
Practical Assessment, Research & Evaluation, v19 n14 Nov 2014
Previous research on the impact of text and formatting changes on test-item performance has produced mixed results. The matter is important because it is generally acknowledged that "any" change to an item requires that the item be recalibrated. The present study investigated the effects of seven classes of stylistic changes on item difficulty, discrimination, and response time for 65 items from a standardized test for physician licensure completed by 31,918 examinees in 2012. One of two versions of each item (original or revised) was randomly assigned to examinees such that each examinee saw only two experimental items, with each item administered to approximately 480 examinees. The stylistic changes had little or no effect on item difficulty or discrimination; however, one class of edits--changing an item from an open lead-in (incomplete statement) to a closed lead-in (direct question)--did result in slightly longer response times. Data for nonnative speakers of English were analyzed separately, with nearly identical results. These findings have implications for the conventional practice of re-pretesting (or recalibrating) items that have undergone minor editorial changes.
Descriptors: Test Construction, Test Items, Standardized Tests, Physicians, Licensing Examinations (Professions), Multiple Choice Tests, Difficulty Level, Item Analysis, Item Response Theory, Revision (Written Composition), Editing, Statistical Analysis, English (Second Language), Pretests Posttests
Dr. Lawrence M. Rudner; e-mail: editor@pareonline.net; Web site: http://pareonline.net
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Assessments and Surveys: United States Medical Licensing Examination
Grant or Contract Numbers: N/A
Author Affiliations: N/A