ERIC Number: ED385579
Record Type: Non-Journal
Publication Date: 1991-Oct
Pages: 29
Abstractor: N/A
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
How Well Can We Equate Test Forms That Are Constructed by Examinees? Program Statistics Research.
Wainer, Howard; And Others
When an examination consists, in whole or in part, of constructed-response items, it is common practice to allow the examinee to choose among a variety of questions. This procedure is usually adopted so that the limited number of items that can be completed in the allotted time does not unfairly affect the examinee. The result is the de facto administration of several different test forms, where the exact structure of any particular form is determined by the examinee. When different forms are administered, a canon of good testing practice requires that those forms be equated to adjust for differences in their difficulty. When the items are chosen by the examinee, however, traditional equating procedures do not strictly apply. This paper explores how one might equate within an item response theory (IRT) framework. The procedure is illustrated with data from the College Board's Advanced Placement Test in Chemistry taken by a sample of 18,431 examinees. Comparable scores can be produced in the context of choice to the extent that responses can be characterized by a unidimensional IRT model. Seven tables and five figures illustrate the discussion. (Contains 19 references.) (Author/SLD)
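The record does not reproduce the paper's equations. As a rough orientation only, a generic unidimensional IRT setup and the associated true-score equating idea can be sketched as follows; the two-parameter logistic model shown here is a standard illustrative choice and is not necessarily the model actually fit in the paper.

```latex
% A minimal sketch, assuming a dichotomous two-parameter logistic (2PL)
% model; the paper's own parameterization may differ. For item $i$,
\[
  P_i(\theta) \;=\; \frac{1}{1 + \exp\!\bigl[-a_i(\theta - b_i)\bigr]},
\]
% where $\theta$ is the examinee's latent proficiency, $a_i$ the item
% discrimination, and $b_i$ the item difficulty. The test characteristic
% curve of a form $X$ (here, the set of items an examinee chose) is the
% expected number-correct score
\[
  \tau_X(\theta) \;=\; \sum_{i \in X} P_i(\theta),
\]
% and a score $x$ on form $X$ is mapped to a comparable score on an
% alternative form $Y$ by inverting one curve and evaluating the other:
\[
  y \;=\; \tau_Y\bigl(\tau_X^{-1}(x)\bigr).
\]
```

Because each examinee-chosen item set has its own test characteristic curve, a mapping of this kind adjusts scores for the differing difficulty of the self-selected forms, which is the sense in which comparable scores depend on the adequacy of a single unidimensional $\theta$.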
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Educational Testing Service, Princeton, NJ.
Identifiers - Assessments and Surveys: Advanced Placement Examinations (CEEB)
Grant or Contract Numbers: N/A
Author Affiliations: N/A