ERIC Number: ED346161
Record Type: Non-Journal
Publication Date: 1992-Apr
Pages: 19
Abstractor: N/A
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
How Review Options and Administration Modes Influence Scores on Computerized Vocabulary Tests.
Vispoel, Walter P.; And Others
The effects of review options (the opportunity for examinees to review and change answers) on the magnitude, reliability, efficiency, and concurrent validity of scores obtained from three types of computerized vocabulary tests (fixed-item, adaptive, and self-adapted) were studied. Subjects were 97 college students at a large Midwestern university, each of whom completed one of the vocabulary tests and several measures of attitudes about review, item difficulty, and test anxiety. Review modestly enhanced test performance, slightly decreased measurement precision, moderately increased total testing time, affected concurrent validity, and was strongly favored by examinees. Computerized tests do not necessarily yield equivalent results, and such tests may have to be equated to ensure fair use of test scores. Differences in performance favoring paper-and-pencil tests in some prior studies occurred because review options were excluded from the computerized tests. Results for administration mode are inconclusive. Compared to the other types, the fixed-item test yielded the least desirable results because its scores were lowest, least reliable, and most susceptible to test anxiety effects. The choice between self-adapted and adaptive tests seems to depend on examinee anxiety level. Item difficulty suggestion, rather than answer feedback, is the predominant factor facilitating performance on self-adapted tests. Included are 3 tables, 4 graphs, and 38 references. (SLD)
Publication Type: Reports - Research; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A