ERIC Number: EJ1317443
Record Type: Journal
Publication Date: 2021
Pages: 19
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-1755-6031
EISSN: N/A
Available Date: N/A
Item Response Theory, Computer Adaptive Testing and the Risk of Self-Deception
Benton, Tom
Research Matters, n32 p82-100 Aut 2021
Computer adaptive testing is intended to make assessment more reliable by tailoring the difficulty of the questions a student has to answer to their level of ability. Most commonly, this benefit is used to justify shortening tests whilst retaining the reliability of a longer, non-adaptive test. Improvements due to adaptive testing are often estimated using reliability coefficients based on item response theory (IRT). However, these coefficients assume that the underlying IRT model completely fits the data. This article takes a different approach, comparing the predictive value of shortened versions of real assessments constructed using adaptive and non-adaptive approaches. The results show that, when explored in this way, the benefits from adaptive testing may not always be quite as large as hoped.
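For illustration, the mechanism the abstract describes (selecting each question to match the examinee's current ability estimate) can be sketched under a Rasch (1PL) IRT model. The item bank, the 20-item test length, and the Newton-Raphson ability update below are assumptions chosen for demonstration; they are not taken from the article itself.

```python
import numpy as np

# Minimal sketch of computer adaptive testing under a Rasch (1PL) model.
# All data here are simulated; names and parameters are illustrative only.

rng = np.random.default_rng(0)
bank = rng.uniform(-2.5, 2.5, size=200)  # item difficulties b_j (assumed bank)
true_theta = 0.8                         # simulated examinee ability

def p_correct(theta, b):
    """Rasch model: P(correct) = 1 / (1 + exp(-(theta - b)))."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

theta = 0.0                              # provisional ability estimate
asked, responses = [], []

for _ in range(20):                      # a shortened, 20-item adaptive test
    # Select the unused item with maximum Fisher information p(1 - p) at
    # the current estimate, i.e. difficulty closest to theta.
    unused = [j for j in range(len(bank)) if j not in asked]
    info = [p_correct(theta, bank[j]) * (1 - p_correct(theta, bank[j]))
            for j in unused]
    j = unused[int(np.argmax(info))]
    asked.append(j)
    responses.append(rng.random() < p_correct(true_theta, bank[j]))

    # One Newton-Raphson step on the log-likelihood to update theta,
    # clipped to a plausible range to keep early steps stable.
    p = p_correct(theta, bank[asked])
    score = np.sum(np.array(responses) - p)   # gradient of log-likelihood
    info_total = np.sum(p * (1 - p))          # Fisher information
    theta = float(np.clip(theta + score / max(info_total, 1e-6), -4, 4))

print(f"estimated ability: {theta:.2f} (true: {true_theta})")
```

Because each item is chosen where it is most informative about the current estimate, a short adaptive test can, in principle, match the measurement precision of a longer fixed-form test; the article's point is that this claimed gain rests on the IRT model fitting the data.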
Descriptors: Risk, Item Response Theory, Computer Assisted Testing, Difficulty Level, Test Reliability, Comparative Analysis, Prediction, Test Length, Test Format, Foreign Countries
University of Cambridge Local Examinations Syndicate (Cambridge Assessment). The Triangle Building, Shaftesbury Road, Cambridge, United Kingdom CB2 8EA. Tel: +44-1223-553311; e-mail: info@cambridgeassessment.org.uk; Web site: https://www.cambridgeassessment.org.uk/our-research/all-published-resources/research-matters/
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: United Kingdom
Grant or Contract Numbers: N/A
Author Affiliations: N/A