ERIC Number: EJ1484702
Record Type: Journal
Publication Date: 2025-Apr
Pages: 12
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-1382-4996
EISSN: EISSN-1573-1677
Available Date: 2024-07-08
Exploring the Use of Rasch Modelling in "Common Content" Items for Multi-Site and Multi-Year Assessment
David Hope (1); David Kluth (1); Matthew Homer (2); Avril Dewar (1); Rikki Goddard-Fuller (3); Alan Jaap (1); Helen Cameron (4)
Advances in Health Sciences Education, v30 n2 p427-438 2025
Rasch modelling is a powerful tool for evaluating item performance, measuring drift in difficulty over time, and comparing students who sat assessments at different times or at different sites. Here, we use data from thirty UK medical schools to describe the benefits of Rasch modelling in quality assurance and the barriers to using it. Sixty "common content" multiple choice items were offered to all UK medical schools in 2016-17, and a further sixty in 2017-18, with five available in both years. Thirty medical schools participated, yielding sixty datasets across the two sessions and 14,342 individual sittings. Schools selected items to embed in written assessments near the end of their programmes. We applied Rasch modelling to evaluate unidimensionality, model fit statistics and item quality; horizontal equating to compare performance across schools; and vertical equating to compare item performance across time. Of the sixty datasets, three provided non-unidimensional data, and eight violated goodness-of-fit criteria. Item-level statistics identified potential improvements in item construction and provided quality assurance. Horizontal equating demonstrated large differences in scores across schools, while vertical equating showed item characteristics were stable across sessions. Rasch modelling provides significant advantages in model- and item-level reporting compared to classical approaches. However, the complexity of the analysis and the relatively small number of educators familiar with Rasch modelling must be addressed locally for a programme to benefit. Furthermore, because Rasch modelling is comparatively novel in this setting, there is greater ambiguity about how to proceed when a Rasch model identifies misfitting or problematic data.
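For readers unfamiliar with the model named in the abstract: the dichotomous Rasch model expresses the probability of a correct response as a logistic function of the difference between person ability and item difficulty. The sketch below is illustrative only and is not the authors' analysis; the simulated cohort size, item count, and the joint maximum likelihood fit (via numpy/scipy) are assumptions for demonstration. Operational analyses such as the one abstracted here would normally use dedicated Rasch software with proper fit statistics and equating procedures.

```python
# Minimal illustrative sketch (not the study's code): fit a dichotomous Rasch
# model to a persons-by-items 0/1 matrix by joint maximum likelihood.
# P(X = 1 | theta, b) = exp(theta - b) / (1 + exp(theta - b))
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated stand-in for one school's sitting: 500 students, 20 dichotomous items.
true_theta = rng.normal(0.0, 1.0, size=500)   # person abilities (logits)
true_b = rng.normal(0.0, 1.0, size=20)        # item difficulties (logits)
prob = 1.0 / (1.0 + np.exp(-(true_theta[:, None] - true_b[None, :])))
X = (rng.uniform(size=prob.shape) < prob).astype(float)

n_persons, n_items = X.shape

def neg_log_lik(params):
    """Negative joint log-likelihood of the dichotomous Rasch model."""
    theta = params[:n_persons]
    b = params[n_persons:] - params[n_persons:].mean()  # centre difficulties to fix the scale
    eta = theta[:, None] - b[None, :]
    return -(X * eta - np.logaddexp(0.0, eta)).sum()

fit = minimize(neg_log_lik, np.zeros(n_persons + n_items), method="L-BFGS-B")
b_hat = fit.x[n_persons:] - fit.x[n_persons:].mean()

print("Estimated vs simulated item difficulties (first five items):")
print(np.round(b_hat[:5], 2))
print(np.round((true_b - true_b.mean())[:5], 2))
```

Centring the estimated difficulties anchors the scale, which is the same idea that underlies the horizontal and vertical equating described in the abstract: shared (common content) items place different schools and sessions onto a common logit metric.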
Descriptors: Item Response Theory, Medical Schools, Foreign Countries, Quality Assurance, Test Items, Multiple Choice Tests, Medical Students
Springer. Available from: Springer Nature. One New York Plaza, Suite 4600, New York, NY 10004. Tel: 800-777-4643; Tel: 212-460-1500; Fax: 212-460-1700; e-mail: customerservice@springernature.com; Web site: https://link.springer.com/
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: United Kingdom
Grant or Contract Numbers: N/A
Author Affiliations: 1The University of Edinburgh, Medical Education Unit, The Chancellor’s Building, College of Medicine and Veterinary Medicine, Edinburgh, Scotland, UK; 2University of Leeds, Leeds Institute of Medical Education, Leeds School of Medicine, Worsley Building, Woodhouse, Leeds, UK; 3The Christie NHS Foundation Trust, Christie Education, Manchester, UK; 4Aston University, Aston Medical School, Birmingham, UK

Peer reviewed
