ERIC Number: ED639715
Record Type: Non-Journal
Publication Date: 2023
Pages: 177
Abstractor: As Provided
ISBN: 979-8-3806-0745-2
ISSN: N/A
EISSN: N/A
Available Date: N/A
Addressing Current Methodological Challenges in ILSA's Transition to Adaptive Testing
Montserrat Beatriz Valdivia Medinaceli
ProQuest LLC, Ph.D. Dissertation, Indiana University
My dissertation examines three current challenges of international large-scale assessments (ILSAs) associated with the transition from linear testing to an adaptive testing design. ILSAs are important for making comparisons among populations and informing countries about the quality of their educational systems. ILSAs' results inform policymakers and foreign aid decisions, serve as a data source for secondary research, and inform the general public about the state of educational systems around the world. However, as more developing economies participate in ILSAs, the cultural comparability of the test becomes a pressing design issue. Specifically, educational systems at the ends of the continuum are not well measured because items are either too difficult for low-performing countries or too easy for high-performing countries. Multistage testing (MST) and group adaptive testing (GAT) tailor test difficulty to examinees' proficiency. MST and GAT have been implemented to better measure proficiency at the ends of the continuum, especially for low- and high-performing countries. However, research gaps remain, given the challenges of implementing an adaptive design in ILSAs. My dissertation takes a multiple-manuscript approach, addressing three separate and important challenges concerning ILSAs' measurement quality in the transition toward adaptive designs. The first paper studies the impact of item bias on country-level proficiency estimates when an MST is used. The second paper takes a critical stance on the transition to an adaptive testing design when there is a mismatch between item difficulty and the proficiency distributions of countries at the ends of the proficiency continuum. The third paper evaluates the performance of the root mean square deviation (RMSD) when the local item independence assumption of item response theory (IRT) models is violated.
Findings reveal the significance of an adequate item pool that matches participants across the proficiency continuum. Effective routing mechanisms, such as merit plus probabilistic misrouting, are essential, with bias-free items in early MST stages to mitigate bias effects on routing probabilities. Lastly, findings reveal that the RMSD cannot detect item bias when testlet effects are present. These insights serve as valuable guidance for practitioners and researchers aiming to enhance the fairness and comparability of ILSAs amid the growth of heterogeneous participants. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by Telephone: (800) 521-0600. Web page: http://www.proquest.com/en-US/products/dissertations/individuals.shtml.]
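The RMSD item-fit statistic evaluated in the third paper compares observed and model-implied probabilities of a correct response across the proficiency range. A minimal sketch of this idea, assuming a two-parameter logistic (2PL) IRT model and a simple ability-bin approximation (function names, binning scheme, and parameters are illustrative, not taken from the dissertation):

```python
import numpy as np

def irt_2pl(theta, a, b):
    """2PL item response function: probability of a correct response."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def rmsd_item_fit(theta, responses, a, b, n_bins=15):
    """Approximate the RMSD item-fit statistic for one item.

    Compares the observed proportion correct within ability bins to the
    model-implied probability at the bin midpoint, weighting each squared
    deviation by the bin's share of examinees.
    """
    edges = np.linspace(theta.min(), theta.max(), n_bins + 1)
    mids = 0.5 * (edges[:-1] + edges[1:])
    sq_sum, weight_sum = 0.0, 0.0
    for lo, hi, mid in zip(edges[:-1], edges[1:], mids):
        mask = (theta >= lo) & (theta < hi)
        n = mask.sum()
        if n == 0:
            continue  # skip empty ability bins
        p_obs = responses[mask].mean()
        p_model = irt_2pl(mid, a, b)
        w = n / len(theta)
        sq_sum += w * (p_obs - p_model) ** 2
        weight_sum += w
    return np.sqrt(sq_sum / weight_sum)

# Simulated check: data generated from the model should yield a small RMSD.
rng = np.random.default_rng(0)
theta = rng.normal(size=5000)
a_true, b_true = 1.2, 0.3
resp = (rng.random(5000) < irt_2pl(theta, a_true, b_true)).astype(int)
print(rmsd_item_fit(theta, resp, a_true, b_true))
```

In operational use (e.g., in ILSA scaling) the weighting is taken from a posterior proficiency density rather than raw ability bins, but the bin-weighted form above conveys the core comparison of observed versus expected response curves.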
Descriptors: International Assessment, Achievement Tests, Adaptive Testing, Test Items, Test Bias, Test Construction, Item Response Theory, Error of Measurement
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://www.proquest.com/en-US/products/dissertations/individuals.shtml
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A