ERIC Number: EJ1182178
Record Type: Journal
Publication Date: 2014
Pages: 17
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: 2196-0739
Available Date: N/A
Detecting Differential Item Functioning Using Generalized Logistic Regression in the Context of Large-Scale Assessments
Svetina, Dubravka; Rutkowski, Leslie
Large-scale Assessments in Education, v2, Article 4, 2014
Background: When studying student performance across different countries or cultures, an important aspect of comparisons is score comparability. In other words, the latent variable (i.e., the construct of interest) must be understood and measured equivalently across all participating groups or countries if our inferences regarding performance are to be regarded as valid. Relatively few studies have examined an item-level approach to measurement equivalence, particularly in settings where a large number of groups is included. Methods: This simulation study examines item-level differential item functioning (DIF) in the context of international large-scale assessment (ILSA) using a generalized logistic regression approach. Manipulated factors included the number of groups (10 or 20), the magnitude of DIF, the percentage of DIF items, the nature of DIF, and the percentage of groups affected by DIF. Results: Results suggested that the number of groups did not have an effect on the performance of the method (high power and low Type I error rates); however, other factors did impact accuracy. Specifically, Type I error rates were inflated in non-DIF conditions, while they were very conservative in all of the DIF conditions. Power was generally high, particularly in conditions where the DIF magnitude was large, with one exception: conditions where DIF was introduced in the difficulty parameters and the percentage of DIF items was 60%. Conclusions: Our findings present a mixed picture with respect to the performance of the generalized logistic regression method in the context of a large number of groups with large sample sizes. In the presence of DIF, the method was successful in distinguishing between DIF and non-DIF items, as evidenced by low Type I error rates and high power. In the absence of DIF, however, the method yielded inflated Type I error rates.
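The abstract's generalized logistic regression approach extends the standard two-group logistic regression DIF test to many groups by entering group as a multi-level categorical predictor. The following is a minimal sketch of that general idea, not the authors' implementation: it assumes a dichotomous item, a total-score matching variable, and nested-model likelihood-ratio tests for uniform and non-uniform DIF. All column names, the simulated data, and the choice of statsmodels are illustrative assumptions.

```python
# Hypothetical sketch of multi-group logistic regression DIF screening.
# The simulated data, column names, and DIF shift are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(42)

# Simulate 3 groups and one item with uniform DIF (shifted difficulty) in group 2.
n, n_groups = 3000, 3
df = pd.DataFrame({
    "group": rng.integers(0, n_groups, n),
    "theta": rng.normal(0, 1, n),          # latent ability
})
df["total"] = df["theta"] + rng.normal(0, 0.5, n)   # proxy for total-score matching variable
dif_shift = np.where(df["group"] == 2, -0.8, 0.0)   # uniform DIF against group 2
p_correct = 1 / (1 + np.exp(-(df["theta"] + dif_shift)))
df["item"] = rng.binomial(1, p_correct)

# Nested models: matching score only; + group main effects (uniform DIF);
# + score-by-group interaction (non-uniform DIF).
m0 = smf.logit("item ~ total", df).fit(disp=0)
m1 = smf.logit("item ~ total + C(group)", df).fit(disp=0)
m2 = smf.logit("item ~ total * C(group)", df).fit(disp=0)

# Likelihood-ratio chi-square tests between adjacent models.
lr_uniform = 2 * (m1.llf - m0.llf)
lr_nonuniform = 2 * (m2.llf - m1.llf)
p_uniform = stats.chi2.sf(lr_uniform, df=m1.df_model - m0.df_model)
p_nonuniform = stats.chi2.sf(lr_nonuniform, df=m2.df_model - m1.df_model)
print(f"uniform DIF:     LR = {lr_uniform:.2f}, p = {p_uniform:.4f}")
print(f"non-uniform DIF: LR = {lr_nonuniform:.2f}, p = {p_nonuniform:.4f}")
```

In a full study such as the one abstracted here, this per-item screen would be repeated across all items, groups (10 or 20), and simulation conditions, with the rejection rates summarized as power (for seeded DIF items) and Type I error (for DIF-free items).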
Descriptors: Test Items, Item Response Theory, Item Analysis, Regression (Statistics), Educational Assessment, International Assessment, Simulation, Error of Measurement, Elementary Secondary Education, Mathematics Achievement, Achievement Tests, Foreign Countries, Mathematics Tests, Science Achievement, Science Tests, Statistical Analysis
Springer. Available from: Springer Nature. 233 Spring Street, New York, NY 10013. Tel: 800-777-4643; Tel: 212-460-1500; Fax: 212-348-4505; e-mail: customerservice@springernature.com; Web site: https://link.springer.com/
Publication Type: Journal Articles; Reports - Research
Education Level: Elementary Secondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Assessments and Surveys: Trends in International Mathematics and Science Study
Grant or Contract Numbers: N/A
Author Affiliations: N/A