ERIC Number: ED672322
Record Type: Non-Journal
Publication Date: 2023-Apr
Pages: 20
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
Heterogeneity of Item-Treatment Interactions Masks Complexity and Generalizability in Randomized Controlled Trials. EdWorkingPaper No. 23-754
Ishita Ahmed; Masha Bertling; Lijin Zhang; Andrew D. Ho; Prashant Loyalka; Hao Xue; Scott Rozelle; Benjamin W. Domingue
Annenberg Institute for School Reform at Brown University
Researchers use test outcomes to evaluate the effectiveness of education interventions across numerous randomized controlled trials (RCTs). Aggregate test data--for example, simple measures like the sum of correct responses--are compared across treatment and control groups to determine whether an intervention has had a positive impact on student achievement. We show that item-level data and psychometric analyses can provide information about treatment heterogeneity and improve the design of future experiments. We apply techniques typically used in the study of Differential Item Functioning (DIF) to examine variation in the degree to which items show treatment effects. That is, are observed treatment effects due to generalized gains on the aggregate achievement measures, or are they due to targeted gains on specific items? Based on our analysis of 7,244,566 item responses (265,732 students responding to 2,119 items) taken from 15 RCTs in low- and middle-income countries, we find clear evidence for variation in gains across items. DIF analyses identify items that are highly sensitive to the interventions--in one extreme case, a single item drives nearly 40% of the observed treatment effect--as well as items that are insensitive. We also show that the variation of item-level sensitivity can have implications for the precision of effect estimates. Of the RCTs that have significant effect estimates, 41% have patterns of item-level sensitivity to treatment that allow for the possibility of a null effect when this source of uncertainty is considered. Our findings demonstrate how researchers can gain more insight regarding the effects of interventions via additional analysis of item-level test data.
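The core idea in the abstract--comparing per-item treatment effects rather than only the aggregate score--can be illustrated with a small simulation. The sketch below is not the authors' code or data; it uses hypothetical sample sizes, item counts, and gain parameters to show how one highly sensitive item can account for a large share of an aggregate treatment effect:

```python
import random

random.seed(0)

# Illustrative sketch (hypothetical parameters, not the paper's data):
# estimate per-item treatment effects as differences in proportion correct
# between treatment and control groups, the simplest analogue of the
# DIF-style item-by-treatment interaction the abstract describes.

N_STUDENTS = 2000   # hypothetical students per group
N_ITEMS = 10        # hypothetical test length

# Heterogeneous item sensitivity: most items gain little from treatment,
# one item gains a lot (mirroring the extreme case the abstract mentions).
base_p = [0.5] * N_ITEMS
true_gain = [0.02] * (N_ITEMS - 1) + [0.30]

def simulate(treated):
    """Return per-item counts of correct responses for one group."""
    counts = [0] * N_ITEMS
    for _ in range(N_STUDENTS):
        for j in range(N_ITEMS):
            p = base_p[j] + (true_gain[j] if treated else 0.0)
            if random.random() < p:
                counts[j] += 1
    return counts

control = simulate(False)
treat = simulate(True)

# Per-item treatment effect on the proportion-correct scale.
item_effects = [(t - c) / N_STUDENTS for t, c in zip(treat, control)]

# Aggregate effect on the sum score, and each item's share of it.
total_effect = sum(item_effects)
shares = [e / total_effect for e in item_effects]

print("item effects:", [round(e, 3) for e in item_effects])
print("share of aggregate effect from the sensitive item:",
      round(shares[-1], 2))
```

In this toy setup the single sensitive item contributes well over half of the aggregate sum-score effect, even though nine of ten items show only small gains; the paper's actual analyses use psychometric DIF models rather than raw proportion differences.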
Annenberg Institute for School Reform at Brown University. Brown University Box 1985, Providence, RI 02912. Tel: 401-863-7990; Fax: 401-863-1290; e-mail: annenberg@brown.edu; Web site: https://annenberg.brown.edu/
Related Records: ED659521
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Authoring Institution: Annenberg Institute for School Reform at Brown University
Grant or Contract Numbers: N/A
Data File: URL: https://osf.io/mjrx3/
Author Affiliations: N/A