Showing all 8 results
Spybrook, Jessaca; Zhang, Qi; Kelcey, Ben; Dong, Nianbo – Educational Evaluation and Policy Analysis, 2020
Over the past 15 years, we have seen an increase in the use of cluster randomized trials (CRTs) to test the efficacy of educational interventions. These studies are often designed with the goal of determining whether a program works, or answering the what works question. Recently, the goals of these studies have expanded to include for whom and under…
Descriptors: Randomized Controlled Trials, Educational Research, Program Effectiveness, Intervention
Peer reviewed
Direct link
Lortie-Forgues, Hugues; Inglis, Matthew – Educational Researcher, 2019
There are a growing number of large-scale educational randomized controlled trials (RCTs). Considering their expense, it is important to reflect on the effectiveness of this approach. We assessed the magnitude and precision of effects found in those large-scale RCTs commissioned by the UK-based Education Endowment Foundation and the U.S.-based…
Descriptors: Randomized Controlled Trials, Educational Research, Effect Size, Program Evaluation
Peer reviewed
Direct link
Uwimpuhwe, Germaine; Singh, Akansha; Higgins, Steve; Coux, Mickael; Xiao, ZhiMin; Shkedy, Ziv; Kasim, Adetayo – Journal of Experimental Education, 2022
Educational stakeholders are keen to know the magnitude and importance of different interventions. However, the way evidence is communicated to support understanding of the effectiveness of an intervention is controversial. Typically studies in education have used the standardised mean difference as a measure of the impact of interventions. This…
Descriptors: Program Effectiveness, Intervention, Multivariate Analysis, Bayesian Statistics
Peer reviewed
PDF on ERIC Download full text
Shakeel, M. Danish; Anderson, Kaitlin P.; Wolf, Patrick J. – Society for Research on Educational Effectiveness, 2016
The objective of this meta-analysis is to rigorously assess the participant effects of private school vouchers, or in other words, to estimate the average academic impacts that the offer (or use) of a voucher has on a student. This review adds to the literature by being the first to systematically review all Randomized Controlled Trials (RCTs) in an…
Descriptors: Educational Vouchers, Private Schools, Meta Analysis, Program Effectiveness
Peer reviewed
Direct link
Weiss, Michael J.; Bloom, Howard S.; Verbitsky-Savitz, Natalya; Gupta, Himani; Vigil, Alma E.; Cullinan, Daniel N. – Journal of Research on Educational Effectiveness, 2017
Multisite trials, in which individuals are randomly assigned to alternative treatment arms within sites, offer an excellent opportunity to estimate the cross-site average effect of treatment assignment (intent to treat or ITT) "and" the amount by which this impact varies across sites. Although both of these statistics are substantively…
Descriptors: Randomized Controlled Trials, Evidence, Models, Intervention
Peer reviewed
Direct link
Cheung, Alan C. K.; Slavin, Robert E. – Educational Researcher, 2016
As evidence becomes increasingly important in educational policy, it is essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. A total of 645 studies from 12 recent reviews of evaluations of preschool, reading, mathematics, and science programs were studied. Effect…
Descriptors: Effect Size, Research Methodology, Research Design, Preschool Evaluation
Peer reviewed
Direct link
Llosa, Lorena; Lee, Okhee; Jiang, Feng; Haas, Alison; O'Connor, Corey; Van Booven, Christopher D.; Kieffer, Michael J. – American Educational Research Journal, 2016
The authors evaluated the effects of P-SELL, a science curricular and professional development intervention for fifth-grade students with a focus on English language learners (ELLs). Using a randomized controlled trial design with 33 treatment and 33 control schools across three school districts in one state, we found significant and meaningfully…
Descriptors: Intervention, Science Education, Science Course Improvement Projects, Program Effectiveness
Peer reviewed
PDF on ERIC Download full text
Cheung, Alan; Slavin, Robert – Society for Research on Educational Effectiveness, 2016
As evidence-based reform becomes increasingly important in educational policy, it is becoming essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. The purpose of this study was to examine how methodological features such as types of publication, sample sizes, and…
Descriptors: Effect Size, Evidence Based Practice, Educational Change, Educational Policy