Showing all 3 results
Peer reviewed
Shadish, William R.; Hedges, Larry V.; Horner, Robert H.; Odom, Samuel L. – National Center for Education Research, 2015
The field of education is increasingly committed to adopting evidence-based practices. Although randomized experimental designs provide strong evidence of the causal effects of interventions, they are not always feasible. For example, depending upon the research question, it may be difficult for researchers to find the number of children necessary…
Descriptors: Effect Size, Case Studies, Research Design, Observation
Peer reviewed
Cheung, Alan; Slavin, Robert – Society for Research on Educational Effectiveness, 2016
As evidence-based reform becomes increasingly important in educational policy, it is becoming essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. The purpose of this study was to examine how methodological features such as types of publication, sample sizes, and…
Descriptors: Effect Size, Evidence Based Practice, Educational Change, Educational Policy
Peer reviewed
Topping, K. J.; Samuels, J.; Paul, T. – School Effectiveness and School Improvement, 2007
This study elaborates the "what works?" question by exploring the effects of variability in program implementation quality on achievement. Particularly, the effects on achievement of computerized assessment of reading were investigated, analyzing data on 51,000 students in Grades 1-12 who read over 3 million books. When minimum implementation…
Descriptors: Program Implementation, Achievement Gains, Reading Achievement, Independent Reading