Publication Date
| Period | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 1 |
| Since 2007 (last 20 years) | 5 |
Descriptor
| Descriptor | Count |
| --- | --- |
| Program Effectiveness | 5 |
| Statistical Analysis | 5 |
| Effect Size | 4 |
| Research Design | 4 |
| Correlation | 3 |
| Educational Research | 3 |
| Intervention | 3 |
| Program Evaluation | 3 |
| Sample Size | 3 |
| Evaluation Research | 2 |
| Multivariate Analysis | 2 |
Source
| Source | Count |
| --- | --- |
| Society for Research on… | 2 |
| Educational Evaluation and… | 1 |
| Journal of Experimental… | 1 |
| Journal of Research on… | 1 |
Author
| Author | Count |
| --- | --- |
| Spybrook, Jessaca | 5 |
| Bloom, Howard S. | 1 |
| Jones, Nathan | 1 |
| Kelcey, Ben | 1 |
| Phelps, Geoffrey | 1 |
| Raudenbush, Stephen W. | 1 |
| Westine, Carl | 1 |
| Zhang, Jiaqi | 1 |
Publication Type
| Type | Count |
| --- | --- |
| Reports - Research | 4 |
| Journal Articles | 3 |
| Numerical/Quantitative Data | 1 |
| Reports - Evaluative | 1 |
Education Level
| Level | Count |
| --- | --- |
| Elementary Education | 2 |
| Junior High Schools | 2 |
| Middle Schools | 2 |
| Secondary Education | 2 |
| Early Childhood Education | 1 |
| Elementary Secondary Education | 1 |
| High Schools | 1 |
| Preschool Education | 1 |
Location
| Location | Count |
| --- | --- |
| Texas | 1 |
Bloom, Howard S.; Spybrook, Jessaca – Journal of Research on Educational Effectiveness, 2017
Multisite trials, which are being used with increasing frequency in education and evaluation research, provide an exciting opportunity for learning about how the effects of interventions or programs are distributed across sites. In particular, these studies can produce rigorous estimates of a cross-site mean effect of program assignment…
Descriptors: Program Effectiveness, Program Evaluation, Sample Size, Evaluation Research
Spybrook, Jessaca – Journal of Experimental Education, 2014
The Institute of Education Sciences has funded more than 100 experiments to evaluate educational interventions in an effort to generate scientific evidence of program effectiveness on which to base education policy and practice. In general, these studies are designed with the goal of having adequate statistical power to detect the average…
Descriptors: Intervention, Educational Research, Research Methodology, Statistical Analysis
Westine, Carl; Spybrook, Jessaca – Society for Research on Educational Effectiveness, 2013
The capacity of the field to conduct power analyses for group randomized trials (GRTs) of educational interventions has improved over the past decade (Authors, 2009). However, a power analysis depends on estimates of design parameters. Hence it is critical to build the empirical base of design parameters for GRTs across a variety of outcomes and…
Descriptors: Randomized Controlled Trials, Research Design, Correlation, Program Effectiveness
Kelcey, Ben; Spybrook, Jessaca; Zhang, Jiaqi; Phelps, Geoffrey; Jones, Nathan – Society for Research on Educational Effectiveness, 2015
With research indicating substantial differences among teachers in terms of their effectiveness (Nye, Konstantopoulos, & Hedges, 2004), a major focus of recent research in education has been on improving teacher quality through professional development (Desimone, 2009; Institute of Education Sciences [IES], 2012; Measures of Effective…
Descriptors: Teacher Effectiveness, Faculty Development, Program Design, Educational Research
Spybrook, Jessaca; Raudenbush, Stephen W. – Educational Evaluation and Policy Analysis, 2009
This article examines the power analyses for the first wave of group-randomized trials funded by the Institute of Education Sciences. Specifically, it assesses the precision and technical accuracy of the studies. The authors identified the appropriate experimental design and estimated the minimum detectable standardized effect size (MDES) for each…
Descriptors: Research Design, Research Methodology, Effect Size, Correlation
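Several of the abstracts above concern power analyses and minimum detectable effect sizes (MDES) for group-randomized trials. As a rough illustration only (not drawn from any of the listed studies), the textbook MDES approximation for a two-level cluster-randomized design with no covariates can be sketched as follows; the function name, the parameter values, and the fixed multiplier of 2.8 are all illustrative assumptions.

```python
import math

def mdes_cluster_randomized(J, n, rho, P=0.5, multiplier=2.8):
    """Approximate MDES for a two-level cluster-randomized trial
    with no covariates (illustrative sketch, not any study's method).

    J          -- total number of clusters (e.g., schools)
    n          -- individuals per cluster
    rho        -- intraclass correlation (between-cluster variance share)
    P          -- proportion of clusters assigned to treatment
    multiplier -- z_{alpha/2} + z_{power}; roughly 2.8 for a two-tailed
                  alpha = .05 test at power = .80 (a t-based multiplier
                  would be slightly larger when J is small)
    """
    # Variance of the estimated treatment effect, in effect-size units:
    # a between-cluster component plus a within-cluster component.
    var_term = rho / (P * (1 - P) * J) + (1 - rho) / (P * (1 - P) * J * n)
    return multiplier * math.sqrt(var_term)

# Illustrative values: 40 schools, 60 students each, ICC = 0.15,
# balanced assignment across arms.
print(round(mdes_cluster_randomized(40, 60, 0.15), 3))  # → 0.359
```

The between-cluster term dominates here: with rho = 0.15, adding students per school barely moves the MDES, while adding schools shrinks it, which is why design-parameter estimates such as intraclass correlations (the focus of the Westine and Spybrook abstract) matter so much for planning these trials.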
