Slavin, Robert E.; Cheung, Alan C. K. – Journal of Education for Students Placed at Risk, 2017
Large-scale randomized studies provide the best means of evaluating practical, replicable approaches to improving educational outcomes. This article discusses the advantages, problems, and pitfalls of these evaluations, focusing on alternative methods of randomization, recruitment, ensuring high-quality implementation, dealing with attrition, and…
Descriptors: Randomized Controlled Trials, Evaluation Methods, Recruitment, Attrition (Research Studies)

Steiner, Peter M.; Wong, Vivian – Society for Research on Educational Effectiveness, 2016
Despite recent emphasis on the use of randomized control trials (RCTs) for evaluating education interventions, in most areas of education research, observational methods remain the dominant approach for assessing program effects. Over the last three decades, the within-study comparison (WSC) design has emerged as a method for evaluating the…
Descriptors: Randomized Controlled Trials, Comparative Analysis, Research Design, Evaluation Methods

Bloom, Howard S.; Porter, Kristin E.; Weiss, Michael J.; Raudenbush, Stephen – Society for Research on Educational Effectiveness, 2013
To date, evaluation research and policy analysis have focused mainly on average program impacts and paid little systematic attention to their variation. Recently, the growing number of multi-site randomized trials that are being planned and conducted make it increasingly feasible to study "cross-site" variation in impacts. Important…
Descriptors: Research Methodology, Policy, Evaluation Research, Randomized Controlled Trials

Buchanan, Taylor L.; Lohse, Keith R. – Measurement in Physical Education and Exercise Science, 2016
We surveyed researchers in the health and exercise sciences to explore different areas and magnitudes of bias in researchers' decision making. Participants were presented with scenarios (testing a central hypothesis with p = 0.06 or p = 0.04) in a random order and surveyed about what they would do in each scenario. Participants showed significant…
Descriptors: Researchers, Attitudes, Statistical Significance, Bias

What Works Clearinghouse, 2011
With its critical assessments of scientific evidence on the effectiveness of education programs, policies, and practices (referred to as "interventions"), and a range of products summarizing this evidence, the What Works Clearinghouse (WWC) is an important part of the Institute of Education Sciences' strategy to use rigorous and relevant…
Descriptors: Standards, Access to Information, Information Management, Guides
