Showing all 15 results
Weiss, Michael J.; Bloom, Howard S. – MDRC, 2022
What works to help community college students progress academically? This brief synthesizes 20 years of rigorous research by MDRC, presenting new evidence about key attributes of community college interventions that are positively related to larger impacts on students' academic progress. Findings are based on a synthesis of evidence from…
Descriptors: Community Colleges, Two Year College Students, Educational Research, Intervention
Peer reviewed
Bloom, Howard S.; Spybrook, Jessaca – Journal of Research on Educational Effectiveness, 2017
Multisite trials, which are being used with increasing frequency in education and evaluation research, provide an exciting opportunity for learning about how the effects of interventions or programs are distributed across sites. In particular, these studies can produce rigorous estimates of a cross-site mean effect of program assignment…
Descriptors: Program Effectiveness, Program Evaluation, Sample Size, Evaluation Research
Bloom, Howard S.; Weiss, Michael J. – MDRC, 2018
The benefits of understanding variation apply on multiple levels. Local policymakers and practitioners need to know both the average impact of an intervention and its variation across settings to properly assess its likely benefits and risks for their jurisdictions. For social scientists, cross-site impact variation offers opportunities to learn…
Descriptors: Program Effectiveness, Research Methodology, Geographic Location, Intervention
Peer reviewed
Raudenbush, Stephen W.; Bloom, Howard S. – American Journal of Evaluation, 2015
The present article provides a synthesis of the conceptual and statistical issues involved in using multisite randomized trials to learn about and from a distribution of heterogeneous program impacts across individuals and/or program sites. Learning "about" such a distribution involves estimating its mean value, detecting and quantifying…
Descriptors: Program Effectiveness, Randomized Controlled Trials, Statistical Distributions, Computation
Peer reviewed
Reardon, Sean F.; Unlu, Fatih; Zhu, Pei; Bloom, Howard S. – Journal of Educational and Behavioral Statistics, 2014
We explore the use of instrumental variables (IV) analysis with a multisite randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the IV literature as the exclusion restriction.…
Descriptors: Statistical Bias, Statistical Analysis, Least Squares Statistics, Sampling
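The Reardon, Unlu, Zhu, and Bloom (2014) entry above turns on a standard instrumental-variables idea: under the exclusion restriction, random assignment can serve as an instrument for the mediator. The sketch below is a minimal two-stage least squares illustration on simulated data; the variable names and data-generating values are hypothetical, not drawn from the paper, and the naive second-stage standard errors are not the corrected ones a dedicated IV routine would report.

```python
# Minimal 2SLS sketch: random assignment z instruments the mediator m.
# Assumes the exclusion restriction: z affects the outcome y only through m.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
z = rng.integers(0, 2, n)                        # random treatment assignment (instrument)
u = rng.normal(0, 1, n)                          # unobserved confounder of m and y
m = 0.5 * z + 0.8 * u + rng.normal(0, 1, n)      # mediator responds to assignment
y = 0.4 * m + 0.8 * u + rng.normal(0, 1, n)      # outcome depends on the mediator only

# Stage 1: predict the mediator from assignment.
m_hat = sm.OLS(m, sm.add_constant(z)).fit().fittedvalues
# Stage 2: regress the outcome on the predicted mediator.
second = sm.OLS(y, sm.add_constant(m_hat)).fit()
print(second.params[1])   # close to the true 0.4; naive OLS of y on m would be biased upward
```

A multisite version would typically add site fixed effects and site-by-assignment instruments, which is beyond this sketch.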
Peer reviewed
Bloom, Howard S. – Journal of Research on Educational Effectiveness, 2012
In this article, the author shares his comments on statistical analysis for multisite trials, and focuses on the contribution of Stephen Raudenbush, Sean Reardon, and Takako Nomi to future research. Raudenbush, Reardon, and Nomi provide a major contribution to future research on variation in program impacts by showing how to use multisite trials…
Descriptors: Program Evaluation, Statistical Analysis, Computation, Program Effectiveness
Raudenbush, Stephen W.; Bloom, Howard S. – MDRC, 2015
The present paper, which is intended for a diverse audience of evaluation researchers, applied social scientists, and research funders, provides a broad overview of the conceptual and statistical issues involved in using multisite randomized trials to learn "about" and "from" variation in program effects across…
Descriptors: Program Effectiveness, Research Methodology, Statistical Analysis, Differences
Peer reviewed
Bloom, Howard S.; Raudenbush, Stephen W.; Weiss, Michael J.; Porter, Kristin – Journal of Research on Educational Effectiveness, 2017
The present article considers a fundamental question in evaluation research: "By how much do program effects vary across sites?" The article first presents a theoretical model of cross-site impact variation and a related estimation model with a random treatment coefficient and fixed site-specific intercepts. This approach eliminates…
Descriptors: Evaluation Research, Program Evaluation, Welfare Services, Employment
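The Bloom, Raudenbush, Weiss, and Porter (2017) entry above describes an estimation model with a random treatment coefficient and fixed site-specific intercepts. As a hedged illustration only, the sketch below fits that kind of model to simulated data with statsmodels; the sample sizes, effect sizes, and variable names are assumptions for the example, not values from the study.

```python
# Sketch of a multisite impact-variation model: fixed site intercepts,
# treatment coefficient that varies randomly across sites. Simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_sites, n_per_site = 30, 100
site = np.repeat(np.arange(n_sites), n_per_site)
treat = rng.integers(0, 2, size=site.size)             # random assignment within site
site_mean = rng.normal(0.0, 1.0, n_sites)[site]        # fixed site-specific intercepts
site_impact = rng.normal(0.25, 0.15, n_sites)[site]    # true impact varies across sites
y = site_mean + site_impact * treat + rng.normal(0, 1, site.size)
df = pd.DataFrame({"y": y, "treat": treat, "site": site})

# Fixed site intercepts via C(site); random slope on treat only, no random intercept.
fit = smf.mixedlm("y ~ treat + C(site)", df, groups=df["site"],
                  re_formula="0 + treat").fit()
print(fit.params["treat"])   # estimated cross-site mean impact
print(fit.cov_re)            # estimated cross-site variance of the impact
```

The square root of the estimated variance component is the cross-site standard deviation of impacts that this line of work tries to quantify.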
Bloom, Howard S.; Weiland, Christina – MDRC, 2015
This paper uses data from the Head Start Impact Study (HSIS), a nationally representative multisite randomized trial, to quantify variation in effects of Head Start during 2002-2003 on children's cognitive and socio-emotional outcomes relative to the effects of other local alternatives, including parent care. We find that (1) treatment and control…
Descriptors: Program Evaluation, Program Effectiveness, Early Intervention, At Risk Students
Peer reviewed
Bloom, Howard S.; Porter, Kristin E.; Weiss, Michael J.; Raudenbush, Stephen – Society for Research on Educational Effectiveness, 2013
To date, evaluation research and policy analysis have focused mainly on average program impacts and paid little systematic attention to their variation. Recently, the growing number of multi-site randomized trials that are being planned and conducted make it increasingly feasible to study "cross-site" variation in impacts. Important…
Descriptors: Research Methodology, Policy, Evaluation Research, Randomized Controlled Trials
Bloom, Howard S.; Michalopoulos, Charles – MDRC, 2010
This paper examines strategies for interpreting and reporting estimates of intervention effects for subgroups of a study sample. Specifically, the paper considers: why and how subgroup findings are important for applied research, the importance of pre-specifying subgroups before analyses are conducted, the importance of using existing theory and…
Descriptors: Groups, Intervention, Statistical Significance, Hypothesis Testing
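One common way to report subgroup impacts of the kind the Bloom and Michalopoulos (2010) entry discusses is a treatment-by-subgroup interaction for a subgroup specified before the analysis. The sketch below is a generic illustration on simulated data, not the paper's procedure; the subgroup indicator and effect sizes are invented for the example.

```python
# Sketch of a pre-specified subgroup contrast: does the impact differ between
# two subgroups? Tested with a treatment-by-subgroup interaction (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 4000
treat = rng.integers(0, 2, n)
subgrp = rng.integers(0, 2, n)          # a baseline characteristic fixed before analysis
y = 0.2 * treat + 0.3 * treat * subgrp + rng.normal(0, 1, n)
df = pd.DataFrame({"y": y, "treat": treat, "subgrp": subgrp})

fit = smf.ols("y ~ treat * subgrp", df).fit()
# The interaction term estimates how much the impact differs across the two subgroups.
print(fit.params["treat:subgrp"], fit.pvalues["treat:subgrp"])
```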
Peer reviewed
Gamse, Beth C.; Bloom, Howard S.; Kemple, James J.; Jacob, Robin Tepper – National Center for Education Evaluation and Regional Assistance, 2008
This executive summary describes results of the "Reading First Impact Study: Interim Report." The report presents preliminary findings from the Reading First Impact Study, a congressionally mandated evaluation of the federal government initiative to help all children read at or above grade level by the end of third grade. The No Child…
Descriptors: Reading Programs, National Programs, Program Effectiveness, Reading Comprehension
Peer reviewed
Gamse, Beth C.; Bloom, Howard S.; Kemple, James J.; Jacob, Robin Tepper – National Center for Education Evaluation and Regional Assistance, 2008
This report presents preliminary findings from the Reading First Impact Study, a congressionally mandated evaluation of the federal government initiative to help all children read at or above grade level by the end of third grade. The No Child Left Behind Act of 2001 (NCLB) established Reading First and mandated its evaluation. This document is…
Descriptors: Reading Programs, National Programs, Program Effectiveness, Reading Comprehension
Peer reviewed
Bloom, Howard S.; Weiland, Christina – Society for Research on Educational Effectiveness, 2014
Head Start is the largest publicly funded preschool program in the U.S., and one of its primary goals is to improve the school readiness of low-income children. As has been widely reported, the first randomized trial of Head Start in the program's history found some evidence that it is achieving this goal. Receiving one year of Head Start had…
Descriptors: Early Intervention, Preschool Education, Early Childhood Education, School Readiness
Peer reviewed
Garet, Michael S.; Cronen, Stephanie; Eaton, Marian; Kurki, Anja; Ludwig, Meredith; Jones, Wehmah; Uekawa, Kazuaki; Falk, Audrey; Bloom, Howard S.; Doolittle, Fred; Zhu, Pei; Sztejnberg, Laura – National Center for Education Evaluation and Regional Assistance, 2008
To help states and districts make informed decisions about the professional development (PD) they implement to improve reading instruction, the U.S. Department of Education commissioned the Early Reading PD Interventions Study to examine the impact of two research-based PD interventions for reading instruction: (1) a content-focused teacher…
Descriptors: Early Reading, Reading Instruction, Professional Development, Intervention