Showing 1 to 15 of 21 results
Peer reviewed | PDF on ERIC
What Works Clearinghouse, 2022
Education decision makers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time-consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Peer reviewed | PDF on ERIC
What Works Clearinghouse, 2020
The What Works Clearinghouse (WWC) systematic review process is the basis of many of its products, enabling the WWC to use consistent, objective, and transparent standards and procedures in its reviews, while also ensuring comprehensive coverage of the relevant literature. The WWC systematic review process consists of five steps: (1) Developing…
Descriptors: Educational Research, Evaluation Methods, Research Reports, Standards
Steiner, Peter M.; Kim, Yongnam; Hall, Courtney E.; Su, Dan – Sociological Methods & Research, 2017
Randomized controlled trials (RCTs) and quasi-experimental designs like regression discontinuity (RD) designs, instrumental variable (IV) designs, and matching and propensity score (PS) designs are frequently used for inferring causal effects. It is well known that the features of these designs facilitate the identification of a causal estimand…
Descriptors: Graphs, Causal Models, Quasiexperimental Design, Randomized Controlled Trials
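The causal-graph perspective discussed in this entry can be illustrated with a minimal sketch: a DAG encoded as plain adjacency lists, contrasting a randomized design with a confounded observational one. The node names (R, Z, U, Y) are illustrative assumptions, not the authors' notation.

```python
# Directed edges of two simple causal graphs, as adjacency lists.
rct = {
    "R": ["Z"],          # randomizer R assigns treatment Z
    "Z": ["Y"],          # treatment Z affects outcome Y
    "U": ["Y"],          # unobserved U affects Y but not Z: no confounding
}
observational = {
    "Z": ["Y"],
    "U": ["Z", "Y"],     # U is a common cause of Z and Y: confounding
}

def parents(graph, node):
    """Return the sorted parents of `node` in an adjacency-list DAG."""
    return sorted(p for p, children in graph.items() if node in children)

# In the RCT, Z's only parent is the randomizer; in the observational
# graph, Z shares the parent U with Y, opening a back-door path.
```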
Peer reviewed | PDF on ERIC
What Works Clearinghouse, 2017
The What Works Clearinghouse (WWC) systematic review process is the basis of many of its products, enabling the WWC to use consistent, objective, and transparent standards and procedures in its reviews, while also ensuring comprehensive coverage of the relevant literature. The WWC systematic review process consists of five steps: (1) Developing…
Descriptors: Educational Research, Evaluation Methods, Research Reports, Standards
Peer reviewed | PDF on ERIC
Kim, Yongnam; Steiner, Peter M.; Hall, Courtney E.; Su, Dan – Society for Research on Educational Effectiveness, 2016
Experimental and quasi-experimental designs play a central role in estimating cause-effect relationships in education, psychology, and many other fields of the social and behavioral sciences. This paper presents and discusses the causal graphs of experimental and quasi-experimental designs. For quasi-experimental designs the authors demonstrate…
Descriptors: Graphs, Quasiexperimental Design, Randomized Controlled Trials, Regression (Statistics)
Peer reviewed | PDF on ERIC
What Works Clearinghouse, 2020
The What Works Clearinghouse (WWC) is an initiative of the U.S. Department of Education's Institute of Education Sciences (IES), which was established under the Education Sciences Reform Act of 2002. It is an important part of IES's strategy to use rigorous and relevant research, evaluation, and statistics to improve the nation's education system.…
Descriptors: Educational Research, Evaluation Methods, Evidence, Statistical Significance
Peer reviewed | Direct link
Hitchcock, John H.; Johnson, R. Burke; Schoonenboom, Judith – Research in the Schools, 2018
The central purpose of this article is to provide an overview of the many ways in which special educators can generate and think about causal inference to inform policy and practice. Consideration of causality across different lenses can be carried out by engaging in multiple method and mixed methods ways of thinking about inference. This article…
Descriptors: Causal Models, Statistical Inference, Special Education, Educational Research
Peer reviewed | Direct link
Kim, Yongnam; Steiner, Peter – Educational Psychologist, 2016
When randomized experiments are infeasible, quasi-experimental designs can be exploited to evaluate causal treatment effects. The strongest quasi-experimental designs for causal inference are regression discontinuity designs, instrumental variable designs, matching and propensity score designs, and comparative interrupted time series designs. This…
Descriptors: Quasiexperimental Design, Causal Models, Statistical Inference, Randomized Controlled Trials
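Of the quasi-experimental designs this entry lists, the sharp regression discontinuity design lends itself to a compact sketch: fit separate local linear trends on each side of the cutoff and compare their predictions at the cutoff. This is a simplified illustration on simulated data, not the paper's method; the cutoff, bandwidth, and data-generating process are assumptions.

```python
import numpy as np

def sharp_rd_estimate(running, outcome, cutoff, bandwidth):
    """Sharp RD estimate: local linear fits on each side of the cutoff,
    differenced at the cutoff, using observations within the bandwidth."""
    near = np.abs(running - cutoff) <= bandwidth
    treated = near & (running >= cutoff)
    control = near & (running < cutoff)
    fit_t = np.polyfit(running[treated], outcome[treated], 1)
    fit_c = np.polyfit(running[control], outcome[control], 1)
    return np.polyval(fit_t, cutoff) - np.polyval(fit_c, cutoff)

# Simulated data: treatment at the cutoff raises the outcome by 5.
rng = np.random.default_rng(0)
x = rng.uniform(0, 100, 5000)                     # running variable
y = 0.2 * x + 5 * (x >= 50) + rng.normal(0, 1, 5000)
effect = sharp_rd_estimate(x, y, cutoff=50, bandwidth=5)
```

A plain local difference in means would be biased here by the slope of the running variable; fitting a trend on each side removes that bias under the linearity assumption.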
Peer reviewed | PDF on ERIC
Fleisch, Brahm; Taylor, Stephen; Schöer, Volker; Mabogoane, Thabo – South African Journal of Education, 2017
This article illustrates the value of large-scale impact evaluations with counterfactual components. It begins by exploring the limitations of small-scale impact studies, which do not allow reliable inference to a wider population or which do not use valid comparison groups. The paper then describes the design features of a recent large-scale…
Descriptors: Randomized Controlled Trials, Reading Programs, Educational Improvement, Improvement Programs
Peer reviewed | PDF on ERIC
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
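The design-based estimator this entry describes reduces, in the simplest completely randomized case, to a difference in group means with the conservative Neyman variance. The sketch below is a minimal illustration of that building block on simulated outcomes, not the report's full methodology.

```python
import numpy as np

def neyman_estimate(y_treat, y_ctrl):
    """Difference-in-means impact estimate and the conservative Neyman
    standard error for a completely randomized experiment."""
    impact = y_treat.mean() - y_ctrl.mean()
    var = (y_treat.var(ddof=1) / len(y_treat)
           + y_ctrl.var(ddof=1) / len(y_ctrl))
    return impact, np.sqrt(var)

# Simulated outcomes with a true impact of 2.
rng = np.random.default_rng(1)
treat = rng.normal(10, 2, 200)   # treatment-group outcomes
ctrl = rng.normal(8, 2, 200)     # control-group outcomes
impact, se = neyman_estimate(treat, ctrl)
```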
Peer reviewed | PDF on ERIC
Kautz, Tim; Schochet, Peter Z.; Tilley, Charles – National Center for Education Evaluation and Regional Assistance, 2017
A new design-based theory has recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs) for a wide range of designs used in social policy research (Imbens & Rubin, 2015; Schochet, 2016). These methods use the potential outcomes framework and known features of study designs…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Peer reviewed | PDF on ERIC
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies (Imbens and Rubin, 2015; Schochet, 2015, 2016). The estimators are derived using the building blocks of experimental designs with minimal assumptions, and are unbiased and normally distributed in large samples…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Wolbers, Kimberly; Dostal, Hannah; Graham, Steve; Branum-Martin, Lee; Kilpatrick, Jennifer; Saulsburry, Rachel – Grantee Submission, 2018
A quasi-experimental study was conducted to examine the impact of Strategic and Interactive Writing Instruction on 3rd-5th grade deaf and hard of hearing students' writing and written language compared to a business-as-usual condition (treatment group N = 41, comparison group N = 22). A total of 18 hours of instruction was provided for each of two…
Descriptors: Elementary School Students, Grade 3, Grade 4, Grade 5
Peer reviewed | Direct link
Deater-Deckard, Kirby – International Journal of Behavioral Development, 2016
Most of the individual difference variance in the population is found "within" families, yet studying the processes causing this variation is difficult due to confounds between genetic and nongenetic influences. Quasi-experiments can be used to test hypotheses regarding environment exposure (e.g., timing, duration) while controlling for…
Descriptors: Quasiexperimental Design, Genetics, Short Term Memory, Individual Differences
Peer reviewed | Direct link
Cheung, Alan C. K.; Slavin, Robert E. – Educational Researcher, 2016
As evidence becomes increasingly important in educational policy, it is essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. A total of 645 studies from 12 recent reviews of evaluations of preschool, reading, mathematics, and science programs were studied. Effect…
Descriptors: Effect Size, Research Methodology, Research Design, Preschool Evaluation
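The effect sizes compared in reviews like this one are typically standardized mean differences; a common variant is Hedges' g, which applies a small-sample correction to Cohen's d. A minimal sketch, with hypothetical study summary statistics (not drawn from the reviewed studies):

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference (Cohen's d on the pooled SD)
    with the Hedges small-sample correction."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    d = (mean_t - mean_c) / pooled_sd
    correction = 1 - 3 / (4 * (n_t + n_c) - 9)
    return d * correction

# Hypothetical study: treatment mean 105 vs. control mean 100, SD 15.
g = hedges_g(105.0, 100.0, 15.0, 15.0, 60, 60)   # about 0.33
```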