Showing 1 to 15 of 74 results
Peer reviewed
Ralph Renger; Elias Samuels; Jessica Renger; Ellen Champagne – American Journal of Evaluation, 2025
This article presents the Renger System Test (RST) as a method for assessing whether a system evaluation approach is suitable for evaluating complex interventions. The RST has three criteria: (1) the intervention includes multiple components, (2) these components operate interdependently, and (3) their interdependence produces an outcome that no…
Descriptors: Program Evaluation, Evaluation Methods, Evaluation Criteria, Systems Approach
Peer reviewed
Chen, Wei-Ren; Chen, Mei-Fang – Gifted Education International, 2020
The ultimate goal of gifted education programs is to cultivate students' competences through challenging, enriching, and engaging opportunities for talent development. The purpose of this review is to present two main approaches to enrichment programs for gifted learners in Taiwan: the programs following the law and the alternative programs…
Descriptors: Program Evaluation, Gifted, Talent, Talent Development
Peer reviewed
Goldhaber, Dan; Koedel, Cory – American Educational Research Journal, 2019
In the summer of 2013, the National Council on Teacher Quality (NCTQ) issued public ratings of teacher education programs. We provide the first empirical examination of NCTQ ratings, beginning with a descriptive overview of the ratings and how they evolved from 2013 to 2016. We also report on results from an information experiment built around the…
Descriptors: Accountability, Teacher Education Programs, Intervention, Program Evaluation
Peer reviewed
PDF on ERIC
What Works Clearinghouse, 2017
The What Works Clearinghouse (WWC) evaluates research studies that look at the effectiveness of education programs, products, practices, and policies, which the WWC calls "interventions." Many studies of education interventions make claims about impacts on students' outcomes. Some studies have designs that enable readers to make causal…
Descriptors: Program Design, Program Development, Program Effectiveness, Program Evaluation
Peer reviewed
Taylor, Joseph A.; Davis, Elisabeth; Michaelson, Laura E. – Review of Research in Education, 2021
In this chapter, we describe and compare the standards for evidence used by three entities that review studies of education interventions: Blueprints for Healthy Youth Development, Social Programs that Work, and the What Works Clearinghouse. Based on direct comparisons of the evidence frameworks, we identify key differences in the level at which…
Descriptors: Evidence, Standards, Educational Research, Intervention
Peer reviewed
Gill, Kamaldeep; Thompson-Hodgetts, Sandra; Rasmussen, Carmen – Journal of Occupational Therapy, Schools & Early Intervention, 2018
This review evaluated the strength of evidence for the effectiveness, feasibility, and appropriateness of the Alert Program®. Multiple databases were systematically searched for peer-reviewed, English-language articles that evaluated the Alert Program®. Six articles met the inclusion criteria. The strength of evidence ranged from weak to moderate using the…
Descriptors: Program Evaluation, Program Effectiveness, Self Control, Occupational Therapy
Peer reviewed
PDF on ERIC
What Works Clearinghouse, 2017
The What Works Clearinghouse (WWC) Standards Briefs explain to practitioners, researchers, and policymakers the rules the WWC uses to evaluate the quality of studies. An aspect of a study is considered a confounding factor if it is not possible to tell whether the difference in outcomes is due to the intervention, the confounding factor, or both.…
Descriptors: Educational Research, Evaluation Criteria, Intervention, Research Reports
Peer reviewed
Munter, Charles; Wilhelm, Anne Garrison; Cobb, Paul; Cordray, David S. – Journal of Research on Educational Effectiveness, 2014
This article draws on previously employed methods for conducting fidelity studies and applies them to an evaluation of an unprescribed intervention. We document the process of assessing the fidelity of implementation of the Math Recovery first-grade tutoring program, an unprescribed, diagnostic intervention. We describe how we drew on recent…
Descriptors: Intervention, Program Implementation, Mathematics Education, Educational Diagnosis
Peer reviewed
Amanda J. Neitzel; Qiyang Zhang; Robert E. Slavin – Society for Research on Educational Effectiveness, 2021
Background: Over the years, the quantity and quality of educational research have been rapidly improving. This can be attributed to the growing call by policymakers and practitioners to use evidence of effectiveness in decision-making. In fact, evidence sufficient to establish programs as "small", "moderate", or…
Descriptors: Meta Analysis, Evidence Based Practice, Elementary Secondary Education, Educational Legislation
Peer reviewed
Stockard, Jean; Wood, Timothy W. – American Journal of Evaluation, 2017
Most evaluators have embraced the goal of evidence-based practice (EBP). Yet, many have criticized EBP review systems that prioritize randomized control trials and use various criteria to limit the studies examined. They suggest this could produce policy recommendations based on small, unrepresentative segments of the literature and recommend a…
Descriptors: Best Practices, Evidence Based Practice, Criticism, Randomized Controlled Trials
Peer reviewed
Johansson, Per; Lindahl, Erica – Evaluation Review, 2012
Objective: In this article, we estimate the effect of a multidisciplinary collaboration program on the length of sickness absence. The intention with the program was to avoid long-term sickness absence by providing an early and holistic evaluation of the sick-listed individuals' conditions. The target group was individuals who were at risk of…
Descriptors: Holistic Evaluation, Health Services, Chronic Illness, At Risk Persons
Peer reviewed
Boud, David; Soler, Rebeca – Assessment & Evaluation in Higher Education, 2016
Sustainable assessment has been proposed as an idea that focused on the contribution of assessment to learning beyond the timescale of a given course. It was identified as an assessment that meets the needs of the present in terms of the demands of formative and summative assessment, but which also prepares students to meet their own future…
Descriptors: Sustainability, Higher Education, Formative Evaluation, Summative Evaluation
Peer reviewed
PDF on ERIC
Sharp, Laurie A. – Texas Journal of Literacy Education, 2016
On December 10, 2015, the Every Student Succeeds Act of 2015 (ESSA) was signed by President Barack Obama and became the United States' current national education law (United States Department of Education [U.S. DOE], n.d.). The ESSA was a long overdue reauthorization of the Elementary and Secondary Education Act of 1965 (ESEA). Unlike previous…
Descriptors: Educational Legislation, Federal Legislation, Program Descriptions, Educational Policy
Peer reviewed
Murray, Rowena; Cunningham, Everarda – Studies in Higher Education, 2011
Academics are expected to write for publication and meet publication targets in research assessment processes. These targets are set by national bodies and institutions, and they can be daunting for academics at the start of a research career. This article reports on an intervention designed to address this issue, writer's retreat, where academics…
Descriptors: Foreign Countries, Researchers, Intervention, Faculty Publishing
Peer reviewed
Stockard, Jean – Current Issues in Education, 2010
A large body of literature documents the central importance of fidelity of program implementation in creating an internally valid research design and considering such fidelity in judgments of research quality. The What Works Clearinghouse (WWC) provides web-based summary ratings of educational innovations and is the only rating group that is…
Descriptors: Research Design, Educational Innovation, Program Implementation, Program Effectiveness