Showing all 4 results
Tipton, Elizabeth; Olsen, Robert B. – Educational Researcher, 2018
School-based evaluations of interventions are increasingly common in education research. Ideally, the results of these evaluations are used to make evidence-based policy decisions for students. However, it is difficult to make generalizations from these evaluations because the types of schools included in the studies are typically not selected…
Descriptors: Intervention, Educational Research, Decision Making, Evidence Based Practice
Peer reviewed
PDF on ERIC: Download full text
Tipton, Elizabeth; Yeager, David; Iachan, Ronaldo – Society for Research on Educational Effectiveness, 2016
Questions regarding the generalizability of results from educational experiments have been at the forefront of methods development over the past five years. This work has focused on methods for estimating the effect of an intervention in a well-defined inference population (e.g., Tipton, 2013; O'Muircheartaigh and Hedges, 2014); methods for…
Descriptors: Behavioral Sciences, Behavioral Science Research, Intervention, Educational Experiments
Peer reviewed
PDF on ERIC: Download full text
Tipton, Elizabeth; Fellers, Lauren; Caverly, Sarah; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Ruiz de Castillo, Veronica – Society for Research on Educational Effectiveness, 2015
Randomized experiments are commonly used to evaluate whether particular interventions improve student achievement. While these experiments can establish that a treatment actually "causes" changes, the participants are typically not randomly selected from a well-defined population, and therefore the results do not readily generalize. Three…
Descriptors: Site Selection, Randomized Controlled Trials, Educational Experiments, Research Methodology
Peer reviewed
Direct link
Taylor, Joseph A.; Roth, Kathleen; Wilson, Christopher D.; Stuhlsatz, Molly A. M.; Tipton, Elizabeth – Journal of Research on Educational Effectiveness, 2017
This article describes the effects of an analysis-of-practice professional development (PD) program on elementary school students' (Grades 4-6) science outcomes. The study design was a cluster-randomized trial with an analysis sample of 77 schools, 144 teachers, and 2,823 students. Forty-two schools were randomly assigned to treatment (88.5 hours)…
Descriptors: Faculty Development, Elementary School Science, Science Achievement, Elementary School Students