ERIC Number: ED513388
Record Type: Non-Journal
Publication Date: 2010
Pages: 11
Abstractor: ERIC
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
Treatment Effect Heterogeneity in a Science Professional Development Initiative: The Case for School Capacity
Bruch, Sarah; Grigg, Jeffrey; Hanselman, Paul
Society for Research on Educational Effectiveness
This study focuses on how the treatment effects of a teacher professional development initiative in science differed by school capacity; in other words, the authors are primarily concerned with treatment effect heterogeneity. As such, this paper complements the ongoing evaluation of the initiative's average treatment effects over time. The research question considered here is: Did existing school capacity account for heterogeneity in teacher and student outcomes? That is, did treatment effects differ for schools with low, average, or high capacity? Specifically, the authors consider two outcomes: teachers' reported adoption of the targeted curriculum and students' subsequent achievement scores on standardized science tests. Although the primary focus is on student outcomes, teacher behaviors are informative because they represent a necessary mechanism in the causal process that is likely influenced by school capacity. The demonstrated importance and variability of school capacity carry clear policy implications. The results call for greater attention to the school pre-conditions underlying educational interventions, particularly given that the average school in this study did not have the capacity to respond successfully to this intensive professional development initiative. There are two clear implications for educational evaluation. One is to direct attention to rigorous causal evaluation of school capacity building, especially given that capacity can trump the interventions more commonly subjected to experimental testing; indeed, the current study's design does not allow the authors to make any causal claims about what works in that arena. The other is that more effectiveness evaluations should explicitly consider school capacity as an important moderating dimension. The authors demonstrate that school capacity, as conceptualized in the school organizations tradition, is a meaningful tool for opening up the "black box" of a randomized professional development evaluation, and their methodology would be relatively easy to replicate. (Contains 1 table and 3 figures.)
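Note: The record does not include the authors' model specification. A moderation (treatment effect heterogeneity) analysis of this kind is, however, commonly estimated as a treatment-by-moderator interaction, and the following minimal sketch illustrates that idea only. The variable names, data file, and estimator below are assumptions for illustration, not drawn from the paper.

```python
# Minimal sketch of a treatment-by-capacity heterogeneity test.
# All names here are hypothetical: 'treat' = random assignment to the
# professional development initiative, 'capacity' = a standardized
# pre-treatment school capacity index, 'science_score' = school mean
# score on the standardized science test.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("schools.csv")  # illustrative file name

# The coefficient on treat:capacity estimates how the treatment effect
# varies with pre-existing school capacity; a nonzero interaction is
# evidence of treatment effect heterogeneity by capacity.
model = smf.ols("science_score ~ treat * capacity", data=df).fit(
    cov_type="HC1"  # heteroskedasticity-robust standard errors
)
print(model.summary())
```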
Descriptors: Science Tests, Inservice Teacher Education, Science Instruction, Program Evaluation, Outcomes of Education, Institutional Characteristics, Educational Environment, Predictor Variables, Science Achievement, Adoption (Ideas), Science Curriculum, Curriculum Implementation, Teacher Behavior, Educational Policy, Intervention, Educational Assessment, Evaluation Methods, Barriers, Program Implementation, Urban Schools, Elementary School Students, Elementary School Teachers, Teacher Surveys, Comparative Analysis
Society for Research on Educational Effectiveness. 2040 Sheridan Road, Evanston, IL 60208. Tel: 202-495-0920; Fax: 202-640-4401; e-mail: inquiries@sree.org; Web site: http://www.sree.org
Publication Type: Reports - Research
Education Level: Adult Education; Elementary Education; Elementary Secondary Education; Grade 4; Grade 5; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Society for Research on Educational Effectiveness (SREE)
Identifiers - Location: California
Grant or Contract Numbers: N/A
Author Affiliations: N/A