ERIC Number: ED658549
Record Type: Non-Journal
Publication Date: 2022-Sep-22
Pages: N/A
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
Impact of Video-Based Analysis-of-Practice Professional Learning: Comparing Online and Face-to-Face Models
Susan Kowalski; Amy Belcastro; Connie Hvidsten; Guy Ollison; Karen Askinas; Renee De Vaul; Gillian Roehrig
Society for Research on Educational Effectiveness
Background/Context: Video-based professional learning (PL) models for math and science teachers have gained prominence over the past two decades (Borko, Koellner, Jacobs, & Seago, 2011; Sherin, 2003), with demonstrated impacts on teacher and student outcomes (Sun & van Es, 2015; Kersting, Givvin, Sotelo, & Stigler, 2010; Kersting, Givvin, Thompson, Santagata, & Stigler, 2012; Seago, Jacobs, Heck, Nelson, & Malzahn, 2014). "Science Teachers Learning from Lesson Analysis" (STeLLA) is one such model. STeLLA, a face-to-face analysis-of-practice PL model for elementary science teachers, has demonstrated positive impacts on teachers and students alike in a cluster randomized trial (Taylor et al., 2017; Roth et al., 2019). The What Works Clearinghouse identified STeLLA as a program that demonstrates positive impact on students. The face-to-face STeLLA model is effective but costly, with a two-week summer institute and eight small-group sessions throughout the school year. The need for STeLLA facilitators to travel for the PL restricts the face-to-face model to large urban and suburban school districts that can afford the expense. It also precludes participation by some teachers who may find it difficult to attend the summer institute. However, face-to-face PL more easily allows teachers to establish the high levels of trust essential to analyzing videos of their peers (van Es, 2012; Zhang, Lundeberg, Koehler, & Eberhardt, 2011). From a policy perspective, an online STeLLA PL model would have several benefits. Teachers could enroll regardless of their district's location and could participate for six flexible hours per week over the entire summer, rather than during a more intensive two full weeks. Furthermore, with no travel or food expenses, the total cost of the PL can be substantially reduced.
Although the potential benefits of an online STeLLA model are enticing, an unanswered question was whether an online STeLLA model could achieve impacts on students similar to those of the face-to-face model. If the impacts wane substantially in an online offering, then the cost savings and practicality of the online version would be less compelling. Purpose/Objective/Research Question: The purpose of this study was to design and test a fully online version of the STeLLA PL model with the overarching question, "Are the online and face-to-face STeLLA models similarly effective?" We wanted to determine whether the online version is a viable option for expanding the reach of the highly effective face-to-face STeLLA PL. We hypothesized that there would be little or no reduction in the impact of STeLLA on student outcomes when shifting from the face-to-face to the online model. Setting: The research took place online in the Canvas learning management system, through Zoom videoconferences, and in the classrooms of teachers across 12 states. Teachers participated in online PL, enacted STeLLA lessons and instructional strategies with their students in their own classes, filmed their instruction, and administered student assessments. We leveraged data from the cluster randomized trial of the face-to-face model for comparison, collected in Colorado between 2011 and 2013. Population/Participants/Subjects: Table 1 describes the participants. Intervention/Program/Practice: Both the face-to-face and online STeLLA PL models required approximately 90 hours of PL across a calendar year. Both had similar instructional goals in the summer, fall, and winter components. Table 2 provides more detail on the components of each intervention and how they were administered. Research Design: We used a quasi-experimental research design, comparing new data collected from the online STeLLA PL program to data collected in the original face-to-face STeLLA PL cluster randomized trial.
We used identical curriculum modules and instruments across the two versions to minimize bias. The assessment developer was external to the online project, and PL facilitators did not have access to the student assessment. Data Collection and Analysis: Student assessments (person reliability = 0.82; person separation = 2.11) were paper-based. We analyzed the impact of the modality of STeLLA PL on student learning using a two-level model, with students nested within teachers. Student pretest data served as a level 1 covariate, mean pretest score by teacher was a level 2 covariate, and treatment condition was included as a level 2 explanatory variable. Figure 1 shows the analytic models, which were identical to those used in the original CRT (Taylor et al., 2017). We also estimated the effect size for student impacts ([delta subscript T], Figure 2). Findings/Results: We found no significant difference in student achievement between groups. Students of teachers who experienced face-to-face STeLLA were not significantly different at posttest from students of teachers who experienced online STeLLA (p = 0.185). Table 3 provides the descriptive statistics and Table 4 provides the model-based results. Conclusions: Video-based analysis-of-practice models are often complex and intensive, requiring skilled facilitation to support teachers in "noticing" key elements of instruction captured on video. The approach also requires high levels of trust: teachers do not engage in constructive critique of other teachers' videos unless a high degree of trust has been established (Beisiegel, Mitchell, & Hill, 2018). Nevertheless, these data suggest it is possible to construct a fully online version of a complex video-based analysis-of-practice PL model that retains its impact on students while providing greater accessibility for teachers at a lower cost for districts.
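The two-level analytic model described above (students at level 1 nested within teachers at level 2, with student pretest as a level 1 covariate, teacher-mean pretest as a level 2 covariate, and treatment condition as a level 2 explanatory variable) can be sketched as a random-intercept mixed model. This is an illustrative reconstruction only: the data below are simulated, the variable names are hypothetical, and the study's actual specification is the one shown in its Figure 1.

```python
# A minimal sketch of a two-level (students-within-teachers) model,
# assuming a random teacher intercept. All data are simulated;
# "pre", "post", "treat", and "teacher" are illustrative names,
# not the study's actual variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_teachers, n_students = 40, 20
rows = []
for t in range(n_teachers):
    treat = t % 2                      # 0 = face-to-face, 1 = online (level 2)
    teacher_eff = rng.normal(0, 0.5)   # random teacher intercept
    pre = rng.normal(50, 10, n_students)
    post = 10 + 0.8 * pre + 0.5 * treat + teacher_eff + rng.normal(0, 5, n_students)
    for i in range(n_students):
        rows.append({"teacher": t, "treat": treat, "pre": pre[i], "post": post[i]})
df = pd.DataFrame(rows)
# Level 2 covariate: teacher-mean pretest score.
df["teacher_mean_pre"] = df.groupby("teacher")["pre"].transform("mean")

# Random-intercept model: students (level 1) nested within teachers (level 2).
model = smf.mixedlm("post ~ pre + teacher_mean_pre + treat",
                    df, groups=df["teacher"])
result = model.fit()
print(result.params["treat"])  # estimated modality coefficient at level 2
```

In this setup, the coefficient on `treat` plays the role of the treatment-condition effect the study tested (face-to-face vs. online modality), and a non-significant estimate would correspond to the "no significant difference" finding reported above.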
This study suggests it may be possible for PL developers and providers to create more cost-effective and scalable online versions of complex PL models. One limitation of the study is the lack of random assignment; it is unclear whether teachers in each modality preferred that modality. Another limitation relates to the timing of the online STeLLA study. The online PL began in the summer of 2020, so students receiving instruction from teachers in the online STeLLA PL model were learning science during the most disrupted school year we have ever witnessed. We restricted student data collection to students who learned science in a face-to-face setting (similar to the experience of students in the original CRT of the face-to-face STeLLA PL). That is, while we wanted to test whether teachers could learn in an online format, we did not intend to test whether students could learn in an online format. Further study is needed.
Descriptors: Electronic Learning, In Person Learning, Comparative Analysis, Video Technology, Evaluation, Teacher Education, Elementary School Science, Science Teachers, Elementary School Teachers, Expenditures, Program Development, Summer Programs, Program Effectiveness
Society for Research on Educational Effectiveness. 2040 Sheridan Road, Evanston, IL 60208. Tel: 202-495-0920; e-mail: contact@sree.org; Web site: https://www.sree.org/
Publication Type: Reports - Research
Education Level: Elementary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Society for Research on Educational Effectiveness (SREE)
Grant or Contract Numbers: N/A
Author Affiliations: N/A