Peer reviewed
ERIC Number: ED652804
Record Type: Non-Journal
Publication Date: 2024-Feb-5
Pages: 51
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
Should We Account for Classrooms? Analyzing Online Experimental Data with Student-Level Randomization
Grantee Submission
Emergent technologies present platforms for educational researchers to conduct randomized controlled trials (RCTs) and collect rich data on students' performance, behavior, learning processes, and outcomes in authentic learning environments. As educational research increasingly uses methods and data collection from such platforms, it is necessary to consider the most appropriate ways to analyze these data to draw causal inferences from RCTs. Here, we examine whether and how analysis results are impacted by accounting for multilevel variance in samples from RCTs with student-level randomization. We propose and demonstrate a method that leverages auxiliary non-experimental "remnant" data collected within a learning platform to inform analysis decisions. Specifically, we compare five commonly applied analysis methods for estimating treatment effects while accounting for, or ignoring, class-level factors and observed measures of confidence and accuracy, to identify best practices under real-world conditions. We find that methods that account for groups as either fixed effects or random effects consistently outperform those that ignore group-level factors, even though randomization was applied at the student level. However, we found no meaningful differences between the use of fixed or random effects as a means of accounting for groups. We conclude that analyses of online experiments should account for the naturally nested structure of students within classes, despite the notion that student-level randomization may alleviate group-level differences. Further, we demonstrate how to use remnant data to identify appropriate methods for analyzing experiments. These findings provide practical guidelines for researchers conducting RCTs in educational technologies to make more informed decisions when approaching analyses. [This paper will be published in "Educational Technology Research and Development."]
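The core contrast the abstract describes, a naive student-level estimator versus one that accounts for class-level factors, can be sketched as follows. This is a minimal stdlib-only illustration, not the paper's actual models or data: the simulated class sizes, variances, and true effect are assumptions, and the fixed-effects estimator is implemented by within-class demeaning (equivalent to including class dummies); a random-effects variant would typically use a mixed-model library instead.

```python
import random
from statistics import mean

random.seed(0)

# Simulate 20 classes. Each class has its own baseline outcome level
# (class-level variance), and treatment is randomized at the STUDENT
# level within each class, as in the experiments the paper analyzes.
TRUE_EFFECT = 5.0  # hypothetical treatment effect, chosen for illustration
records = []
for c in range(20):
    class_baseline = random.gauss(50, 10)  # between-class variation
    for _ in range(15):
        treated = random.random() < 0.5
        y = class_baseline + TRUE_EFFECT * treated + random.gauss(0, 3)
        records.append((c, treated, y))

# Naive estimate: simple difference in treated vs. control means,
# ignoring the nesting of students within classes.
naive = (mean(y for _, t, y in records if t)
         - mean(y for _, t, y in records if not t))

# Fixed-effects estimate: demean outcome and treatment within each class,
# then fit a one-regressor OLS on the demeaned data. Class-level baseline
# differences cancel out, so only within-class variation is used.
by_class = {}
for c, t, y in records:
    by_class.setdefault(c, []).append((t, y))

num = den = 0.0
for rows in by_class.values():
    t_bar = mean(t for t, _ in rows)
    y_bar = mean(y for _, y in rows)
    for t, y in rows:
        num += (t - t_bar) * (y - y_bar)
        den += (t - t_bar) ** 2
fixed_effects = num / den

print(f"naive: {naive:.2f}  fixed-effects: {fixed_effects:.2f}  "
      f"(true effect: {TRUE_EFFECT})")
```

Both estimators are unbiased here because randomization happens within classes, but the fixed-effects estimator removes between-class noise and is therefore more precise, which mirrors the paper's finding that group-aware methods outperform those that ignore class structure even under student-level randomization.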
Related Records: EJ1448022
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: Institute of Education Sciences (ED); National Science Foundation (NSF), Graduate Research Fellowship Program (GRFP); Schmidt Futures
Authoring Institution: N/A
IES Funded: Yes
Grant or Contract Numbers: R305N230034; 1645629; 2331379
Data File: URL: https://osf.io/c8rj3/
Author Affiliations: N/A