Showing all 8 results
Peer reviewed
Shen, Zuchao; Kelcey, Benjamin – Journal of Experimental Education, 2022
Optimal design of multisite randomized trials leverages sampling costs to optimize sampling ratios and ultimately identify more efficient and powerful designs. Past implementations of the optimal design framework have assumed that costs of sampling units are equal across treatment conditions. In this study, we developed a more flexible optimal…
Descriptors: Randomized Controlled Trials, Sampling, Research Design, Statistical Analysis
Peer reviewed
Shen, Zuchao; Kelcey, Benjamin – Journal of Research on Educational Effectiveness, 2022
Optimal sampling frameworks attempt to identify the most efficient sampling plans to achieve adequate statistical power. Although such calculations are theoretical in nature, they are critical to the judicious use of funding because they serve as important starting points that guide practical discussions around sampling tradeoffs and…
Descriptors: Sampling, Research Design, Randomized Controlled Trials, Statistical Analysis
Peer reviewed
Shen, Zuchao; Kelcey, Benjamin – Journal of Educational and Behavioral Statistics, 2020
Conventional optimal design frameworks consider a narrow range of sampling cost structures that thereby constrict their capacity to identify the most powerful and efficient designs. We relax several constraints of previous optimal design frameworks by allowing for variable sampling costs in cluster-randomized trials. The proposed framework…
Descriptors: Sampling, Research Design, Randomized Controlled Trials, Statistical Analysis
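The three entries above extend cost-aware optimal design beyond the classic equal-cost case. As background for that classic case only, here is a minimal Python sketch of the well-known result for a two-level cluster randomized trial, often attributed to Raudenbush (1997): when the cost of recruiting a cluster and the cost per individual are the same in both conditions, the variance-minimizing cluster size depends only on the cost ratio and the intraclass correlation. The budget, cost figures, and ICC below are illustrative assumptions, not values from these papers, and the unequal-cost extensions the abstracts describe are not reproduced here.

```python
import math

def optimal_cluster_size(cost_per_cluster, cost_per_person, icc):
    """Classic cost-optimal cluster size for a two-level cluster
    randomized trial, assuming sampling costs are equal across
    treatment and control conditions."""
    return math.sqrt((cost_per_cluster / cost_per_person) * (1 - icc) / icc)

def clusters_within_budget(budget, cost_per_cluster, cost_per_person, n):
    """Number of clusters affordable once the cluster size is fixed."""
    return budget // (cost_per_cluster + cost_per_person * n)

# Illustrative inputs: $300 to recruit a school, $30 per student, ICC = 0.20
n_star = optimal_cluster_size(300, 30, 0.20)   # roughly 6 students per school
J = clusters_within_budget(50_000, 300, 30, round(n_star))
print(round(n_star, 1), J)
```

Under these assumptions the optimal cluster size is small because the ICC is relatively large; the remaining budget then determines how many clusters can be recruited.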
Peer reviewed
Spybrook, Jessaca; Kelcey, Benjamin; Dong, Nianbo – Journal of Educational and Behavioral Statistics, 2016
Recently, there has been an increase in the number of cluster randomized trials (CRTs) to evaluate the impact of educational programs and interventions. These studies are often powered for the main effect of treatment to address the "what works" question. However, program effects may vary by individual characteristics or by context,…
Descriptors: Randomized Controlled Trials, Statistical Analysis, Computation, Educational Research
Peer reviewed
Dong, Nianbo; Kelcey, Benjamin; Spybrook, Jessaca – Journal of Experimental Education, 2018
Researchers are often interested in whether the effects of an intervention differ conditional on individual- or group-level moderator variables such as children's characteristics (e.g., gender), teachers' background (e.g., years of teaching), and schools' characteristics (e.g., urbanicity); that is, researchers seek to examine for whom and under what…
Descriptors: Statistical Analysis, Randomized Controlled Trials, Intervention, Effect Size
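The two moderation entries above ultimately reduce power analysis to the same final step: evaluating a t test for the treatment-by-moderator interaction given its standard error and degrees of freedom. The sketch below shows only that generic noncentral-t step; the design-specific standard-error and degrees-of-freedom formulas for individual- and cluster-level moderators are what these papers derive and are not reproduced here. The effect size, standard error, and degrees of freedom in the example are hypothetical.

```python
from scipy.stats import t, nct

def power_two_tailed(effect, se, df, alpha=0.05):
    """Power of a two-tailed t test for an effect (e.g., a
    treatment-by-moderator interaction), given the standard error
    and degrees of freedom implied by a particular design."""
    ncp = effect / se                     # noncentrality parameter
    t_crit = t.ppf(1 - alpha / 2, df)     # two-tailed critical value
    return (1 - nct.cdf(t_crit, df, ncp)) + nct.cdf(-t_crit, df, ncp)

# Illustrative only: a 0.25 SD interaction with SE = 0.10 and 36 df
print(round(power_two_tailed(0.25, 0.10, 36), 2))
```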
Peer reviewed
Shen, Zuchao; Kelcey, Benjamin; Cox, Kyle T.; Zhang, Jiaqi – AERA Online Paper Repository, 2017
Recent studies show that cluster randomized trials may be well powered to detect mediation or indirect effects in multilevel settings. However, the literature has rarely provided guidance on designing cluster randomized trials that aim to assess indirect effects. In this study, we developed closed-form expressions to estimate the variance of and the statistical…
Descriptors: Randomized Controlled Trials, Research Design, Context Effect, Statistical Analysis
Peer reviewed
Kelcey, Benjamin; Dong, Nianbo; Spybrook, Jessaca; Cox, Kyle – Journal of Educational and Behavioral Statistics, 2017
Designs that facilitate inferences concerning both the total and indirect effects of a treatment potentially offer a more holistic description of interventions because they can complement "what works" questions with the comprehensive study of the causal connections implied by substantive theories. Mapping the sensitivity of designs to…
Descriptors: Statistical Analysis, Randomized Controlled Trials, Mediation Theory, Models
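Both mediation entries above concern power to detect indirect effects in group-randomized designs. As a generic point of reference only, the sketch below uses a first-order (Sobel-type) delta-method standard error with a normal approximation; it is not the closed-form multilevel expressions these papers develop, and the path coefficients and standard errors are hypothetical.

```python
import math
from scipy.stats import norm

def sobel_power(a, se_a, b, se_b, alpha=0.05):
    """Approximate power for an indirect effect a*b, using the
    first-order (Sobel) delta-method standard error and a normal
    reference distribution."""
    se_ab = math.sqrt(a**2 * se_b**2 + b**2 * se_a**2)
    z = (a * b) / se_ab                   # standardized indirect effect
    z_crit = norm.ppf(1 - alpha / 2)
    return norm.cdf(z - z_crit) + norm.cdf(-z - z_crit)

# Hypothetical paths: treatment -> mediator (a) and mediator -> outcome (b)
print(round(sobel_power(a=0.40, se_a=0.12, b=0.30, se_b=0.10), 2))
```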
Peer reviewed
Spybrook, Jessaca; Shi, Ran; Kelcey, Benjamin – International Journal of Research & Method in Education, 2016
This article examines the statistical precision of cluster randomized trials (CRTs) funded by the Institute of Education Sciences (IES). Specifically, it compares the total number of clusters randomized and the minimum detectable effect size (MDES) of two sets of studies, those funded in the early years of IES (2002-2004) and those funded in the…
Descriptors: Randomized Controlled Trials, Federal Aid, Public Agencies, Comparative Analysis
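The entry above compares funded studies on their minimum detectable effect size (MDES). For readers unfamiliar with the quantity, a common Bloom-style approximation for a simple two-level cluster randomized trial is sketched below; the design parameters in the example are illustrative assumptions, not figures reported in the article.

```python
from scipy.stats import t

def mdes_2level_crt(J, n, icc, P=0.5, R2_2=0.0, R2_1=0.0,
                    alpha=0.05, power=0.80, g=0):
    """Minimum detectable effect size for a two-level cluster randomized
    trial: J clusters of size n, proportion P assigned to treatment,
    intraclass correlation `icc`, R-squared values for cluster- and
    individual-level covariates, and g cluster-level covariates."""
    df = J - g - 2
    multiplier = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)
    var_term = (icc * (1 - R2_2) / (P * (1 - P) * J)
                + (1 - icc) * (1 - R2_1) / (P * (1 - P) * J * n))
    return multiplier * var_term ** 0.5

# Example: 40 schools of 60 students each, ICC = 0.20, no covariates
print(round(mdes_2level_crt(J=40, n=60, icc=0.20), 2))
```

With these assumed inputs the MDES is roughly 0.4 standard deviations, which illustrates why adding clusters, rather than students per cluster, is usually the more effective way to improve precision when the ICC is nontrivial.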