ERIC Number: ED677763
Record Type: Non-Journal
Publication Date: 2025-Oct-8
Pages: N/A
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: 0000-00-00
Reducing Publication Bias in Meta Analyses by Excluding Low-Powered Studies
Gracie Hayes; Larry Hedges
Society for Research on Educational Effectiveness
Context: The notion that studies reporting non-significant results may be less likely to be published creates a publication selection problem, which distorts meta-analytic effect size estimation by introducing publication bias. The damaging consequence of publication bias in a meta-analysis is inflation of the magnitude of estimated effect sizes, leading the researcher to overestimate the positive or negative effects of a given treatment. The task of the meta-analyst is therefore to understand what effect sizes would have been available, and thus to estimate an unbiased effect size, as if the publication selection phenomenon were not present. Accordingly, methods have been developed to detect whether publication selection is present in a collection of studies and to adjust for its effects.

Objectives: We aim to develop a method that reduces the bias introduced by publication selection while preserving the accuracy of meta-analytic effect size estimates. Our research draws on two results from statistical theory. First, we rely on evidence that bias, rather than variance, is the dominant component of inaccuracy. Second, we take into account that studies with low statistical power are the most affected by publication bias. Combining these two results, we hypothesize that there are situations in which excluding low-powered studies from a meta-analysis will reduce the bias in effect size estimation without sacrificing accuracy.

Methods: Statistical theory implies that publication bias most seriously affects studies with low statistical power. Therefore, one approach to reducing publication bias is to remove low-powered studies. We simulate collections of studies that vary in study sample size and generate one observed effect size per study. Because power depends on the unknown true effect size, our method uses the maximum likelihood estimate (MLE) from a truncated, extreme selection model to obtain a preliminary estimate of power. This estimate is used to exclude from the meta-analysis any study with less than 80% power. We then use the unadjusted average of the adequately powered studies to estimate the true effect size.

Results: Simulations show that our method dramatically reduces bias in effect size estimation. Although it also reduces the number of studies used to produce an estimate, thereby increasing variance, the mean squared error (MSE) is nevertheless reduced. For example, in cases where 30-70% of the studies considered for a meta-analysis are low-powered, our method reduces bias by 70-85% while also cutting the MSE in half.

Conclusion: This research shows that low-powered studies contribute substantially to both bias and MSE when a meta-analysis is subject to publication selection. Removing studies that are estimated to be low-powered can all but eliminate publication bias while also reducing the MSE. This work serves both as a caution to researchers about the importance of conducting adequately powered studies and as a practical tool for meta-analysts to mitigate the effects of publication selection.
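The Methods passage outlines a concrete procedure: fit a truncated ("extreme selection") model by maximum likelihood to get a preliminary effect size estimate, compute each study's estimated power from it, drop studies below 80% power, and average the rest. The sketch below illustrates that pipeline for standardized mean differences under the simplifying assumptions that only results significant in the upper tail (two-sided alpha = .05) are observed and that each study has two equal-sized arms; it is an interpretation of the abstract, not the authors' implementation, and the function names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def truncated_mle(d_obs, n_per_arm):
    """Preliminary MLE of the true effect under extreme selection:
    each observed standardized effect d_i is modeled as a normal
    truncated below at its study's significance cutoff (assumption:
    only upper-tail-significant results are published)."""
    d_obs = np.asarray(d_obs, dtype=float)
    n_per_arm = np.asarray(n_per_arm, dtype=float)
    se = np.sqrt(2.0 / n_per_arm)          # approx SE of d, two equal arms
    cut = 1.96 * se                        # significance threshold on d scale

    def neg_loglik(delta):
        z = (d_obs - delta) / se
        # truncated-normal log density: log phi(z)/se - log P(d > cut)
        log_dens = norm.logpdf(z) - np.log(se) - norm.logsf((cut - delta) / se)
        return -np.sum(log_dens)

    res = minimize_scalar(neg_loglik, bounds=(-2.0, 2.0), method="bounded")
    return res.x

def power_filtered_estimate(d_obs, n_per_arm, power_cut=0.80):
    """Drop studies whose estimated power falls below power_cut, then
    return the unadjusted mean effect of the adequately powered studies."""
    d_obs = np.asarray(d_obs, dtype=float)
    n_per_arm = np.asarray(n_per_arm, dtype=float)
    delta_hat = truncated_mle(d_obs, n_per_arm)
    se = np.sqrt(2.0 / n_per_arm)
    # approximate power of a two-sided z-test, ignoring the lower tail
    power = norm.sf(1.96 - delta_hat / se)
    keep = power >= power_cut
    return d_obs[keep].mean() if keep.any() else d_obs.mean()
```

Because the preliminary estimate comes from the truncated likelihood rather than the naively inflated mean of published effects, the power screen can identify adequately powered studies even though the true effect size is unknown.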
Society for Research on Educational Effectiveness. 2040 Sheridan Road, Evanston, IL 60208. Tel: 202-495-0920; e-mail: contact@sree.org; Web site: https://www.sree.org/
Publication Type: Information Analyses; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Society for Research on Educational Effectiveness (SREE)
Grant or Contract Numbers: N/A
Author Affiliations: N/A

Peer reviewed
