Showing 1 to 15 of 19 results
S. Stanley Young; Warren Kindzierski; David Randall – National Association of Scholars, 2021
"Shifting Sands: Unsound Science and Unsafe Regulation" examines how irreproducible science affects select areas of government policy and regulation governed by different federal agencies. This first report, on "PM2.5 Regulation," focuses on irreproducible research in the field of environmental epidemiology, which…
Descriptors: Public Policy, Federal Regulation, Public Agencies, Epidemiology
Peer reviewed
Polanin, Joshua R.; Pigott, Terri D. – Research Synthesis Methods, 2015
Meta-analysis multiplicity, the practice of conducting multiple tests of statistical significance within one review, is an underdeveloped area of the literature. We address this issue by considering how Type I errors can impact meta-analytic results, suggest how statistical power may be affected by the use of multiplicity corrections, and propose how…
Descriptors: Meta Analysis, Statistical Significance, Error Patterns, Research Methodology
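The multiplicity problem this abstract raises can be illustrated with a minimal sketch: a Bonferroni correction divides the overall alpha by the number of tests performed, which is one of the simplest multiplicity corrections of the kind the article discusses. The p-values below are invented for illustration, not taken from the article.

```python
# Hypothetical p-values from four significance tests within one review.
p_values = [0.010, 0.040, 0.030, 0.200]
alpha = 0.05

# Bonferroni correction: divide alpha by the number of tests performed.
adjusted_alpha = alpha / len(p_values)

# Only tests at or below the adjusted threshold remain significant.
significant = [p <= adjusted_alpha for p in p_values]

print(adjusted_alpha)  # 0.0125
print(significant)     # [True, False, False, False]
```

Note how a test that clears the unadjusted 0.05 threshold (p = 0.04) fails the corrected one, which is exactly the Type I error control the abstract describes.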
Peer reviewed
Sahragard, Rahman; Yazdanpanahi, Solmaz – Online Submission, 2017
Engagement markers (hereafter, EMs) are crucial interpersonal devices for interacting with readers through texts. However, little is known about differences in EM use between Humanities and Science journal research articles (hereafter, RAs), or about changes in marker use over time. The present study provides a quantitative and…
Descriptors: Journal Articles, Humanities, Science Education, Comparative Analysis
Peer reviewed
Samson, Patricia L. – Journal of Social Work Education, 2016
In a meta-analytic review of critical thinking in social work education, findings revealed variability in research designs, methods, and subsequent findings. The 10 studies reviewed assessed different components of critical thinking and highlighted different potential moderator variables. Although there are significant limitations to all the…
Descriptors: Critical Thinking, Social Work, Meta Analysis, Research Design
Peer reviewed
Miller, Andrew – Journal of Teaching in Physical Education, 2015
The purpose of this systematic review was to investigate the weight of scientific evidence regarding student outcomes (physical, cognitive and affective) of a Game Centered Approach (GCA) when the quality of a study was taken into account in the interpretation of collective findings. A systematic search of five electronic databases (Sports…
Descriptors: Teaching Methods, Literature Reviews, Educational Games, Children
Peer reviewed
Manolov, Rumen; Solanas, Antonio – Psychological Methods, 2012
There is currently a considerable diversity of quantitative measures available for summarizing the results in single-case studies. Given that the interpretation of some of them is difficult due to the lack of established benchmarks, the current article proposes an approach for obtaining further numerical evidence on the importance of the results,…
Descriptors: Sampling, Probability, Statistical Significance, Case Studies
Peer reviewed
Kidron, Yael; Lindsay, Jim – Regional Educational Laboratory Appalachia, 2014
REL Appalachia conducted a systematic review of the research evidence on the effects of increased learning time. After screening more than 7,000 studies, REL Appalachia identified 30 that met the most rigorous standards for research. A review of those 30 studies found that increased learning time does not always produce positive results. However,…
Descriptors: Time Factors (Learning), Time on Task, Meta Analysis, Standards
Hetrick, Sam – 1999
Magnitude of effect (ME) statistics are an important alternative to statistical significance. Why methodologists encourage the use of ME indices as interpretation aids is explained, and different types of ME statistics are discussed. The basic concepts underlying effect size measures are reviewed, and how to compute them from published reports…
Descriptors: Computation, Effect Size, Meta Analysis, Research Methodology
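As a companion to abstracts like this one on computing magnitude-of-effect statistics from published reports, here is a minimal sketch (not taken from Hetrick's paper) of one common effect size measure, Cohen's d, computed from the group means, standard deviations, and sample sizes that published reports typically supply. The summary statistics are invented.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_var = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var)

# Invented summary statistics of the kind reported in published studies.
d = cohens_d(105.0, 15.0, 30, 100.0, 15.0, 30)
print(round(d, 3))  # 0.333
```

A d of 0.33 conveys the practical size of the difference directly, which is the interpretive aid such abstracts advocate over a bare significance test.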
Becker, Betsy Jane – 1984
Power is an indicator of the ability of a statistical analysis to detect a phenomenon that does in fact exist. The issue of power is crucial for social science research because the effects and relationships studied tend to be small, and the power of a study relates directly to the size of the effect of interest and the sample size.…
Descriptors: Effect Size, Hypothesis Testing, Meta Analysis, Power (Statistics)
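The relationship the abstract describes, power rising with both effect size and sample size, can be sketched with a standard normal-approximation power calculation for a two-sided, two-sample test. This is a textbook illustration, not Becker's own analysis, and the effect sizes and group sizes are invented.

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def approx_power(d, n_per_group, z_crit=1.96):
    """Approximate power of a two-sided two-sample test for effect size d."""
    noncentrality = d * math.sqrt(n_per_group / 2.0)
    return 1.0 - normal_cdf(z_crit - noncentrality)

# Power grows with both effect size and sample size.
print(round(approx_power(0.2, 64), 2))  # small effect, modest n: low power
print(round(approx_power(0.5, 64), 2))  # medium effect, same n: power near .8
```

With the small effects typical of social science (d around 0.2), a study of 64 participants per group detects the effect only about one time in five, which is the problem the abstract flags.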
Peer reviewed
Schmidt, Frank; Hunter, John E. – Evaluation and the Health Professions, 1995
It is argued that point estimates of effect sizes and confidence intervals around these point estimates are more appropriate statistics for individual studies than reliance on statistical significance testing and that meta-analysis is appropriate for analysis of data from multiple studies. (SLD)
Descriptors: Effect Size, Estimation (Mathematics), Knowledge Level, Meta Analysis
Nix, Thomas W.; Barnette, J. Jackson – Research in the Schools, 1998
Reviews null hypothesis statistical significance testing (NHST) in its historical context and concludes that workable alternatives to NHST are available. Among suggested alternatives, effect magnitude measures, replication techniques, and meta-analytic techniques are discussed. (SLD)
Descriptors: Educational Research, Effect Size, Hypothesis Testing, Meta Analysis
Rosenthal, Robert – 1989
An overview of the state of the art in psychological research is presented, with an emphasis on the attention given to effect sizes. The acceptance of small effect sizes for biomedical research is contrasted with the rejection of similar effect sizes for psychological research. The Binomial Effect Size Display is used to depict the practical…
Descriptors: Effect Size, Mathematical Models, Meta Analysis, Psychological Studies
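The Binomial Effect Size Display mentioned in the abstract converts a correlation r into a pair of illustrative "success rates" of 0.50 ± r/2. A small sketch (the r value here is invented, not from the paper):

```python
def besd(r):
    """Binomial Effect Size Display: success rates of 0.5 + r/2 and 0.5 - r/2."""
    return 0.5 + r / 2.0, 0.5 - r / 2.0

# Even a "small" correlation of r = 0.32 implies a 66% vs 34% contrast
# in success rates between treatment and control groups.
treatment_rate, control_rate = besd(0.32)
print(round(treatment_rate, 2), round(control_rate, 2))  # 0.66 0.34
```

This is the device Rosenthal uses to show that effect sizes dismissed as small in psychology can correspond to practically large differences.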
Brewer, Robert A. – Online Submission, 2007
This manuscript examines the practical differences between quantitative and qualitative inquiry by comparing the differences between one article from each paradigm. Quantitative research differs greatly from qualitative inquiry in purpose, assumptions, methodology, and representation. While quantitative research has been the dominant paradigm for…
Descriptors: Statistical Analysis, Research Methodology, Qualitative Research, Models
Peer reviewed
Murray, Leigh W.; Dosser, David A., Jr. – Journal of Counseling Psychology, 1987
The use of measures of magnitude of effect has been advocated as a way to go beyond statistical tests of significance and to identify effects of a practical size. They have been used in meta-analysis to combine results of different studies. Describes problems associated with measures of magnitude of effect (particularly study size) and…
Descriptors: Effect Size, Meta Analysis, Research Design, Research Methodology
Peer reviewed
Saner, Hilary – Psychometrika, 1994
The use of p-values in combining results of studies often involves studies that are potentially aberrant. This paper proposes a combined test that permits trimming some of the extreme p-values. The trimmed statistic is based on an inverse cumulative normal transformation of the ordered p-values. (SLD)
Descriptors: Effect Size, Meta Analysis, Research Methodology, Sample Size
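A minimal sketch in the spirit of the trimmed statistic described above (an illustration, not Saner's exact formulation): order the p-values, drop the g most extreme at each end, transform the remainder by the inverse standard normal CDF, and combine them Stouffer-style. The p-values are invented.

```python
import math
from statistics import NormalDist

def trimmed_inverse_normal(p_values, g=1):
    """Combine p-values via an inverse-normal (Stouffer-type) statistic
    after trimming the g smallest and g largest ordered p-values."""
    nd = NormalDist()
    ordered = sorted(p_values)
    trimmed = ordered[g:len(ordered) - g]
    z_scores = [nd.inv_cdf(1.0 - p) for p in trimmed]
    z = sum(z_scores) / math.sqrt(len(trimmed))
    return 1.0 - nd.cdf(z)  # combined one-sided p-value

# A potentially aberrant study (p = 0.90) is trimmed away and no longer
# dominates the combined result.
combined = trimmed_inverse_normal([0.001, 0.02, 0.03, 0.04, 0.90], g=1)
print(combined < 0.05)  # True
```

Trimming at both ends guards against aberrant studies in either direction, which is the motivation the abstract gives for the method.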