Publication Date
| Range | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 1 |
| Since 2017 (last 10 years) | 2 |
| Since 2007 (last 20 years) | 9 |
Descriptor
| Descriptor | Count |
| --- | --- |
| Monte Carlo Methods | 17 |
| Research Design | 17 |
| Simulation | 17 |
| Educational Research | 5 |
| Research Methodology | 5 |
| Correlation | 4 |
| Power (Statistics) | 4 |
| Analysis of Covariance | 3 |
| Computation | 3 |
| Data Analysis | 3 |
| Effect Size | 3 |
Author
| Author | Count |
| --- | --- |
| Onghena, Patrick | 4 |
| Manolov, Rumen | 2 |
| Solanas, Antonio | 2 |
| Baldwin, Lee | 1 |
| Bandalos, Deborah L. | 1 |
| Barcikowski, Robert S. | 1 |
| Bell, Stephen H. | 1 |
| Ben Kelcey | 1 |
| Beretvas, S. Natasha | 1 |
| Bollen, Kenneth A. | 1 |
| Bulte, Isis | 1 |
Publication Type
| Publication Type | Count |
| --- | --- |
| Journal Articles | 15 |
| Reports - Research | 7 |
| Reports - Evaluative | 6 |
| Reports - Descriptive | 3 |
| Speeches/Meeting Papers | 1 |
Audience
| Audience | Count |
| --- | --- |
| Researchers | 1 |
Kyle Cox; Ben Kelcey; Hannah Luce – Journal of Experimental Education, 2024
Comprehensive evaluation of treatment effects is aided by consideration of moderated effects. In educational research, the combination of natural hierarchical structures and prevalence of group-administered or shared facilitator treatments often produces three-level partially nested data structures. Literature details planning strategies for a…
Descriptors: Randomized Controlled Trials, Monte Carlo Methods, Hierarchical Linear Modeling, Educational Research
Heyvaert, Mieke; Moeyaert, Mariola; Verkempynck, Paul; Van den Noortgate, Wim; Vervloet, Marlies; Ugille, Maaike; Onghena, Patrick – Journal of Experimental Education, 2017
This article reports on a Monte Carlo simulation study, evaluating two approaches for testing the intervention effect in replicated randomized AB designs: two-level hierarchical linear modeling (HLM) and using the additive method to combine randomization test "p" values (RTcombiP). Four factors were manipulated: mean intervention effect,…
Descriptors: Monte Carlo Methods, Simulation, Intervention, Replication (Evaluation)
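The additive method behind RTcombiP sums the p values from the replicated randomization tests and refers that sum to the distribution of a sum of independent Uniform(0,1) variables (the Irwin-Hall distribution). A minimal sketch of that combination rule, assuming k independent one-sided p values; the function name and example values are illustrative, not taken from the study:
```python
from math import comb, factorial, floor

def additive_combined_p(p_values):
    """Edgington's additive method: sum the k independent p values and
    refer the sum to the Irwin-Hall CDF (the distribution of a sum of
    k Uniform(0,1) variables)."""
    k = len(p_values)
    s = sum(p_values)
    return sum((-1) ** j * comb(k, j) * (s - j) ** k
               for j in range(floor(s) + 1)) / factorial(k)

# e.g. p values from three replicated AB-design randomization tests
print(additive_combined_p([0.04, 0.10, 0.20]))
```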
Schoemann, Alexander M.; Miller, Patrick; Pornprasertmanit, Sunthud; Wu, Wei – International Journal of Behavioral Development, 2014
Planned missing data designs allow researchers to increase the amount and quality of data collected in a single study. Unfortunately, the effect of planned missing data designs on power is not straightforward. Under certain conditions using a planned missing design will increase power, whereas in other situations using a planned missing design…
Descriptors: Monte Carlo Methods, Simulation, Sample Size, Research Design
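The abstract's point that planned missingness can raise or lower power is usually checked by simulation: generate data under an assumed effect, impose the planned missingness, run the test, and count rejections. A toy sketch under strong simplifying assumptions (two-group mean difference, missingness completely at random); all parameter values are illustrative rather than drawn from the study:
```python
import numpy as np
from scipy import stats

def monte_carlo_power(effect=0.3, n=100, missing_frac=0.3,
                      reps=5000, alpha=0.05, seed=1):
    """Toy power estimate for a two-group mean difference when a fixed
    fraction of each group is deleted completely at random, standing in
    (very roughly) for a planned missing data design."""
    rng = np.random.default_rng(seed)
    hits = 0
    keep = int(n * (1 - missing_frac))
    for _ in range(reps):
        y0 = rng.normal(0.0, 1.0, n)
        y1 = rng.normal(effect, 1.0, n)
        # planned missingness: retain a random subset of each group
        y0 = rng.choice(y0, size=keep, replace=False)
        y1 = rng.choice(y1, size=keep, replace=False)
        if stats.ttest_ind(y0, y1).pvalue < alpha:
            hits += 1
    return hits / reps

print(monte_carlo_power())
```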
Litaker, E. T.; Machacek, J. R.; Gay, T. J. – European Journal of Physics, 2011
We present a Monte Carlo simulation of a cylindrical luminescent volume and a typical lens-detector system. The results of this simulation yield a graphically simple picture of the regions within the cylindrical volume from which this system detects light. Because the cylindrical volume permits large angles of incidence, we use a modification of…
Descriptors: Research Design, Monte Carlo Methods, Optics, Computation
Harvill, Eleanor L.; Peck, Laura R.; Bell, Stephen H. – American Journal of Evaluation, 2013
Using exogenous characteristics to identify endogenous subgroups, the approach discussed in this method note creates symmetric subsets within treatment and control groups, allowing the analysis to take advantage of an experimental design. In order to maintain treatment-control symmetry, however, prior work has posited that it is necessary to use…
Descriptors: Experimental Groups, Control Groups, Research Design, Sampling
Dong, Nianbo – Society for Research on Educational Effectiveness, 2011
The purpose of this study is through Monte Carlo simulation to compare several propensity score methods in approximating factorial experimental design and identify best approaches in reducing bias and mean square error of parameter estimates of the main and interaction effects of two factors. Previous studies focused more on unbiased estimates of…
Descriptors: Research Design, Probability, Monte Carlo Methods, Simulation
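The two evaluation criteria mentioned here, bias and mean square error of the parameter estimates, reduce to simple summaries over the Monte Carlo replications. A minimal sketch; the simulated estimates and true value are illustrative, not from the study:
```python
import numpy as np

def bias_and_mse(estimates, true_value):
    """Bias and mean square error of a parameter estimate across
    Monte Carlo replications."""
    est = np.asarray(estimates, dtype=float)
    bias = est.mean() - true_value
    mse = np.mean((est - true_value) ** 2)
    return bias, mse

# e.g. 10,000 simulated estimates of a main effect whose true value is 0.5
rng = np.random.default_rng(0)
print(bias_and_mse(rng.normal(0.52, 0.1, 10_000), true_value=0.5))
```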
Manolov, Rumen; Solanas, Antonio; Bulte, Isis; Onghena, Patrick – Journal of Experimental Education, 2010
This study deals with the statistical properties of a randomization test applied to an ABAB design in cases where the desirable random assignment of the points of change in phase is not possible. To obtain information about each possible data division, the authors carried out a conditional Monte Carlo simulation with 100,000 samples for each…
Descriptors: Monte Carlo Methods, Effect Size, Simulation, Evaluation Methods
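The study concerns randomization tests for ABAB phase designs; the logic is easiest to see in the simpler AB case, where the statistic at the observed phase-change point is compared against the statistic at admissible alternative points. A hedged sketch that samples assignments by Monte Carlo rather than enumerating them; the series, minimum phase length, and start point are illustrative:
```python
import numpy as np

def ab_randomization_test(y, start_obs, min_phase=3, reps=10_000, seed=0):
    """Monte Carlo randomization test for a single-case AB design: the
    B-minus-A mean difference at the observed start of phase B is compared
    with the same statistic at randomly drawn admissible start points."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    n = len(y)

    def stat(start):
        return y[start:].mean() - y[:start].mean()

    observed = stat(start_obs)
    admissible = np.arange(min_phase, n - min_phase + 1)
    draws = rng.choice(admissible, size=reps, replace=True)
    extreme = sum(stat(s) >= observed for s in draws)
    return (extreme + 1) / (reps + 1)

# toy series with an apparent level increase after observation 10
y = [2, 3, 2, 3, 2, 3, 2, 3, 2, 3, 6, 7, 6, 7, 6, 7, 6, 7]
print(ab_randomization_test(y, start_obs=10))
```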
Solanas, Antonio; Manolov, Rumen; Onghena, Patrick – Behavior Modification, 2010
The current study proposes a new procedure for separately estimating slope change and level change between two adjacent phases in single-case designs. The procedure eliminates baseline trend from the whole data series before assessing treatment effectiveness. The steps necessary to obtain the estimates are presented in detail, explained, and…
Descriptors: Simulation, Computation, Models, Behavioral Science Research
Paxton, Pamela; Curran, Patrick J.; Bollen, Kenneth A.; Kirby, Jim; Chen, Feinian – Structural Equation Modeling, 2001 (peer reviewed)
Illustrates the design and planning of Monte Carlo simulations, presenting nine steps in planning and performing a Monte Carlo analysis from developing a theoretically derived question of interest through summarizing the results. Uses a Monte Carlo simulation to illustrate many of the relevant points. (SLD)
Descriptors: Monte Carlo Methods, Research Design, Simulation, Statistical Analysis
Barcikowski, Robert S.; Elliott, Ronald S. – 1997
Research was conducted to provide educational researchers with a choice of pairwise multiple comparison procedures (P-MCPs) to use with single group repeated measures designs. The following were studied through two Monte Carlo (MC) simulations: (1) The T procedure of J. W. Tukey (1953); (2) a modification of Tukey's T (G. Keppel, 1973); (3) the…
Descriptors: Comparative Analysis, Educational Research, Monte Carlo Methods, Research Design
Hutchinson, Susan R.; Bandalos, Deborah L. – Journal of Vocational Education Research, 1997 (peer reviewed)
Describes Monte Carlo simulation studies and their application in vocational education research. Explains study design and analysis as well as use and evaluation of results. (SK)
Descriptors: Monte Carlo Methods, Research Design, Research Utilization, Simulation
Rheinheimer, David C.; Penfield, Douglas A. – Journal of Experimental Education, 2001 (peer reviewed)
Studied, through Monte Carlo simulation, the conditions for which analysis of covariance (ANCOVA) does not maintain adequate Type I error rates and power and evaluated some alternative tests. Discusses differences in ANCOVA robustness for balanced and unbalanced designs. (SLD)
Descriptors: Analysis of Covariance, Monte Carlo Methods, Power (Statistics), Research Design
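Checking whether ANCOVA "maintains adequate Type I error rates" amounts to simulating data with no treatment effect and counting how often the treatment term is declared significant. A minimal sketch under idealized assumptions (normal errors, equal slopes, balanced groups); all parameter values are illustrative:
```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def ancova_type1_rate(n_per_group=20, reps=2000, alpha=0.05, seed=2):
    """Empirical Type I error rate of the ANCOVA treatment test when the
    null hypothesis is true: the outcome depends on the covariate but not
    on group, so rejection rates well above alpha signal inflation."""
    rng = np.random.default_rng(seed)
    rejections = 0
    group = np.repeat([0, 1], n_per_group)
    for _ in range(reps):
        x = rng.normal(size=2 * n_per_group)
        y = 0.5 * x + rng.normal(size=2 * n_per_group)  # no treatment effect
        data = pd.DataFrame({"y": y, "x": x, "group": group})
        fit = smf.ols("y ~ x + C(group)", data=data).fit()
        if fit.pvalues["C(group)[T.1]"] < alpha:
            rejections += 1
    return rejections / reps

print(ancova_type1_rate())
```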
Klockars, Alan J.; Beretvas, S. Natasha – Journal of Experimental Education, 2001 (peer reviewed)
Compared the Type I error rate and the power to detect differences in slopes and additive treatment effects of analysis of covariance (ANCOVA) and randomized block designs through a Monte Carlo simulation. Results show that the more powerful option in almost all simulations for tests of both slope and means was ANCOVA. (SLD)
Descriptors: Analysis of Covariance, Monte Carlo Methods, Power (Statistics), Research Design
Sawilowsky, Shlomo; And Others – Journal of Experimental Education, 1994 (peer reviewed)
A Monte Carlo study considers the use of meta-analysis with the Solomon four-group design. Experiment-wise Type I error properties and the relative power properties of Stouffer's Z in the Solomon four-group design are explored. Obstacles to conducting meta-analysis in the Solomon design are discussed. (SLD)
Descriptors: Meta Analysis, Monte Carlo Methods, Power (Statistics), Research Design
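Stouffer's Z, the combining statistic examined here, converts each one-sided p value to a z score, sums them, and divides by the square root of the number of tests. A minimal sketch; the four p values are illustrative, not from the study:
```python
from math import sqrt
from statistics import NormalDist

def stouffer_z(p_values):
    """Stouffer's Z: convert one-sided p values to z scores, sum them,
    and divide by sqrt(k); return the combined z and its one-sided p."""
    nd = NormalDist()
    z = sum(nd.inv_cdf(1 - p) for p in p_values) / sqrt(len(p_values))
    return z, 1 - nd.cdf(z)

# e.g. one-sided p values from the four cells of a Solomon design
print(stouffer_z([0.03, 0.20, 0.08, 0.15]))
```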
Wang, Zhongmiao; Thompson, Bruce – Journal of Experimental Education, 2007
In this study the authors investigated the use of five R² correction formulas (Claudy, Ezekiel, Olkin-Pratt, Pratt, and Smith) with the Pearson r². The authors estimated adjustment bias and precision under 6 x 3 x 6 conditions (i.e., population ρ values of 0.0, 0.1, 0.3, 0.5, 0.7, and 0.9; population shapes normal, skewness…
Descriptors: Effect Size, Correlation, Mathematical Formulas, Monte Carlo Methods
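Of the five corrections compared, the Ezekiel (Wherry) adjustment is the most familiar: it shrinks the sample R² according to sample size and number of predictors. A minimal sketch of that one formula; the example values are illustrative, and the other four formulas use different shrinkage rules not shown here:
```python
def ezekiel_adjusted_r2(r2, n, k):
    """Ezekiel (Wherry) correction: shrink the sample R^2 toward its
    expected population value given n observations and k predictors.
    For the bivariate Pearson r^2 discussed in the abstract, k = 1."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)

# e.g. a sample r^2 of .09 (r = .3) from n = 30 cases and one predictor
print(ezekiel_adjusted_r2(0.09, n=30, k=1))
```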
