Showing all 7 results
Flournoy, Nancy – 1989
Designs for sequential sampling procedures that adapt to cumulative information are discussed. A familiar illustration is the play-the-winner rule in which there are two treatments; after a random start, the same treatment is continued as long as each successive subject registers a success. When a failure occurs, the other treatment is used until…
Descriptors: Algorithms, Evaluation Methods, Mathematical Models, Research Design
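Below is a minimal simulation sketch of the play-the-winner rule described in the Flournoy abstract above. The two treatment labels and their success probabilities are invented inputs for illustration; only the rule itself (random start, stay with a treatment after each success, switch after a failure) is taken from the abstract.

    import random

    def play_the_winner(success_prob, n_subjects, seed=0):
        """Assign subjects to two treatments by the play-the-winner rule.

        success_prob: dict mapping each treatment label to a hypothetical
        success probability. Returns a list of (treatment, success) pairs.
        """
        rng = random.Random(seed)
        current = rng.choice(sorted(success_prob))   # random start
        history = []
        for _ in range(n_subjects):
            success = rng.random() < success_prob[current]
            history.append((current, success))
            if not success:                          # switch only after a failure
                current, = set(success_prob) - {current}
        return history

    # Illustrative run with invented success rates
    outcomes = play_the_winner({"A": 0.7, "B": 0.4}, n_subjects=100)
    print("share assigned to A:",
          sum(t == "A" for t, _ in outcomes) / len(outcomes))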
Cardinet, Jean; Allal, Linda – New Directions for Testing and Measurement, 1983
A general framework for conducting generalizability analyses is presented. Generalizability theory is extended to situations in which the objects of measurement are not persons but other factors, such as instructional objectives, stages of learning, and treatments. (Author/PN)
Descriptors: Algorithms, Analysis of Variance, Estimation (Mathematics), Mathematical Formulas
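As a concrete anchor for the framework, the sketch below runs a standard one-facet generalizability analysis (objects of measurement crossed with a single facet) on synthetic data, estimating variance components from ANOVA mean squares and then a generalizability coefficient. The symmetry point of the chapter is that the rows here need not be persons; they could just as well be instructional objectives, stages of learning, or treatments. The data, design sizes, and variance values are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    n_obj, n_cond = 30, 8                        # objects of measurement x facet conditions
    X = (rng.normal(0, 1.0, (n_obj, 1))          # object effects
         + rng.normal(0, 0.5, (1, n_cond))       # condition effects
         + rng.normal(0, 0.8, (n_obj, n_cond)))  # residual (interaction + error)

    grand = X.mean()
    ss_obj = n_cond * ((X.mean(axis=1) - grand) ** 2).sum()
    ss_cond = n_obj * ((X.mean(axis=0) - grand) ** 2).sum()
    ss_res = ((X - grand) ** 2).sum() - ss_obj - ss_cond

    ms_obj = ss_obj / (n_obj - 1)
    ms_cond = ss_cond / (n_cond - 1)
    ms_res = ss_res / ((n_obj - 1) * (n_cond - 1))

    var_res = ms_res                              # residual component
    var_obj = (ms_obj - ms_res) / n_cond          # object-of-measurement component
    var_cond = (ms_cond - ms_res) / n_obj         # facet component
    g_coef = var_obj / (var_obj + var_res / n_cond)   # relative-decision coefficient
    print(f"variance components: obj={var_obj:.3f} cond={var_cond:.3f} res={var_res:.3f}")
    print(f"generalizability coefficient: {g_coef:.3f}")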
Peer reviewed
Albert, James H. – Journal of Educational Statistics, 1992
Estimating item parameters from a two-parameter normal ogive model is considered using Gibbs sampling to simulate draws from the joint posterior distribution of ability and item parameters. The method gives marginal posterior density estimates for any parameter of interest, as illustrated using data from a 33-item mathematics placement…
Descriptors: Algorithms, Bayesian Statistics, Equations (Mathematics), Estimation (Mathematics)
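The sketch below is a stripped-down data-augmentation Gibbs sampler for the two-parameter normal ogive model, P(Y_ij = 1) = Phi(a_j*theta_i - b_j), in the spirit of the abstract. The priors (standard normal abilities, flat priors on item parameters), the absence of a positivity constraint on the discriminations, and the synthetic data are simplifying assumptions, not details taken from the article.

    import numpy as np
    from scipy.stats import norm, truncnorm

    def gibbs_2pno(Y, n_iter=1000, seed=0):
        """Data-augmentation Gibbs sampler for the two-parameter normal ogive model."""
        rng = np.random.default_rng(seed)
        n, J = Y.shape
        theta, a, b = rng.standard_normal(n), np.ones(J), np.zeros(J)
        draws = []
        for _ in range(n_iter):
            # 1. Latent responses Z_ij ~ N(a_j*theta_i - b_j, 1), truncated by Y_ij.
            mu = np.outer(theta, a) - b
            lo = np.where(Y == 1, -mu, -np.inf)     # standardized truncation bounds
            hi = np.where(Y == 1, np.inf, -mu)
            Z = mu + truncnorm.rvs(lo, hi, size=Y.shape, random_state=rng)
            # 2. Abilities: normal posterior given Z and the item parameters.
            prec = 1.0 + np.sum(a ** 2)
            theta = (Z + b) @ a / prec + rng.standard_normal(n) / np.sqrt(prec)
            # 3. Item parameters: regression of Z_j on (theta, -1) with unit error variance.
            X = np.column_stack([theta, -np.ones(n)])
            XtX_inv = np.linalg.inv(X.T @ X)
            chol = np.linalg.cholesky(XtX_inv)
            for j in range(J):
                beta_hat = XtX_inv @ (X.T @ Z[:, j])
                a[j], b[j] = beta_hat + chol @ rng.standard_normal(2)
            draws.append((theta.copy(), a.copy(), b.copy()))
        return draws

    # Quick check on synthetic responses (sizes and parameter ranges are invented).
    rng0 = np.random.default_rng(1)
    true_a, true_b = rng0.uniform(0.5, 1.5, 20), rng0.normal(0.0, 1.0, 20)
    abil = rng0.standard_normal(500)
    Y = (rng0.random((500, 20)) < norm.cdf(np.outer(abil, true_a) - true_b)).astype(int)
    draws = gibbs_2pno(Y, n_iter=500)
    a_mean = np.mean([d[1] for d in draws[250:]], axis=0)   # posterior mean discriminations
    print(np.round(a_mean[:5], 2))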
Peer reviewed
Freedman, David A.; And Others – Evaluation Review, 1993
Techniques for adjusting census figures are discussed, with a focus on sampling error, the uncertainty in estimates that results from the luck of sample choice. Computer simulations illustrate the ways in which the smoothing algorithm may make adjustments less, rather than more, accurate. (SLD)
Descriptors: Algorithms, Census Figures, Computer Simulation, Error of Measurement
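The toy simulation below is not a reconstruction of the article's computer simulations; it only illustrates the kind of comparison at stake, using invented undercount rates and noise levels. Direct (sample-based) estimates are compared with estimates shrunken toward the overall mean in two scenarios, one where local areas truly differ and one where they are nearly alike; shrinkage of this kind tends to pay off only in the latter case.

    import numpy as np

    def rmse(est, truth):
        return float(np.sqrt(np.mean((est - truth) ** 2)))

    rng = np.random.default_rng(1)
    n_areas, noise_sd = 200, 0.02            # hypothetical local areas and sampling noise

    for label, spread in [("areas truly differ", 0.05), ("areas nearly alike", 0.002)]:
        truth = 0.02 + rng.normal(0, spread, n_areas)      # invented true undercount rates
        direct = truth + rng.normal(0, noise_sd, n_areas)  # noisy sample-based adjustments
        smoothed = 0.5 * direct + 0.5 * direct.mean()      # naive shrinkage toward the mean
        print(label,
              "| direct RMSE:", round(rmse(direct, truth), 4),
              "| smoothed RMSE:", round(rmse(smoothed, truth), 4))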
Peer reviewed
Cudeck, Robert; Browne, Michael W. – Psychometrika, 1992
A method is proposed for constructing a population covariance matrix as the sum of a particular model plus a nonstochastic residual matrix, with the stipulation that the model holds with a prespecified lack of fit. The procedure is considered promising for Monte Carlo studies. (SLD)
Descriptors: Algorithms, Equations (Mathematics), Estimation (Mathematics), Factor Analysis
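The sketch below conveys the flavor of the idea rather than the authors' actual construction: it takes a model-implied covariance matrix, adds a scaled residual matrix, and chooses the scale by bisection so that the maximum-likelihood discrepancy between the resulting population matrix and the model equals a prespecified value. The one-factor model, the residual direction, and the target are all invented, and unlike the published procedure this sketch does not verify that the original parameter values remain the discrepancy minimizer.

    import numpy as np

    def ml_discrepancy(S, Sigma):
        """ML discrepancy F = ln|Sigma| - ln|S| + tr(S Sigma^{-1}) - p."""
        p = S.shape[0]
        _, logdet_Sigma = np.linalg.slogdet(Sigma)
        _, logdet_S = np.linalg.slogdet(S)
        return float(logdet_Sigma - logdet_S + np.trace(S @ np.linalg.inv(Sigma)) - p)

    def discrepancy_at(k, Sigma_model, E):
        S = Sigma_model + k * E
        if np.min(np.linalg.eigvalsh(S)) <= 0:   # not positive definite: treat as too far
            return np.inf
        return ml_discrepancy(S, Sigma_model)

    def add_misfit(Sigma_model, E, target, tol=1e-10):
        """Return Sigma_model + k*E whose ML discrepancy from the model is ~target."""
        lo, hi = 0.0, 1e-3
        while discrepancy_at(hi, Sigma_model, E) < target:
            hi *= 2.0
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if discrepancy_at(mid, Sigma_model, E) < target:
                lo = mid
            else:
                hi = mid
        return Sigma_model + lo * E

    # Illustrative one-factor model-implied covariance plus an arbitrary residual direction.
    lam = np.array([0.8, 0.7, 0.6, 0.5])
    Sigma_model = np.outer(lam, lam) + np.diag(1.0 - lam ** 2)
    E = np.zeros((4, 4)); E[0, 1] = E[1, 0] = 1.0
    Sigma_pop = add_misfit(Sigma_model, E, target=0.05)
    print(round(ml_discrepancy(Sigma_pop, Sigma_model), 6))   # close to the 0.05 target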
Mislevy, Robert J. – 1985
A method for drawing inferences from complex samples is based on Rubin's approach to missing data in survey research. Standard procedures for drawing such inferences do not apply when the variables of interest are not observed directly, but must be inferred from secondary random variables which depend on the variables of interest stochastically.…
Descriptors: Algorithms, Data Interpretation, Estimation (Mathematics), Latent Trait Theory
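A minimal sketch of the multiple-imputation idea the abstract builds on, with every detail of the measurement model invented for illustration: the variable of interest theta is latent, only a noisy indicator x is observed, several "plausible values" of theta are drawn from its posterior for each case, and the resulting estimates are combined with Rubin's rules.

    import numpy as np

    rng = np.random.default_rng(2)
    n, m_imputations, sigma_e = 400, 20, 0.8

    # Invented data: latent theta observed only through a noisy indicator x.
    theta = rng.normal(0.3, 1.0, n)
    x = theta + rng.normal(0.0, sigma_e, n)

    # Provisional population distribution for theta, estimated from the indicator.
    mu_hat = x.mean()
    tau2_hat = max(x.var(ddof=1) - sigma_e ** 2, 1e-6)

    # Posterior of each theta_i given x_i (normal-normal conjugacy), then plausible values.
    post_var = 1.0 / (1.0 / tau2_hat + 1.0 / sigma_e ** 2)
    post_mean = post_var * (mu_hat / tau2_hat + x / sigma_e ** 2)

    estimates, variances = [], []
    for _ in range(m_imputations):
        pv = post_mean + np.sqrt(post_var) * rng.standard_normal(n)
        estimates.append(pv.mean())            # statistic of interest: population mean
        variances.append(pv.var(ddof=1) / n)   # its sampling variance within one imputation

    # Rubin's rules: combine within- and between-imputation variance.
    q_bar = np.mean(estimates)
    u_bar = np.mean(variances)
    b = np.var(estimates, ddof=1)
    total_var = u_bar + (1 + 1 / m_imputations) * b
    print(f"estimate {q_bar:.3f}, standard error {np.sqrt(total_var):.3f}")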
Peer reviewed
Raudenbush, Stephen W.; And Others – Journal of Educational Statistics, 1991
A three-level multivariate statistical modeling strategy is presented that resolves the question of whether the unit of analysis should be the teacher or the student. A reanalysis of U.S. high school data (51 Catholic and 59 public schools from the High School and Beyond survey) illustrates the model. (SLD)
Descriptors: Algorithms, Catholic Schools, Educational Environment, Equations (Mathematics)
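As orientation only, here is a generic unconditional three-level specification of the kind such a strategy builds on, with students (i) nested in teachers or classes (j) nested in schools (k); the paper's actual model is multivariate and includes covariates, so these equations are a simplified stand-in, not its full form.

    Level 1 (students): Y_{ijk} = \pi_{0jk} + e_{ijk}, \quad e_{ijk} \sim N(0, \sigma^2)
    Level 2 (teachers): \pi_{0jk} = \beta_{00k} + r_{0jk}, \quad r_{0jk} \sim N(0, \tau_\pi)
    Level 3 (schools):  \beta_{00k} = \gamma_{000} + u_{00k}, \quad u_{00k} \sim N(0, \tau_\beta)

Choosing the teacher or the student as the unit of analysis amounts to collapsing one of these levels; a three-level model keeps all three variance components (sigma^2, tau_pi, tau_beta) in a single analysis.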