Showing all 5 results
Peer reviewed
Garg, Rashmi; And Others – Journal of Educational Measurement, 1986
To obtain data for use in test development, multiple matrix sampling plans were compared with examinee sampling plans. Data were simulated for examinees, sampled from a population with a normal distribution of ability, responding to items selected from an item universe. (Author/LMO)
Descriptors: Difficulty Level, Monte Carlo Methods, Sampling, Statistical Studies
Rogers, H. Jane; Hambleton, Ronald K. – 1987
Though item bias statistics are widely recommended for use in test development and analysis, problems arise in their interpretation. This research evaluates logistic test models and computer simulation methods for providing a frame of reference for interpreting item bias statistics. Specifically, the intent was to produce simulated sampling…
Descriptors: Computer Simulation, Cutting Scores, Grade 9, Latent Trait Theory
Cook, Linda L.; Petersen, Nancy S. – 1986
This paper examines how various equating methods are affected by: (1) sampling error; (2) sample characteristics; and (3) characteristics of anchor test items. It reviews empirical studies that investigated the invariance of equating transformations, and it discusses empirical and simulation studies that focus on how the properties of anchor tests…
Descriptors: Educational Research, Equated Scores, Error of Measurement, Evaluation Methods
Farish, Stephen J. – 1984
The stability of Rasch test item difficulty parameters was investigated under varying conditions. Data were taken from a mathematics achievement test administered to over 2,000 Australian students. The experiments included: (1) relative stability of the Rasch, traditional, and z-item difficulty parameters using different sample sizes and designs;…
Descriptors: Achievement Tests, Difficulty Level, Estimation (Mathematics), Foreign Countries
Johnson, Eugene G.; And Others – 1994
The 1992 National Assessment of Educational Progress (NAEP) monitored the performance of students in American schools in reading, mathematics, science, and writing. The sample consisted of more than 145,000 public and private school students in grades 4, 8, 11, and 12. This technical report provides details of instrument development, sample…
Descriptors: Academic Achievement, Data Analysis, Data Collection, Educational Assessment