Descriptor
| Computer Simulation | 3 |
| Guessing (Tests) | 3 |
| Multiple Choice Tests | 3 |
| Comparative Analysis | 2 |
| Scoring Formulas | 2 |
| Difficulty Level | 1 |
| Essay Tests | 1 |
| Mathematical Models | 1 |
| Psychometrics | 1 |
| Statistical Analysis | 1 |
| Test Format | 1 |
Author
| Frary, Robert B. | 3 |
| Garcia-Perez, Miguel A. | 1 |
Publication Type
| Journal Articles | 3 |
| Reports - Research | 3 |
Peer reviewed: Frary, Robert B. – Journal of Educational Measurement, 1989
Responses to a 50-item, 4-choice test were simulated for 1,000 examinees under conventional formula-scoring instructions. Based on 192 simulation runs, formula scores and expected formula scores were determined for each examinee, both allowing and not allowing for inappropriate omissions. (TJH)
Descriptors: Computer Simulation, Difficulty Level, Guessing (Tests), Multiple Choice Tests
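For readers unfamiliar with the term, the conventional formula score referenced in this abstract is classically computed as R − W/(k−1), where R is the number right, W the number wrong, k the number of choices, and omissions score zero. A minimal sketch (the 4-choice value of k matches the abstract; the item key and responses below are invented purely for illustration):

```python
def formula_score(responses, key, k=4):
    """Conventional formula score: rights minus wrongs/(k-1).
    Omitted items (None) contribute nothing to the score."""
    rights = sum(1 for r, a in zip(responses, key) if r is not None and r == a)
    wrongs = sum(1 for r, a in zip(responses, key) if r is not None and r != a)
    return rights - wrongs / (k - 1)

# Hypothetical 5-item illustration: 3 right, 1 wrong, 1 omitted
key = ["A", "B", "C", "D", "A"]
responses = ["A", "B", "C", "A", None]
print(formula_score(responses, key))  # 3 - 1/3 ≈ 2.667
```

The W/(k−1) penalty is chosen so that blind random guessing has an expected contribution of zero, which is why omitting is the rational choice only when an examinee cannot eliminate any alternatives.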
Peer reviewed: Garcia-Perez, Miguel A.; Frary, Robert B. – Applied Psychological Measurement, 1989
Simulation techniques were used to generate conventional test responses and track the proportion of alternatives examinees could classify independently before and after taking the test. Finite-state scores were compared with these actual values and with number-correct and formula scores. Finite-state scores proved useful. (TJH)
Descriptors: Comparative Analysis, Computer Simulation, Guessing (Tests), Mathematical Models
Peer reviewed: Frary, Robert B. – Journal of Educational Measurement, 1985
Responses to a sample test were simulated for examinees under free-response and multiple-choice formats. Test score sets were correlated with randomly generated sets of unit-normal measures. The superiority of free-response tests was small enough that other considerations might justifiably dictate format choice. (Author/DWH)
Descriptors: Comparative Analysis, Computer Simulation, Essay Tests, Guessing (Tests)
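The methodology described in this last abstract can be sketched in miniature: draw a unit-normal latent ability for each simulated examinee, generate a free-response score (credit only for known items) and a multiple-choice score (known items plus successful random guesses), then compare each score's correlation with the latent measure. Every parameter below (item count, guessing rate, logistic knowledge model) is an invented illustration, not the study's actual design:

```python
import math
import random

random.seed(1)
n_items, k, n_examinees = 50, 4, 1000
ability, scores_fr, scores_mc = [], [], []

for _ in range(n_examinees):
    theta = random.gauss(0, 1)              # latent ability, unit normal
    p = 1 / (1 + math.exp(-theta))          # assumed logistic P(knows an item)
    knows = [random.random() < p for _ in range(n_items)]
    fr = sum(knows)                         # free response: credit only when known
    mc = sum(1 if kn or random.random() < 1 / k else 0 for kn in knows)  # adds guessing
    ability.append(theta)
    scores_fr.append(fr)
    scores_mc.append(mc)

def corr(x, y):
    """Pearson correlation, computed from scratch to stay dependency-free."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

print(corr(ability, scores_fr), corr(ability, scores_mc))
```

Under these toy assumptions both scores correlate strongly with ability, and the guessing noise in the multiple-choice score only modestly degrades that correlation, which is the qualitative pattern the abstract reports.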