Huynh, Huynh; Saunders, Joseph C. – Journal of Educational Measurement, 1980 (peer reviewed)
Single administration (beta-binomial) estimates for the raw agreement index p and the corrected-for-chance kappa index in mastery testing are compared with those based on two test administrations in terms of estimation bias and sampling variability. Bias is about 2.5 percent for p and 10 percent for kappa. (Author/RL)
Descriptors: Comparative Analysis, Error of Measurement, Mastery Tests, Mathematical Models
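The raw agreement index p and the corrected-for-chance kappa index that the abstract compares can be sketched as follows. This is a minimal illustration with hypothetical pass/fail decisions from two test administrations; it does not reproduce the single-administration (beta-binomial) estimator the paper actually studies.

```python
# Raw agreement p and chance-corrected kappa for two mastery classifications.
# Data are hypothetical, for illustration only.

def agreement_indices(first, second):
    """Return (p, kappa) for two parallel mastery decisions (1 = master, 0 = non-master)."""
    n = len(first)
    # Raw agreement: proportion of examinees classified the same way both times.
    p = sum(a == b for a, b in zip(first, second)) / n
    # Chance agreement implied by the marginal mastery rates of each administration.
    p1 = sum(first) / n
    p2 = sum(second) / n
    p_chance = p1 * p2 + (1 - p1) * (1 - p2)
    # Kappa rescales raw agreement so that chance-level agreement maps to 0.
    kappa = (p - p_chance) / (1 - p_chance)
    return p, kappa

first = [1, 1, 0, 1, 0, 1, 1, 0]
second = [1, 0, 0, 1, 0, 1, 1, 1]
p, kappa = agreement_indices(first, second)
```

Because kappa subtracts out chance agreement, its estimate is more sensitive to sampling error in the marginals, which is consistent with the larger bias the abstract reports for kappa than for p.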
Yen, Wendy M. – 1979
Three test-analysis models were used to analyze three types of simulated test score data plus the results of eight achievement tests. Chi-square goodness-of-fit statistics were used to evaluate the appropriateness of the models to the four kinds of data. Data were generated to simulate the responses of 1,000 students to 36 pseudo-items by…
Descriptors: Achievement Tests, Correlation, Goodness of Fit, Item Analysis
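A chi-square goodness-of-fit check of the kind described can be sketched as follows. The observed and model-expected score-group counts below are hypothetical; the report's three test-analysis models and simulated data are not reproduced here.

```python
# Pearson chi-square goodness-of-fit statistic comparing observed score-group
# frequencies with the frequencies a test-analysis model predicts.
# Counts are hypothetical, for illustration only.

def chi_square(observed, expected):
    """Pearson chi-square statistic: sum of (O - E)^2 / E over score groups."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

observed = [12, 45, 108, 160, 102, 48, 25]   # examinees observed per score group
expected = [15, 50, 100, 150, 110, 50, 25]   # counts the fitted model predicts
stat = chi_square(observed, expected)
# Compare stat against a chi-square critical value with
# df = (number of groups) - 1 - (number of fitted model parameters).
```

A larger statistic signals a worse fit, so the same computation applied to each model and data set gives a common scale for judging which model is appropriate for which kind of data.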


