| Descriptor | Count |
| --- | --- |
| Item Analysis | 3 |
| Research Reports | 3 |
| Scoring Formulas | 3 |
| Test Validity | 3 |
| Measurement Techniques | 2 |
| Multiple Choice Tests | 2 |
| Performance Criteria | 2 |
| Scoring | 2 |
| Standardized Tests | 2 |
| Tables (Data) | 2 |
| Test Construction | 2 |
| Publication Type | Count |
| --- | --- |
| Reports - Research | 1 |
| Speeches/Meeting Papers | 1 |
Echternacht, Gary – 1973
Estimates of the variance of empirically determined scoring weights are given. It is shown that, when this type of scoring is used, test item writers should write distractors that discriminate on the criterion variable. (Author)
Descriptors: Item Analysis, Measurement Techniques, Multiple Choice Tests, Performance Criteria
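For context, empirical option weighting typically derives a weight for each response option from a criterion variable; a common formulation (an assumption here, not necessarily the estimator analyzed in this report) sets the weight of option $j$ to the mean criterion score of the examinees who selected it:

$$ w_j = \frac{1}{n_j} \sum_{p \,:\, x_p = j} c_p $$

where $n_j$ is the number of examinees choosing option $j$ and $c_p$ is examinee $p$'s score on the criterion. Such weights are usually standardized across an item's options before scoring, and it is their sampling variance that estimates of this kind address.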
Echternacht, Gary – 1973
This study compares various item option scoring methods with respect to coefficient alpha and a concurrent validity coefficient. The scoring methods under consideration were: (1) formula scoring, (2) a priori scoring, (3) empirical scoring with an internal criterion, and (4) two modifications of formula scoring. The study indicates a clear…
Descriptors: Item Analysis, Measurement Techniques, Multiple Choice Tests, Performance Criteria
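For reference, two of the quantities mentioned above have standard textbook forms (stated here as general definitions, which may differ in detail from the specific variants compared in the study): conventional formula scoring corrects the number-right score for guessing, and coefficient alpha is the usual internal-consistency index.

$$ S = R - \frac{W}{k - 1}, \qquad \alpha = \frac{n}{n - 1}\left(1 - \frac{\sum_{i=1}^{n} \sigma_i^2}{\sigma_X^2}\right) $$

Here $R$ is the number of right answers, $W$ the number of wrong answers, $k$ the number of options per item, $\sigma_i^2$ the variance of item $i$, and $\sigma_X^2$ the variance of the total score on the $n$ items.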
Smith, Richard M.; Mitchell, Virginia P. – 1979
To improve the accuracy of college placement, Rasch scoring and person-fit statistics on the Comparative Guidance and Placement test (CGP) were compared with traditional right-only scoring. Correlations were calculated between English and mathematics course grades and the scores of 1,448 entering freshmen on the reading, writing, and mathematics…
Descriptors: Academic Ability, Computer Programs, Difficulty Level, Goodness of Fit
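For reference, Rasch scoring rests on the dichotomous Rasch model, under which the probability that person $n$ answers item $i$ correctly depends only on the difference between the person's ability $\theta_n$ and the item's difficulty $b_i$:

$$ P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)} $$

Person-fit statistics flag response patterns that are improbable under this model, which is the additional information a comparison with right-only scoring can draw on.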


