Peer reviewed: Harasym, P. H.; And Others – Evaluation and the Health Professions, 1980
Coded items, as opposed to free-response items, in a multiple-choice physiology test had a cueing effect that raised students' scores, especially for lower achievers. Coded items also showed lower reliability. Both item format and scoring method affected test results. (GDC)
Descriptors: Achievement Tests, Comparative Testing, Cues, Higher Education
Peer reviewed: Kent, Thomas H.; Albanese, Mark A. – Evaluation and the Health Professions, 1987
Two types of computer-administered unit quizzes in a systematic pathology course for second-year medical students were compared. Quizzes composed of questions selected on the basis of a student's ability had higher correlations with the final examination than did quizzes composed of questions randomly selected from topic areas. (Author/JAZ)
Descriptors: Adaptive Testing, Comparative Testing, Computer Assisted Testing, Difficulty Level
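The ability-based selection that Kent and Albanese compare against random selection can be sketched with a common adaptive-testing heuristic: choose the items whose difficulty is closest to the student's estimated ability. This is a minimal illustrative sketch with hypothetical difficulty values, not the authors' actual algorithm.

```python
import random

def select_items(item_difficulties, ability, k):
    """Ability-matched selection: pick the k items whose difficulty is
    closest to the student's estimated ability (a standard adaptive-testing
    heuristic; the study's actual selection rule may differ)."""
    ranked = sorted(item_difficulties, key=lambda d: abs(d - ability))
    return ranked[:k]

def select_random(item_difficulties, k, seed=0):
    """Baseline: k items drawn uniformly at random, analogous to the
    randomly selected topic-area quizzes in the comparison."""
    rng = random.Random(seed)
    return rng.sample(item_difficulties, k)

# Hypothetical item pool on a logit-style difficulty scale.
pool = [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]
print(select_items(pool, ability=0.3, k=3))  # -> [0.5, 0.0, 1.0]
print(select_random(pool, k=3))
```

Matching item difficulty to ability concentrates measurement precision near the student's level, which is one plausible reason such quizzes correlated more strongly with the final examination than randomly composed ones.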


