Peer reviewed: Diamond, James J.; Evans, William J. – Journal of Educational Measurement, 1972
Results lend support to the notion that test-wiseness is not a general trait, but rather is clue-specific. (Authors)
Descriptors: Cognitive Ability, Grade 6, Intelligence, Measurement Instruments
Wilbur, Paul H. – Journal of Educational Measurement, 1970
Descriptors: High School Students, Multiple Choice Tests, Patterned Responses, Responses
Peer reviewed: Di Vesta, Francis; Gray, G. Susan – Journal of Educational Psychology, 1972
Subjects listened to a set of three 5-minute passages. A free-recall test and a multiple-choice test were administered at the conclusion of the experiment. It was found that the number of ideas recalled was favorably influenced by note taking, rehearsal and testing. (CK)
Descriptors: Individual Differences, Learning Processes, Listening, Multiple Choice Tests
Peer reviewed: Feldman, David H.; Markwalder, Winston – Educational and Psychological Measurement, 1971
Descriptors: Cognitive Development, Cognitive Measurement, Developmental Psychology, Item Analysis
Peer reviewed: Black, Colin – English Language Teaching, 1971
Descriptors: English, Language Instruction, Language Tests, Listening Comprehension
Williams, Richard P. – Journal of Reading Behavior, 1970
Descriptors: Achievement Gains, College Freshmen, Experience, Multiple Choice Tests
Rippey, Robert M. – Psychological Reports, 1970
Descriptors: Cognitive Processes, College Students, Decision Making Skills, Multiple Choice Tests
Peer reviewed: Wilcox, Rand R. – Educational and Psychological Measurement, 1982
When determining criterion-referenced test length, problems of guessing are shown to be more serious than expected. A new method of scoring is presented that corrects for guessing without assuming that guessing is random. Empirical investigations of the procedure are examined. Test length can be substantially reduced. (Author/CM)
Descriptors: Criterion Referenced Tests, Guessing (Tests), Multiple Choice Tests, Scoring
Peer reviewed: Gross, Leon J. – Evaluation and the Health Professions, 1982
Despite the 50 percent probability of a correctly guessed response, a multiple true-false examination should provide sufficient score variability for adequate discrimination without formula scoring. This scoring system directs examinees to respond to each item, with their scores based simply on the number of correct responses. (Author/CM)
Descriptors: Achievement Tests, Guessing (Tests), Health Education, Higher Education
Peer reviewed: Weiten, Wayne – Journal of Experimental Education, 1982
A comparison of double versus single multiple-choice questions yielded significant differences in item difficulty, item discrimination, and internal reliability, but not in concurrent validity. (Author/PN)
Descriptors: Difficulty Level, Educational Testing, Higher Education, Multiple Choice Tests
Peer reviewed: Kolstad, Rosemarie; And Others – Journal of Dental Education, 1982
Nonrestricted-answer multiple-choice test items are recommended as a way of including more facts and fewer incorrect answers in test items; they do not cue successful guessing as restricted multiple-choice items can. Examination construction, scoring, and reliability are discussed. (MSE)
Descriptors: Guessing (Tests), Higher Education, Item Analysis, Multiple Choice Tests
Peer reviewed: Mentzer, Thomas L. – Educational and Psychological Measurement, 1982
Evidence of bias in the correct answers in multiple-choice test item files was found, including an "all of the above" bias, in which that answer was correct more than 25 percent of the time, and a bias in which the longest answer was correct too frequently. Seven bias types were studied. (Author/CM)
Descriptors: Educational Testing, Higher Education, Multiple Choice Tests, Psychology
Peer reviewed: Sarnacki, Randolph E. – Evaluation and the Health Professions, 1981
Effects of test-wiseness (TW) training and level of TW on multiple-choice item performance were examined for standardized and teacher-made examinations in undergraduate medical education. Conditions inherent in standardized tests must be present before a susceptibility to the extraneous source of variance of TW is evidenced. (Author/GK)
Descriptors: Higher Education, Medical Education, Multiple Choice Tests, Standardized Tests
Peer reviewed: Green, Kathy; And Others – Educational and Psychological Measurement, 1982
Achievement test reliability and validity as a function of ability were determined for multiple sections of a large undergraduate French class. Results did not support previous arguments that decreasing the number of options yields a more efficient test for high-level examinees but a less efficient test for low-level examinees. (Author/GK)
Descriptors: Academic Ability, Comparative Analysis, Higher Education, Multiple Choice Tests
Peer reviewed: Sommerfeld, Jude T. – Chemical Engineering Education, 1981
Discusses the rationale for and use of multiple-choice examinations in material balances, unit operations, reactor design, and process control courses. Describes computer scoring of, student reaction to, and future plans for these examinations. (SK)
Descriptors: Chemistry, College Science, Computer Assisted Testing, Engineering Education