Publication Date
  In 2025: 1
  Since 2024: 1
  Since 2021 (last 5 years): 2
  Since 2016 (last 10 years): 6
  Since 2006 (last 20 years): 8
Descriptor
  Higher Education: 115
  Multiple Choice Tests: 115
  Test Format: 115
  Test Items: 60
  Test Construction: 42
  Difficulty Level: 26
  Test Reliability: 22
  College Students: 20
  Item Analysis: 20
  Test Validity: 20
  Comparative Testing: 19
Education Level
  Higher Education: 8
  Postsecondary Education: 6
  Elementary Secondary Education: 1
Audience
  Practitioners: 8
  Researchers: 8
  Teachers: 5
  Students: 2
Location
  Canada: 3
  Israel: 3
  Australia: 1
  Belgium: 1
  Europe: 1
  Finland: 1
  Turkey: 1
  United Kingdom (Great Britain): 1
  United Kingdom (Wales): 1
Yusuf Oc; Hela Hassen – Marketing Education Review, 2025
Driven by technological innovations, continuous digital expansion has fundamentally transformed the landscape of modern higher education, prompting discussions about evaluation techniques. The emergence of generative artificial intelligence raises questions about reliability and academic honesty regarding multiple-choice assessments in online…
Descriptors: Higher Education, Multiple Choice Tests, Computer Assisted Testing, Electronic Learning
Yilmaz, Erdi Okan; Toker, Türker – International Journal of Psychology and Educational Studies, 2022
This study examines online assessment and evaluation activities in distance education. The effects of different online exam formats on assessment and evaluation, across all programs of a higher education institution, were documented. The population for online…
Descriptors: Foreign Countries, Computer Assisted Testing, Test Format, Distance Education
Miller, Ronald Mellado; Andrade, Maureen Snow – Research & Practice in Assessment, 2020
Technology use is increasing in higher education, particularly for test administration. In this study, Capaldi's (1994) sequential theory, which postulates that the specific order of reinforcements and nonreinforcements influences persistence in the face of difficulty or failure, was applied to online multiple choice testing situations in regard…
Descriptors: Computer Assisted Testing, Higher Education, Multiple Choice Tests, Test Format
Arneson, Amy – ProQuest LLC, 2019
This three-paper dissertation explores item cluster-based assessments, first in general as they relate to modeling, and then with respect to specific issues surrounding a particular item cluster-based assessment design. There should be a reasonable analogy between the structure of a psychometric model and the cognitive theory that the assessment is based upon.…
Descriptors: Item Response Theory, Test Items, Critical Thinking, Cognitive Tests
Haladyna, Thomas M. – IDEA Center, Inc., 2018
Writing multiple-choice test items to measure student learning in higher education is a challenge. Based on extensive scholarly research and experience, the author describes various item formats, offers guidelines for creating these items, and provides many examples of both good and bad test items. He also suggests some shortcuts for developing…
Descriptors: Test Construction, Multiple Choice Tests, Test Items, Higher Education
Jhangiani, Rajiv S. – Teaching of Psychology, 2016
The present study investigates the impact of participation in a peer assessment activity on subsequent academic performance. Students in two sections of an introductory psychology course completed a practice quiz 1 week prior to each of three course exams. Students in the experimental group participated in a five-step double-blind peer assessment…
Descriptors: Peer Evaluation, Academic Achievement, Formative Evaluation, Summative Evaluation
Tauber, Robert T. – 1984
A technique is described for reducing the incidence of cheating on multiple choice exams. One form of the test is used and each item is assigned multiple numbers. Depending upon the instructions given to the class, some students will use the first of each pair of numbers to determine where to place their responses on a separate answer sheet, while…
Descriptors: Answer Sheets, Cheating, Higher Education, Multiple Choice Tests

Dodd, David K.; Leal, Linda – Teaching of Psychology, 1988
Discusses answer justification, a technique that allows students to convert multiple-choice items perceived to be "tricky" into short-answer essay questions. Convincing justifications earn students credit for missed items. The procedure is reported to be easy to administer and very popular among students. (Author/GEA)
Descriptors: Guessing (Tests), Higher Education, Multiple Choice Tests, Psychology

Houston, John P. – Journal of Educational Psychology, 1983
Using an index of answer copying developed by Houston, it was found that rearranged questions alone did not reduce answer copying, whereas rearrangement of both questions and answers effectively eliminated detectable cheating. (Author)
Descriptors: Cheating, Higher Education, Measurement Techniques, Multiple Choice Tests

Gay, Lorraine R. – Journal of Educational Measurement, 1980
The influence of test format on retention of research concepts and procedures on a final examination was investigated. The test formats studied were multiple choice and short answer. (Author/JKS)
Descriptors: Higher Education, Multiple Choice Tests, Retention (Psychology), Student Attitudes
Crehan, Kevin; Haladyna, Thomas M. – 1989
The present study involved the testing of two common multiple-choice item writing rules. A recent review of research revealed that much of the advice given for writing multiple-choice test items is based on experience and wisdom rather than on empirical research. The rules assessed in this study include: (1) the phrasing of the stem in the form of…
Descriptors: College Students, Higher Education, Multiple Choice Tests, Psychology

Kolstad, Rosemarie K.; And Others – Educational Research Quarterly, 1983
Complex multiple choice (CMC) items are frequently used to test knowledge about repetitive information. In two independent comparisons, performance on the CMC items surpassed that of the multiple true-false clusters. Data indicate that performance on CMC items is inflated, and distractors on CMC items fail to prevent guessing. (Author/PN)
Descriptors: Guessing (Tests), Higher Education, Multiple Choice Tests, Objective Tests

Crehan, Kevin; Haladyna, Thomas M. – Journal of Experimental Education, 1991
Two item-writing rules were tested: phrasing stems as questions versus partial sentences; and using the "none-of-the-above" option instead of a specific content option. Results with 228 college students do not support the use of either stem type and provide limited evidence to caution against the "none-of-the-above" option.…
Descriptors: College Students, Higher Education, Multiple Choice Tests, Test Construction
Ferguson, William F. – 1983
College undergraduates (n=38) were administered identical multiple-choice tests with randomly assigned answer sheets numbered either vertically or horizontally. Of the four tests originally scheduled during the semester, tests one and three were retested with entirely different test questions, also multiple choice, resulting in scores from tests,…
Descriptors: Answer Sheets, Cheating, Higher Education, Multiple Choice Tests

Tollefson, Nona – Educational and Psychological Measurement, 1987
This study compared the item difficulty, item discrimination, and test reliability of three forms of multiple-choice items: (1) one correct answer; (2) "none of the above" as a foil; and (3) "none of the above" as the correct answer. Twelve items in the three formats were administered in a college statistics examination. (BS)
Descriptors: Difficulty Level, Higher Education, Item Analysis, Multiple Choice Tests