Matejak Cvenic, Karolina; Planinic, Maja; Susac, Ana; Ivanjek, Lana; Jelicic, Katarina; Hopf, Martin – Physical Review Physics Education Research, 2022
A new diagnostic instrument, the Conceptual Survey on Wave Optics (CSWO), was developed and validated on 224 high school students (aged 18-19 years) in Croatia. The process of test construction, which included testing 61 items on a total of 712 students, is presented. The final version of the test consists of 26 multiple-choice items which…
Descriptors: Scientific Concepts, Concept Formation, Validity, Physics
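Validating a dichotomously scored instrument like the CSWO typically includes an internal-consistency estimate. The abstract does not give the authors' exact procedure, so the following is only a minimal pure-Python sketch of the KR-20 coefficient on invented 0/1 response data:

```python
# KR-20 (Kuder-Richardson 20): internal-consistency reliability for
# dichotomously scored items (1 = correct, 0 = incorrect).
def kr20(responses):
    # responses: one row per student, one 0/1 entry per item (toy data below).
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    mean_t = sum(totals) / len(totals)
    # Population variance of total scores.
    var_t = sum((t - mean_t) ** 2 for t in totals) / len(totals)
    # Sum over items of p * (1 - p), where p = proportion correct on the item.
    pq = 0.0
    for j in range(n_items):
        p = sum(row[j] for row in responses) / len(responses)
        pq += p * (1 - p)
    return (n_items / (n_items - 1)) * (1 - pq / var_t)

data = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(round(kr20(data), 3))  # → 0.8
```

With real data the response matrix would have one column per final test item; values above roughly 0.7-0.8 are conventionally taken as adequate for group-level research use.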
Ihme, Jan Marten; Senkbeil, Martin; Goldhammer, Frank; Gerick, Julia – European Educational Research Journal, 2017
Combinations of different item formats are found quite often in large-scale assessments, and dimensionality analyses often indicate that such tests are multidimensional with respect to task format. In ICILS 2013, three different item types (information-based response tasks, simulation tasks, and authoring tasks) were used to measure computer and…
Descriptors: Foreign Countries, Computer Literacy, Information Literacy, International Assessment
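One rough, illustrative probe of format-related multidimensionality (not the ICILS 2013 analysis itself, which would use formal dimensional models) is to score each item format separately and correlate the subscores; a markedly low correlation hints that the formats tap partly distinct constructs. A sketch with invented subscores:

```python
from math import sqrt

def pearson(x, y):
    # Pearson correlation between two equal-length score lists.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented per-student subscores, one list per item format.
info_tasks = [4, 7, 6, 9, 3, 8]
sim_tasks  = [5, 6, 7, 9, 2, 7]
print(round(pearson(info_tasks, sim_tasks), 2))  # → 0.91
```

A correlation this high (attenuated further by measurement error) would be consistent with an essentially unidimensional test; values well below unity after disattenuation are what motivate multidimensional models.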
Funk, Steven C.; Dickson, K. Laurie – Teaching of Psychology, 2011
The authors experimentally investigated the effects of multiple-choice and short-answer format exam items on exam performance in a college classroom. They randomly assigned 50 students to take a 10-item short-answer pretest or posttest on two 50-item multiple-choice exams in an introduction to personality course. Students performed significantly…
Descriptors: Test Items, Test Format, Multiple Choice Tests, Validity
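The abstract does not report the study's inferential procedure. As a generic illustration of comparing mean exam scores between two conditions, here is a two-sample permutation test in pure Python, run on invented scores rather than the study's data:

```python
import random

def perm_test(a, b, n_iter=10000, seed=0):
    # Two-sided permutation test for a difference in group means:
    # repeatedly re-split the pooled scores at random and count how often
    # the shuffled mean difference is at least as large as the observed one.
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return hits / n_iter

# Invented exam scores (out of 50) for two hypothetical conditions.
short_answer_first = [41, 38, 44, 40, 43, 39, 45, 42]
multiple_choice_only = [36, 39, 35, 40, 37, 34, 38, 36]
print(perm_test(short_answer_first, multiple_choice_only))
```

Permutation tests make no normality assumption, which suits small classroom samples like the 50 students here.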
Haladyna, Thomas M.; Downing, Steven M. – Applied Measurement in Education, 1989 (peer reviewed)
Results of 96 theoretical/empirical studies were reviewed to see if they support a taxonomy of 43 rules for writing multiple-choice test items. The taxonomy is the result of an analysis of 46 textbooks dealing with multiple-choice item writing. For nearly half of the rules, no research was found. (SLD)
Descriptors: Classification, Literature Reviews, Multiple Choice Tests, Test Construction
Haladyna, Thomas M. – 1999
This book explains writing effective multiple-choice test items and studying responses to items to evaluate and improve them, two topics that are very important in the development of many cognitive tests. The chapters are: (1) "Providing a Context for Multiple-Choice Testing"; (2) "Constructed-Response and Multiple-Choice Item Formats"; (3)…
Descriptors: Constructed Response, Multiple Choice Tests, Test Construction, Test Format
Crehan, Kevin; Haladyna, Thomas M. – Journal of Experimental Education, 1991 (peer reviewed)
Two item-writing rules were tested: phrasing stems as questions versus partial sentences; and using the "none-of-the-above" option instead of a specific content option. Results with 228 college students do not support the use of either stem type and provide limited evidence to caution against the "none-of-the-above" option.
Descriptors: College Students, Higher Education, Multiple Choice Tests, Test Construction
Dudley, Albert – Language Testing, 2006
This study examined the multiple true-false (MTF) test format in second language testing by comparing multiple-choice (MCQ) and multiple true-false (MTF) test formats in two language areas of general English: vocabulary and reading. Two counter-balanced experimental designs--one for each language area--were examined in terms of the number of MCQ…
Descriptors: Second Language Learning, Test Format, Validity, Testing
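A counter-balanced design like the two mentioned here crosses form order with participants so that order effects cancel out across the sample. A minimal AB/BA assignment sketch, with hypothetical participant IDs and the two format labels from the abstract:

```python
from itertools import cycle

def counterbalance(participants, orders=(("MCQ", "MTF"), ("MTF", "MCQ"))):
    # Alternate participants across the two form orders so each order
    # is used equally often (a simple AB/BA counterbalance).
    assignment = {}
    order_cycle = cycle(orders)
    for pid in participants:
        assignment[pid] = next(order_cycle)
    return assignment

plan = counterbalance(["s01", "s02", "s03", "s04"])
print(plan)
```

In practice participants would first be shuffled so that assignment order is not confounded with recruitment order; the cycling only guarantees equal cell sizes.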
Maihoff, N. A.; Mehrens, Wm. A. – 1985
A comparison is presented of alternate-choice and true-false item forms used in an undergraduate natural science course. The alternate-choice item is a modified two-choice multiple-choice item in which the two responses are included within the question stem. This study (1) compared the difficulty level, discrimination level, reliability, and…
Descriptors: Classroom Environment, College Freshmen, Comparative Analysis, Comparative Testing
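Two of the statistics compared here have simple classical-test-theory estimates: item difficulty is the proportion of examinees answering the item correctly, and an upper-lower discrimination index contrasts correct-answer rates in the top and bottom halves of total scorers. A sketch on toy 0/1 data (not the study's):

```python
def item_stats(responses, item):
    # responses: rows of 0/1 scores, one row per examinee.
    scores = [(sum(row), row[item]) for row in responses]
    difficulty = sum(r for _, r in scores) / len(scores)
    # Upper-lower discrimination: proportion correct among the top half
    # of total scorers minus proportion correct among the bottom half.
    ranked = sorted(scores, key=lambda s: s[0])
    half = len(ranked) // 2
    lower, upper = ranked[:half], ranked[-half:]
    disc = (sum(r for _, r in upper) - sum(r for _, r in lower)) / half
    return difficulty, disc

resp = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
]
print(item_stats(resp, 2))  # → (0.5, 1.0)
```

Difficulty near 0.5 maximizes score variance, and a discrimination index of 1.0 means the item perfectly separates high from low scorers; operational studies often use the top and bottom 27% rather than halves.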
