Publication Date
In 2025: 1
Since 2024: 1
Since 2021 (last 5 years): 13
Since 2016 (last 10 years): 30
Since 2006 (last 20 years): 58
Descriptor
Comparative Analysis: 93
Multiple Choice Tests: 93
Test Format: 93
Test Items: 41
Foreign Countries: 29
Statistical Analysis: 20
Scores: 18
Difficulty Level: 17
Higher Education: 15
Test Reliability: 15
Computer Assisted Testing: 14
Education Level
Higher Education: 27
Postsecondary Education: 24
Secondary Education: 9
High Schools: 7
Elementary Education: 5
Elementary Secondary Education: 2
Early Childhood Education: 1
Grade 12: 1
Grade 2: 1
Grade 5: 1
Grade 9: 1
Audience
Practitioners: 2
Administrators: 1
Teachers: 1
Location
Sweden: 3
California: 2
Israel: 2
Netherlands: 2
Taiwan: 2
Belgium: 1
Canada: 1
Canada (Ottawa): 1
Chile (Santiago): 1
Czech Republic: 1
Finland: 1
Herrmann-Abell, Cari F.; Hardcastle, Joseph; DeBoer, George E. – Grantee Submission, 2022
As implementation of the "Next Generation Science Standards" moves forward, there is a need for new assessments that can measure students' integrated three-dimensional science learning. The National Research Council has suggested that these assessments be multicomponent tasks that utilize a combination of item formats including…
Descriptors: Multiple Choice Tests, Conditioning, Test Items, Item Response Theory
Olsho, Alexis; Smith, Trevor I.; Eaton, Philip; Zimmerman, Charlotte; Boudreaux, Andrew; White Brahmia, Suzanne – Physical Review Physics Education Research, 2023
We developed the Physics Inventory of Quantitative Literacy (PIQL) to assess students' quantitative reasoning in introductory physics contexts. The PIQL includes several "multiple-choice/multiple-response" (MCMR) items (i.e., multiple-choice questions for which more than one response may be selected) as well as traditional single-response…
Descriptors: Multiple Choice Tests, Science Tests, Physics, Measures (Individuals)
Musa Adekunle Ayanwale – Discover Education, 2023
Examination scores obtained by students from the West African Examinations Council (WAEC), and National Business and Technical Examinations Board (NABTEB) may not be directly comparable due to differences in examination administration, item characteristics of the subject in question, and student abilities. For more accurate comparisons, scores…
Descriptors: Equated Scores, Mathematics Tests, Test Items, Test Format
Gorney, Kylie; Wollack, James A. – Practical Assessment, Research & Evaluation, 2022
Unlike the traditional multiple-choice (MC) format, the discrete-option multiple-choice (DOMC) format does not necessarily reveal all answer options to an examinee. The purpose of this study was to determine whether the reduced exposure of item content affects test security. We conducted an experiment in which participants were allowed to view…
Descriptors: Test Items, Test Format, Multiple Choice Tests, Item Analysis
Yangqiuting Li; Chandralekha Singh – Physical Review Physics Education Research, 2025
Research-based multiple-choice questions implemented in class with peer instruction have been shown to be an effective tool for improving students' engagement and learning outcomes. Moreover, multiple-choice questions that are carefully sequenced to build on each other can be particularly helpful for students to develop a systematic understanding…
Descriptors: Physics, Science Instruction, Science Tests, Multiple Choice Tests
Lee, Won-Chan; Kim, Stella Y.; Choi, Jiwon; Kang, Yujin – Journal of Educational Measurement, 2020
This article considers psychometric properties of composite raw scores and transformed scale scores on mixed-format tests that consist of a mixture of multiple-choice and free-response items. Test scores on several mixed-format tests are evaluated with respect to conditional and overall standard errors of measurement, score reliability, and…
Descriptors: Raw Scores, Item Response Theory, Test Format, Multiple Choice Tests
Shear, Benjamin R. – Journal of Educational Measurement, 2023
Large-scale standardized tests are regularly used to measure student achievement overall and for student subgroups. These uses assume tests provide comparable measures of outcomes across student subgroups, but prior research suggests score comparisons across gender groups may be complicated by the type of test items used. This paper presents…
Descriptors: Gender Bias, Item Analysis, Test Items, Achievement Tests
Viskotová, Lenka; Hampel, David – Mathematics Teaching Research Journal, 2022
Computer-aided assessment is an important tool that reduces the workload of teachers and increases the efficiency of their work. The multiple-choice test is considered one of the most common forms of computer-aided testing, and its application for mid-term testing has indisputable advantages. For the purposes of a high-quality and responsible…
Descriptors: Undergraduate Students, Mathematics Tests, Computer Assisted Testing, Faculty Workload
ALKursheh, Taha Okleh; Al-zboon, Habis Saad; AlNasraween, Mo'en Salman – International Journal of Instruction, 2022
This study aimed at comparing the effect of two test item formats (multiple-choice and complete) on estimating a person's ability, item parameters, and the test information function (TIF). To achieve the aim of the study, two formats of the mathematics (1) test were created, multiple-choice and complete, which in its final form consisted of 31 items. The…
Descriptors: Comparative Analysis, Test Items, Item Response Theory, Test Format
Herrmann-Abell, Cari F.; Hardcastle, Joseph; DeBoer, George E. – Grantee Submission, 2019
The "Next Generation Science Standards" calls for new assessments that measure students' integrated three-dimensional science learning. The National Research Council has suggested that these assessments utilize a combination of item formats including constructed-response and multiple-choice. In this study, students were randomly assigned…
Descriptors: Science Tests, Multiple Choice Tests, Test Format, Test Items
Wang, Shichao; Li, Dongmei; Steedle, Jeffrey – ACT, Inc., 2021
Speeded tests set time limits so that few examinees can reach all items, and power tests allow most test-takers sufficient time to attempt all items. Educational achievement tests are sometimes described as "timed power tests" because the amount of time provided is intended to allow nearly all students to complete the test, yet this…
Descriptors: Timed Tests, Test Items, Achievement Tests, Testing
Wang, Zuowei; O'Reilly, Tenaha; Sabatini, John; McCarthy, Kathryn S.; McNamara, Danielle S. – Grantee Submission, 2021
We compared high school students' performance in a traditional comprehension assessment requiring them to identify key information and draw inferences from single texts, and a scenario-based assessment (SBA) requiring them to integrate, evaluate and apply information across multiple sources. Both assessments focused on a non-academic topic.…
Descriptors: Comparative Analysis, High School Students, Inferences, Reading Tests
Liao, Ray J. T. – Reading in a Foreign Language, 2021
This study compared the processes that second language learners used when completing reading tasks in a multiple-choice question format (MCQ) with those in a short-answer question format (SAQ). Sixteen nonnative English-speaking students from a large Midwestern college in the United States were invited to complete MCQ and SAQ English…
Descriptors: Multiple Choice Tests, Test Format, Comparative Analysis, Reading Tests
Akhavan Masoumi, Ghazal; Sadeghi, Karim – Language Testing in Asia, 2020
This study aimed to examine the effect of test format on test performance by comparing Multiple Choice (MC) and Constructed Response (CR) vocabulary tests in an EFL setting. This paper also investigated the role of gender in MC and CR vocabulary measures. To this end, five 20-item stem-equivalent vocabulary tests (CR, and 3-, 4-, 5-, and…
Descriptors: Language Tests, Test Items, English (Second Language), Second Language Learning
Yeager, Rebecca; Meyer, Zachary – International Journal of Listening, 2022
This study investigates the effects of adding stem preview to an English for Academic Purposes (EAP) multiple-choice listening assessment. In stem preview, listeners may view the item stems, but not response options, before listening. Previous research indicates that adding preview to an exam typically decreases difficulty, but raises concerns…
Descriptors: English for Academic Purposes, Second Language Learning, Second Language Instruction, Teaching Methods