Publication Date
In 2025: 1
Since 2024: 4
Since 2021 (last 5 years): 14
Since 2016 (last 10 years): 30
Since 2006 (last 20 years): 45
Descriptor
Difficulty Level: 95
Multiple Choice Tests: 95
Test Format: 95
Test Items: 79
Test Construction: 31
Higher Education: 26
Foreign Countries: 24
Item Analysis: 20
Comparative Analysis: 17
Scores: 17
Test Reliability: 17
Audience
Researchers: 3
Location
Netherlands: 3
Turkey: 3
Australia: 2
Canada: 2
Germany: 2
United Kingdom (England): 2
United Kingdom (Wales): 2
California: 1
Croatia: 1
Hungary: 1
India: 1
Berenbon, Rebecca F.; McHugh, Bridget C. – Educational Measurement: Issues and Practice, 2023
To assemble a high-quality test, psychometricians rely on subject matter experts (SMEs) to write high-quality items. However, SMEs are not typically given the opportunity to provide input on which content standards are most suitable for multiple-choice questions (MCQs). In the present study, we explored the relationship between perceived MCQ…
Descriptors: Test Items, Multiple Choice Tests, Standards, Difficulty Level
van den Broek, Gesa S. E.; Gerritsen, Suzanne L.; Oomen, Iris T. J.; Velthoven, Eva; van Boxtel, Femke H. J.; Kester, Liesbeth; van Gog, Tamara – Journal of Educational Psychology, 2023
Multiple-choice questions (MCQs) are popular in vocabulary software because they can be scored automatically and are compatible with many input devices (e.g., touchscreens). Answering MCQs is beneficial for learning, especially when learners retrieve knowledge from memory to evaluate plausible answer alternatives. However, such retrieval may not…
Descriptors: Multiple Choice Tests, Vocabulary Development, Test Format, Cues
Herrmann-Abell, Cari F.; Hardcastle, Joseph; DeBoer, George E. – Grantee Submission, 2022
As implementation of the "Next Generation Science Standards" moves forward, there is a need for new assessments that can measure students' integrated three-dimensional science learning. The National Research Council has suggested that these assessments be multicomponent tasks that utilize a combination of item formats including…
Descriptors: Multiple Choice Tests, Conditioning, Test Items, Item Response Theory
Ashish Gurung; Kirk Vanacore; Andrew A. McReynolds; Korinn S. Ostrow; Eamon S. Worden; Adam C. Sales; Neil T. Heffernan – Grantee Submission, 2024
Learning experience designers consistently balance the trade-off between open-ended and closed-ended activities. The growth and scalability of Computer-Based Learning Platforms (CBLPs) have only magnified the importance of these design trade-offs. CBLPs often utilize closed-ended activities (i.e., multiple-choice questions [MCQs]) due to feasibility…
Descriptors: Multiple Choice Tests, Testing, Test Format, Computer Assisted Testing
Qian Liu; Navé Wald; Chandima Daskon; Tony Harland – Innovations in Education and Teaching International, 2024
This qualitative study looks at multiple-choice questions (MCQs) in examinations and their effectiveness in testing higher-order cognition. While there are claims that MCQs can do this, we consider many assertions problematic because of the difficulty in interpreting what higher-order cognition consists of and whether or not assessment tasks…
Descriptors: Multiple Choice Tests, Critical Thinking, College Faculty, Student Evaluation
Musa Adekunle Ayanwale – Discover Education, 2023
Examination scores obtained by students from the West African Examinations Council (WAEC), and National Business and Technical Examinations Board (NABTEB) may not be directly comparable due to differences in examination administration, item characteristics of the subject in question, and student abilities. For more accurate comparisons, scores…
Descriptors: Equated Scores, Mathematics Tests, Test Items, Test Format
Matejak Cvenic, Karolina; Planinic, Maja; Susac, Ana; Ivanjek, Lana; Jelicic, Katarina; Hopf, Martin – Physical Review Physics Education Research, 2022
A new diagnostic instrument, the Conceptual Survey on Wave Optics (CSWO), was developed and validated on 224 high school students (aged 18-19 years) in Croatia. The process of test construction, which included the testing of 61 items on a total of 712 students, is presented. The final version of the test consists of 26 multiple-choice items which…
Descriptors: Scientific Concepts, Concept Formation, Validity, Physics
Merzougui, Wassim H.; Myers, Matthew A.; Hall, Samuel; Elmansouri, Ahmad; Parker, Rob; Robson, Alistair D.; Kurn, Octavia; Parrott, Rachel; Geoghegan, Kate; Harrison, Charlotte H.; Anbu, Deepika; Dean, Oliver; Border, Scott – Anatomical Sciences Education, 2021
Methods of assessment in anatomy vary across medical schools in the United Kingdom (UK) and beyond; common methods include written, spotter, and oral assessment. However, there is limited research evaluating these methods with regard to student performance and perception. The National Undergraduate Neuroanatomy Competition (NUNC) is held annually…
Descriptors: Multiple Choice Tests, Test Format, Medical Students, Foreign Countries
Fadillah, Sarah Meilani; Ha, Minsu; Nuraeni, Eni; Indriyanti, Nurma Yunita – Malaysian Journal of Learning and Instruction, 2023
Purpose: Researchers discovered that when students were given the opportunity to change their answers, a majority changed their responses from incorrect to correct, and this change often improved their overall test scores. What prompts students to modify their answers? This study aims to examine answer modification on a scientific reasoning test, with…
Descriptors: Science Tests, Multiple Choice Tests, Test Items, Decision Making
Yangqiuting Li; Chandralekha Singh – Physical Review Physics Education Research, 2025
Research-based multiple-choice questions implemented in class with peer instruction have been shown to be an effective tool for improving students' engagement and learning outcomes. Moreover, multiple-choice questions that are carefully sequenced to build on each other can be particularly helpful for students to develop a systematic understanding…
Descriptors: Physics, Science Instruction, Science Tests, Multiple Choice Tests
Roman O. Lesnov – International Journal of Listening, 2024
Whether visual information belongs in second language (L2) listening tests has long been a subject for scholarly debate, with L2 learners' performance on and perceptions of video-based tests being the primary sources of evidence. The research into L2 teachers' perceptions, however, is scarce, as is the research into stakeholders' views of content…
Descriptors: Listening Comprehension Tests, Language Tests, Second Language Learning, Second Language Instruction
Herrmann-Abell, Cari F.; Hardcastle, Joseph; DeBoer, George E. – Grantee Submission, 2019
The "Next Generation Science Standards" calls for new assessments that measure students' integrated three-dimensional science learning. The National Research Council has suggested that these assessments utilize a combination of item formats including constructed-response and multiple-choice. In this study, students were randomly assigned…
Descriptors: Science Tests, Multiple Choice Tests, Test Format, Test Items
Klender, Sara; Ferriby, Andrew; Notebaert, Andrew – HAPS Educator, 2019
Multiple-choice questions (MCQs) are commonly used on histology examinations. There are many guidelines for how to properly write MCQs, and many of them recommend avoiding negatively worded stems. The current study aims to investigate differences between positively and negatively worded stems in a medical histology course by comparing the item…
Descriptors: Multiple Choice Tests, Science Tests, Biology, Test Construction
Becker, Anthony; Nekrasova-Beker, Tatiana – Educational Assessment, 2018
While previous research has identified numerous factors that contribute to item difficulty, studies involving large-scale reading tests have provided mixed results. This study examined five selected-response item types used to measure reading comprehension in the Pearson Test of English Academic: a) multiple-choice (choose one answer), b)…
Descriptors: Reading Comprehension, Test Items, Reading Tests, Test Format
Loudon, Catherine; Macias-Muñoz, Aide – Advances in Physiology Education, 2018
Different versions of multiple-choice exams were administered to an undergraduate class in human physiology as part of normal testing in the classroom. The goal was to evaluate whether the number of options (possible answers) per question influenced the effectiveness of this assessment. Three exams (each with three versions) were given to each of…
Descriptors: Multiple Choice Tests, Test Construction, Test Items, Science Tests