Showing 1 to 15 of 260 results
Peer reviewed
Janet Mee; Ravi Pandian; Justin Wolczynski; Amy Morales; Miguel Paniagua; Polina Harik; Peter Baldwin; Brian E. Clauser – Advances in Health Sciences Education, 2024
Recent advances in automated scoring technology have made it practical to replace multiple-choice questions (MCQs) with short-answer questions (SAQs) in large-scale, high-stakes assessments. However, most previous research comparing these formats has used small examinee samples testing under low-stakes conditions. Additionally, previous studies…
Descriptors: Multiple Choice Tests, High Stakes Tests, Test Format, Test Items
Peer reviewed
Berenbon, Rebecca F.; McHugh, Bridget C. – Educational Measurement: Issues and Practice, 2023
To assemble a high-quality test, psychometricians rely on subject matter experts (SMEs) to write high-quality items. However, SMEs are not typically given the opportunity to provide input on which content standards are most suitable for multiple-choice questions (MCQs). In the present study, we explored the relationship between perceived MCQ…
Descriptors: Test Items, Multiple Choice Tests, Standards, Difficulty Level
Sebastian Moncaleano – ProQuest LLC, 2021
The growth of computer-based testing over the last two decades has motivated the creation of innovative item formats. It is often argued that technology-enhanced items (TEIs) provide better measurement of test-takers' knowledge, skills, and abilities by increasing the authenticity of tasks presented to test-takers (Sireci & Zenisky, 2006).…
Descriptors: Computer Assisted Testing, Test Format, Test Items, Classification
Peer reviewed
McGuire, Michael J. – International Journal for the Scholarship of Teaching and Learning, 2023
College students in a lower-division psychology course made metacognitive judgments by predicting and postdicting performance for true-false, multiple-choice, and fill-in-the-blank question sets on each of three exams. This study investigated which question format would result in the most accurate metacognitive judgments. Extending Koriat's (1997)…
Descriptors: Metacognition, Multiple Choice Tests, Accuracy, Test Format
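A minimal sketch of how metacognitive judgment accuracy is often quantified in studies like the one above: calibration bias is the signed gap between judged and obtained performance. All numbers and names below are invented for illustration; this is not the study's analysis code.

```python
# Calibration bias per question set: positive = overconfidence,
# negative = underconfidence. Values are hypothetical.
predicted = [80, 70, 90]  # examinee's predicted % correct per question set
actual    = [72, 68, 95]  # obtained % correct per question set

bias = [p - a for p, a in zip(predicted, actual)]
print(bias)  # [8, 2, -5]
```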
Peer reviewed
van den Broek, Gesa S. E.; Gerritsen, Suzanne L.; Oomen, Iris T. J.; Velthoven, Eva; van Boxtel, Femke H. J.; Kester, Liesbeth; van Gog, Tamara – Journal of Educational Psychology, 2023
Multiple-choice questions (MCQs) are popular in vocabulary software because they can be scored automatically and are compatible with many input devices (e.g., touchscreens). Answering MCQs is beneficial for learning, especially when learners retrieve knowledge from memory to evaluate plausible answer alternatives. However, such retrieval may not…
Descriptors: Multiple Choice Tests, Vocabulary Development, Test Format, Cues
Peer reviewed
Guo, Wenjing; Wind, Stefanie A. – Journal of Educational Measurement, 2021
The use of mixed-format tests made up of multiple-choice (MC) items and constructed response (CR) items is popular in large-scale testing programs, including the National Assessment of Educational Progress (NAEP) and many district- and state-level assessments in the United States. Rater effects, or raters' scoring tendencies that result in…
Descriptors: Test Format, Multiple Choice Tests, Scoring, Test Items
Peer reviewed
Lawrence T. DeCarlo – Educational and Psychological Measurement, 2024
A psychological framework for different types of items commonly used with mixed-format exams is proposed. A choice model based on signal detection theory (SDT) is used for multiple-choice (MC) items, whereas an item response theory (IRT) model is used for open-ended (OE) items. The SDT and IRT models are shown to share a common conceptualization…
Descriptors: Test Format, Multiple Choice Tests, Item Response Theory, Models
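As a rough illustration of the two model families named in the abstract above, the sketch below computes a 2PL IRT probability for an open-ended item and a signal-detection "max" choice probability for an m-option MC item. These are generic textbook formulations with assumed parameter values, not DeCarlo's model or code.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def irt_2pl(theta, a, b):
    """2PL IRT probability of answering an open-ended item correctly."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def sdt_mc_correct(d, m=4):
    """SDT choice model: the examinee picks the option whose perceived
    strength is largest. Correct option ~ N(d, 1); the m-1 distractors
    ~ N(0, 1). P(correct) = integral of phi(x - d) * Phi(x)^(m-1) dx."""
    p, _ = quad(lambda x: norm.pdf(x - d) * norm.cdf(x) ** (m - 1),
                -np.inf, np.inf)
    return p

print(irt_2pl(theta=0.5, a=1.2, b=0.0))  # ~0.65 with these made-up values
print(sdt_mc_correct(d=0.0, m=4))        # d = 0 recovers chance level, 0.25
```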
Peer reviewed
Herrmann-Abell, Cari F.; Hardcastle, Joseph; DeBoer, George E. – Grantee Submission, 2022
As implementation of the "Next Generation Science Standards" moves forward, there is a need for new assessments that can measure students' integrated three-dimensional science learning. The National Research Council has suggested that these assessments be multicomponent tasks that utilize a combination of item formats including…
Descriptors: Multiple Choice Tests, Conditioning, Test Items, Item Response Theory
Peer reviewed
Qian Liu; Navé Wald; Chandima Daskon; Tony Harland – Innovations in Education and Teaching International, 2024
This qualitative study looks at multiple-choice questions (MCQs) in examinations and their effectiveness in testing higher-order cognition. While there are claims that MCQs can do this, we consider many assertions problematic because of the difficulty in interpreting what higher-order cognition consists of and whether or not assessment tasks…
Descriptors: Multiple Choice Tests, Critical Thinking, College Faculty, Student Evaluation
Peer reviewed
Wolkowitz, Amanda A.; Foley, Brett; Zurn, Jared – Practical Assessment, Research & Evaluation, 2023
The purpose of this study is to introduce a method for converting scored 4-option multiple-choice (MC) items into scored 3-option MC items without re-pretesting the 3-option items. This study describes a six-step process for achieving this goal. Data from a professional credentialing exam were used, and the method was applied to 24…
Descriptors: Multiple Choice Tests, Test Items, Accuracy, Test Format
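The entry above describes a six-step conversion procedure; the toy sketch below shows only the simplest piece of the underlying intuition, namely that under a 3PL model the pseudo-guessing floor rises from 1/4 to 1/3 when a distractor is dropped. Parameter values are hypothetical, and this is not the authors' method.

```python
import math

def p_3pl(theta, a, b, c):
    """3PL probability: pseudo-guessing floor c plus 2PL above the floor."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

theta, a, b = -1.0, 1.0, 0.0  # made-up examinee ability and item parameters
print(p_3pl(theta, a, b, c=1/4))  # item scored with 4 options
print(p_3pl(theta, a, b, c=1/3))  # same item rescored with 3 options
```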
Peer reviewed
Wang, Yu; Chiu, Chia-Yi; Köhn, Hans Friedrich – Journal of Educational and Behavioral Statistics, 2023
The multiple-choice (MC) item format has been widely used in educational assessments across diverse content domains. MC items purportedly allow for collecting richer diagnostic information. The effectiveness and economy of administering MC items may have further contributed to their popularity beyond educational assessment. The MC item format…
Descriptors: Multiple Choice Tests, Nonparametric Statistics, Test Format, Educational Assessment
Peer reviewed
Uysal, Ibrahim; Sahin-Kürsad, Merve; Kiliç, Abdullah Faruk – Participatory Educational Research, 2022
The aim of the study was to examine whether the common items in mixed-format tests (e.g., multiple-choice and essay items) exhibit parameter drift in test equating performed with the common-item nonequivalent groups design. In this study, which was carried out using Monte Carlo simulation with a fully crossed design, the factors of test…
Descriptors: Test Items, Test Format, Item Response Theory, Equated Scores
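For context on the equating step the entry above examines, the sketch below shows mean-sigma linking, one standard way to place common-item difficulty estimates from two administrations on a shared IRT scale before screening items for drift. It is a generic illustration with invented values, not the study's simulation code.

```python
import numpy as np

def mean_sigma_link(b_ref, b_new):
    """Mean-sigma linking: slope A and intercept B mapping the new form's
    IRT difficulty estimates onto the reference scale, b* = A*b_new + B."""
    A = np.std(b_ref, ddof=1) / np.std(b_new, ddof=1)
    B = b_ref.mean() - A * b_new.mean()
    return A, B

# Hypothetical common-item difficulties from two administrations.
b_ref = np.array([-1.2, -0.4, 0.1, 0.8, 1.5])
b_new = np.array([-1.0, -0.3, 0.3, 0.9, 1.8])
A, B = mean_sigma_link(b_ref, b_new)
# Common items whose rescaled estimates still sit far from the reference
# values are candidates for parameter drift and removal from the anchor set.
print(np.round(A * b_new + B - b_ref, 2))
```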
Peer reviewed
Olsho, Alexis; Smith, Trevor I.; Eaton, Philip; Zimmerman, Charlotte; Boudreaux, Andrew; White Brahmia, Suzanne – Physical Review Physics Education Research, 2023
We developed the Physics Inventory of Quantitative Literacy (PIQL) to assess students' quantitative reasoning in introductory physics contexts. The PIQL includes several "multiple-choice-multiple-response" (MCMR) items (i.e., multiple-choice questions for which more than one response may be selected) as well as traditional single-response…
Descriptors: Multiple Choice Tests, Science Tests, Physics, Measures (Individuals)
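One common way to give partial credit on MCMR items like those the entry above describes is to score each option as correctly classified or not. The sketch below is a generic illustration under that assumption and may not match the PIQL's actual scoring rule.

```python
def mcmr_partial_credit(selected, keyed, n_options):
    """Fraction of options classified correctly: an option counts if it was
    selected when keyed, or left unselected when not keyed."""
    hits = sum((opt in selected) == (opt in keyed) for opt in range(n_options))
    return hits / n_options

# Hypothetical 5-option item keyed {0, 2, 3}; examinee selects {0, 2}.
print(mcmr_partial_credit({0, 2}, {0, 2, 3}, 5))  # 0.8
```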
Peer reviewed
Musa Adekunle Ayanwale – Discover Education, 2023
Examination scores obtained by students from the West African Examinations Council (WAEC) and the National Business and Technical Examinations Board (NABTEB) may not be directly comparable due to differences in examination administration, item characteristics of the subject in question, and student abilities. For more accurate comparisons, scores…
Descriptors: Equated Scores, Mathematics Tests, Test Items, Test Format
Peer reviewed
Stefan O'Grady – International Journal of Listening, 2025
Language assessment is increasingly computer-mediated. This development presents opportunities in the form of new task formats, and equally a need for renewed scrutiny of established conventions. Recent recommendations to increase integrated skills assessment in lecture comprehension tests are premised on empirical research that demonstrates enhanced construct…
Descriptors: Language Tests, Lecture Method, Listening Comprehension Tests, Multiple Choice Tests