Publication Date
| Period | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 1 |
| Since 2022 (last 5 years) | 5 |
| Since 2017 (last 10 years) | 12 |
| Since 2007 (last 20 years) | 16 |
Descriptor
| Descriptor | Count |
| --- | --- |
| Multiple Choice Tests | 37 |
| Problem Solving | 37 |
| Test Construction | 37 |
| Test Items | 16 |
| Mathematics Tests | 11 |
| Elementary Secondary Education | 8 |
| Science Tests | 8 |
| Thinking Skills | 8 |
| Educational Assessment | 7 |
| Student Evaluation | 7 |
| Foreign Countries | 6 |
Author
| Author | Count |
| --- | --- |
| Masters, James R. | 2 |
| Triska, Olive H. | 2 |
| Andrew Gardiner | 1 |
| Arican, Muhammet | 1 |
| Ato Kwamina Arhin | 1 |
| Attali, Yigal | 1 |
| Bao, Lei | 1 |
| Bennett, Randy Elliot | 1 |
| Braswell, James S. | 1 |
| Brown, Rachael Eriksen | 1 |
| Chen, Qingwei | 1 |
Education Level
| Education Level | Count |
| --- | --- |
| Elementary Education | 5 |
| Higher Education | 4 |
| Middle Schools | 4 |
| Secondary Education | 4 |
| Junior High Schools | 3 |
| Postsecondary Education | 3 |
| Grade 5 | 2 |
| Grade 3 | 1 |
| Grade 4 | 1 |
| Grade 6 | 1 |
| Grade 7 | 1 |
Audience
| Audience | Count |
| --- | --- |
| Teachers | 2 |
| Practitioners | 1 |
Testing Anatomy: Dissecting Spatial and Non-Spatial Knowledge in Multiple-Choice Question Assessment
Julie Dickson; Darren J. Shaw; Andrew Gardiner; Susan Rhind – Anatomical Sciences Education, 2024
Limited research has been conducted on the spatial ability of veterinary students and how this is evaluated within anatomy assessments. This study describes the creation and evaluation of a split-design multiple-choice question (MCQ) assessment (totaling 30 questions divided into 15 non-spatial MCQs and 15 spatial MCQs). Two cohorts were tested,…
Descriptors: Anatomy, Spatial Ability, Multiple Choice Tests, Factor Analysis

Hai Li; Wanli Xing; Chenglu Li; Wangda Zhu; Simon Woodhead – Journal of Learning Analytics, 2025
Knowledge tracing (KT) is a method to evaluate a student's knowledge state (KS) based on their historical problem-solving records by predicting the next answer's binary correctness. Although widely applied to closed-ended questions, it lacks a detailed option tracing (OT) method for assessing multiple-choice questions (MCQs). This paper introduces…
Descriptors: Mathematics Tests, Multiple Choice Tests, Computer Assisted Testing, Problem Solving

Stevens, Scott P.; Palocsay, Susan W.; Novoa, Luis J. – INFORMS Transactions on Education, 2023
Test writing is a fundamental component of teaching. With increasing pressure to teach larger groups of students, conduct formal assessment of learning outcomes, and offer online and hybrid classes, there is a need for alternatives to constructed-response problem-solving test questions. We believe that appropriate use of multiple-choice (MC)…
Descriptors: Multiple Choice Tests, Introductory Courses, Test Construction, Content Validity

Ato Kwamina Arhin – Acta Educationis Generalis, 2024
Introduction: This article aimed to dig deep into the distractors used in mathematics multiple-choice items. The quality of distractors may be more important than their number or the stem in a multiple-choice question. Little attention is given to this aspect of item writing, especially for mathematics multiple-choice questions. This article…
Descriptors: Testing, Multiple Choice Tests, Test Items, Mathematics Tests

Chen, Qingwei; Zhu, Guangtian; Liu, Qiaoyi; Han, Jing; Fu, Zhao; Bao, Lei – Physical Review Physics Education Research, 2020
Problem-solving categorization tasks have been well studied and used as an effective tool for assessment of student knowledge structure. In this study, a traditional free-response categorization test has been modified into a multiple-choice format, and the effectiveness of this new assessment is evaluated. Through randomized testing with Chinese…
Descriptors: Foreign Countries, Test Construction, Multiple Choice Tests, Problem Solving

Rudibyani, Ratu Betta; Perdana, Ryzal; Elisanti, Evi – International Journal of Instruction, 2020
This study developed a knowledge assessment instrument based on problem solving in electrochemistry. The research aimed to determine the characteristics of, and teacher and student responses to, a problem-based knowledge assessment instrument on electrochemistry material. The research method used was research and development, which consists…
Descriptors: Science Tests, Student Evaluation, Test Construction, Problem Solving

Haider, Muhammad Qadeer – ProQuest LLC, 2019
Inquiry-oriented teaching is a specific form of active learning gaining popularity in teaching communities. The goal of inquiry-oriented classes is to help students in gaining a conceptual understanding of the material. My research focus is to gauge students' performance and conceptual understanding in inquiry-oriented linear algebra classes. This…
Descriptors: Mathematics Tests, Test Construction, Test Validity, Test Reliability

Arican, Muhammet – Turkish Journal of Education, 2019
This study investigated Turkish middle school students' proportional reasoning and provided a diagnostic assessment of their strengths and weaknesses on the ratio and proportion concepts. A proportional reasoning test with 22 multiple-choice items was developed within the context of the log-linear cognitive diagnosis model. The test was developed…
Descriptors: Diagnostic Tests, Multiple Choice Tests, Test Construction, Middle School Students

Moon, Jung Aa; Keehner, Madeleine; Katz, Irvin R. – Educational Measurement: Issues and Practice, 2019
The current study investigated how item formats and their inherent affordances influence test-takers' cognition under uncertainty. Adult participants solved content-equivalent math items in multiple-selection multiple-choice and four alternative grid formats. The results indicated that participants' affirmative response tendency (i.e., judge the…
Descriptors: Affordances, Test Items, Test Format, Test Wiseness

Lindner, Marlit A.; Schult, Johannes; Mayer, Richard E. – Journal of Educational Psychology, 2022
This classroom experiment investigates the effects of adding representational pictures to multiple-choice and constructed-response test items to understand the role of the response format for the multimedia effect in testing. Participants were 575 fifth- and sixth-graders who answered 28 science test items--seven items in each of four experimental…
Descriptors: Elementary School Students, Grade 5, Grade 6, Multimedia Materials

Matsuda, Noriyuki; Ogawa, Hisashi; Hirashima, Tsukasa; Taki, Hirokazu – Research and Practice in Technology Enhanced Learning, 2015
Background: Erroneous answers in multiple-answer problems not only make the correct answer harder to determine but also indicate, when compared to the correct answer, why the correct choice is suitable and the erroneous one is a mistake. However, it is insufficient to simply create erroneous answers for this purpose: explanations of these answers are…
Descriptors: Multiple Choice Tests, Problem Solving, Error Patterns, Student Evaluation

Haladyna, Thomas M. – IDEA Center, Inc., 2018
Writing multiple-choice test items to measure student learning in higher education is a challenge. Based on extensive scholarly research and experience, the author describes various item formats, offers guidelines for creating these items, and provides many examples of both good and bad test items. He also suggests some shortcuts for developing…
Descriptors: Test Construction, Multiple Choice Tests, Test Items, Higher Education

Van den Eynde, Sofie; van Kampen, Paul; Van Dooren, Wim; De Cock, Mieke – Physical Review Physics Education Research, 2019
We report on a study investigating the influence of context, direction of translation, and function type on undergraduate students' ability to translate between graphical and symbolic representations of mathematical relations. Students from an algebra-based and a calculus-based physics course were asked to solve multiple-choice items in which they…
Descriptors: Graphs, Equations (Mathematics), Mathematics Instruction, Physics

Attali, Yigal; Laitusis, Cara; Stone, Elizabeth – Educational and Psychological Measurement, 2016
There are many reasons to believe that open-ended (OE) and multiple-choice (MC) items elicit different cognitive demands of students. However, empirical evidence that supports this view is lacking. In this study, we investigated the reactions of test takers to an interactive assessment with immediate feedback and answer-revision opportunities for…
Descriptors: Test Items, Questioning Techniques, Differences, Student Reaction

Wickett, Maryann; Hendrix-Martin, Eunice – Stenhouse Publishers, 2011
Multiple-choice testing is an educational reality. Rather than complain about the negative impact these tests may have on teaching and learning, why not use them to better understand your students' true mathematical knowledge and comprehension? Maryann Wickett and Eunice Hendrix-Martin show teachers how to move beyond the student's answer--right…
Descriptors: Educational Strategies, Student Evaluation, Standardized Tests, Multiple Choice Tests
