Publication Date
| Date range | Records |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 1 |
| Since 2022 (last 5 years) | 6 |
| Since 2017 (last 10 years) | 18 |
| Since 2007 (last 20 years) | 29 |
Descriptor
| Descriptor | Records |
| --- | --- |
| Difficulty Level | 35 |
| Multiple Choice Tests | 35 |
| Psychometrics | 35 |
| Test Items | 27 |
| Foreign Countries | 14 |
| Item Response Theory | 14 |
| Test Reliability | 13 |
| Test Construction | 12 |
| Test Validity | 12 |
| Item Analysis | 9 |
| Science Tests | 7 |
Author
| Author | Records |
| --- | --- |
| Bauer, Daniel | 2 |
| Fischer, Martin R. | 2 |
| Katz, Irvin R. | 2 |
| Martinez, Michael E. | 2 |
| Adeleke, A. A. | 1 |
| Aiman Mohammad Freihat | 1 |
| Alonzo, Julie | 1 |
| Apino, Ezi | 1 |
| Atalmis, Erkan Hasan | 1 |
| Barniol, Pablo | 1 |
| Berenbon, Rebecca F. | 1 |
Publication Type
| Publication Type | Records |
| --- | --- |
| Journal Articles | 29 |
| Reports - Research | 29 |
| Reports - Descriptive | 4 |
| Speeches/Meeting Papers | 3 |
| Tests/Questionnaires | 2 |
| Information Analyses | 1 |
| Numerical/Quantitative Data | 1 |
| Reports - Evaluative | 1 |
Education Level
| Education Level | Records |
| --- | --- |
| Secondary Education | 9 |
| Higher Education | 8 |
| Elementary Education | 7 |
| Postsecondary Education | 7 |
| High Schools | 4 |
| Middle Schools | 4 |
| Grade 5 | 3 |
| Grade 1 | 2 |
| Grade 12 | 2 |
| Grade 2 | 2 |
| Intermediate Grades | 2 |
Audience
| Audience | Records |
| --- | --- |
| Teachers | 1 |
Aiman Mohammad Freihat; Omar Saleh Bani Yassin – Educational Process: International Journal, 2025
Background/purpose: This study aimed to assess how accurately multiple-choice test item parameters are estimated under item response theory models. Materials/methods: The researchers relied on measurement accuracy indicators, which express the absolute difference between the estimated and actual values of the…
Descriptors: Accuracy, Computation, Multiple Choice Tests, Test Items
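The accuracy indicator described in this abstract is, in essence, the absolute difference between estimated and true item parameter values. Below is a minimal sketch of that computation with invented difficulty values; it is illustrative only and not the authors' code.

```python
# Hedged sketch: per-item and mean absolute difference between estimated and
# true item parameters, the kind of accuracy indicator the abstract describes.
# All parameter values are invented for illustration.
import numpy as np

true_b = np.array([-1.2, -0.4, 0.0, 0.7, 1.5])   # hypothetical true difficulty parameters
est_b  = np.array([-1.1, -0.5, 0.2, 0.6, 1.7])   # hypothetical estimates from an IRT calibration

abs_error = np.abs(est_b - true_b)               # per-item absolute estimation error
print("per-item |error|:", abs_error)
print("mean absolute error:", round(abs_error.mean(), 3))
```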
Berenbon, Rebecca F.; McHugh, Bridget C. – Educational Measurement: Issues and Practice, 2023
To assemble a high-quality test, psychometricians rely on subject matter experts (SMEs) to write high-quality items. However, SMEs are not typically given the opportunity to provide input on which content standards are most suitable for multiple-choice questions (MCQs). In the present study, we explored the relationship between perceived MCQ…
Descriptors: Test Items, Multiple Choice Tests, Standards, Difficulty Level
Musa Adekunle Ayanwale – Discover Education, 2023
Examination scores obtained by students from the West African Examinations Council (WAEC) and the National Business and Technical Examinations Board (NABTEB) may not be directly comparable due to differences in examination administration, item characteristics of the subject in question, and student abilities. For more accurate comparisons, scores…
Descriptors: Equated Scores, Mathematics Tests, Test Items, Test Format
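One standard way to make scores from two different examinations comparable is linear equating, which maps a score from one form onto the scale of another using the means and standard deviations of the two score distributions. The sketch below illustrates the idea with invented score vectors; it is not the WAEC/NABTEB procedure used in the study.

```python
# Hedged sketch of linear equating: place a score from exam X onto the scale of
# exam Y by matching z-scores. The score vectors are hypothetical.
import numpy as np

scores_x = np.array([45, 52, 58, 61, 70, 74, 80], dtype=float)   # hypothetical exam X scores
scores_y = np.array([40, 48, 55, 59, 66, 72, 77], dtype=float)   # hypothetical exam Y scores

def linear_equate(x, from_scores, to_scores):
    """Map a raw score x from the 'from' scale onto the 'to' scale."""
    z = (x - from_scores.mean()) / from_scores.std(ddof=1)
    return to_scores.mean() + z * to_scores.std(ddof=1)

print(round(linear_equate(65, scores_x, scores_y), 1))
```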
Fadillah, Sarah Meilani; Ha, Minsu; Nuraeni, Eni; Indriyanti, Nurma Yunita – Malaysian Journal of Learning and Instruction, 2023
Purpose: Researchers have found that when students are given the opportunity to change their answers, a majority change their responses from incorrect to correct, and this often improves overall test scores. What prompts students to modify their answers? This study aims to examine answer modification on a scientific reasoning test, with…
Descriptors: Science Tests, Multiple Choice Tests, Test Items, Decision Making
Slepkov, A. D.; Van Bussel, M. L.; Fitze, K. M.; Burr, W. S. – SAGE Open, 2021
There is a broad literature on multiple-choice test development, both in terms of item-writing guidelines and psychometric functionality as a measurement tool. However, most of the published literature concerns multiple-choice testing in the context of expert-designed, high-stakes standardized assessments, with little attention being paid to the…
Descriptors: Foreign Countries, Undergraduate Students, Student Evaluation, Multiple Choice Tests
Oguguo, Basil; Lotobi, Regina Awele – European Journal of Educational Sciences, 2019
This study determined the psychometric properties of the examination items in the 2011 Basic Education Certificate Examination for Basic Science. The study adopted a survey research design. The instrument for data collection was the 2011 Delta State Basic Education Certificate Examination (BECE) Basic Science multiple-choice test items. The IRT…
Descriptors: Foreign Countries, Science Tests, Item Response Theory, Psychometrics
Guven Demir, Elif; Öksuz, Yücel – Participatory Educational Research, 2022
This research investigated animation-based achievement tests with respect to item format, psychometric features, student performance, and gender. The study sample consisted of 52 fifth-grade students in Samsun, Turkey, in 2017-2018. The research measures were open-ended (OE), animation-based open-ended (AOE), multiple-choice (MC), and…
Descriptors: Animation, Achievement Tests, Test Items, Psychometrics
Rafi, Ibnu; Retnawati, Heri; Apino, Ezi; Hadiana, Deni; Lydiati, Ida; Rosyada, Munaya Nikma – Pedagogical Research, 2023
This study describes the characteristics of the test and its items used in the national-standardized school examination by applying classical test theory, focusing on item difficulty, item discrimination, test reliability, and distractor analysis. We analyzed response data from 191 12th graders at one of the public senior high schools in…
Descriptors: Foreign Countries, National Competency Tests, Standardized Tests, Mathematics Tests
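The classical test theory quantities named in this abstract (item difficulty, item discrimination, and test reliability) can be computed directly from a scored 0/1 response matrix. The sketch below uses a tiny made-up matrix and is illustrative only, not the authors' analysis.

```python
# Hedged sketch of classical-test-theory item statistics on a made-up
# 0/1 response matrix (rows = examinees, columns = items).
import numpy as np

X = np.array([[1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 0, 1, 0],
              [1, 1, 1, 1],
              [0, 1, 0, 0]], dtype=float)

p = X.mean(axis=0)                     # item difficulty: proportion answering correctly
total = X.sum(axis=1)                  # total score per examinee
# item discrimination: correlation of each item with the total score
disc = np.array([np.corrcoef(X[:, j], total)[0, 1] for j in range(X.shape[1])])
# KR-20 reliability coefficient for dichotomous items
k = X.shape[1]
kr20 = (k / (k - 1)) * (1 - (p * (1 - p)).sum() / total.var(ddof=1))

print("difficulty:", p)
print("discrimination:", disc.round(2))
print("KR-20:", round(kr20, 3))
```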
Guo, Hongwen; Zu, Jiyun; Kyllonen, Patrick – ETS Research Report Series, 2018
For a multiple-choice test under development or redesign, it is important to choose the optimal number of options per item so that the test possesses the desired psychometric properties. On the basis of available data for a multiple-choice assessment with 8 options, we evaluated the effects of changing the number of options on test properties…
Descriptors: Multiple Choice Tests, Test Items, Simulation, Test Construction
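One simple way to approximate the effect of reducing the number of options, under strong assumptions, is to drop the least-chosen distractor and reassign the examinees who picked it at random among the remaining options, then re-score. The sketch below is purely illustrative and is not the procedure used in the ETS report; the data and the reassignment rule are assumptions.

```python
# Hedged sketch: simulate collapsing an 8-option item to 7 options by removing
# the least-popular distractor and redistributing its choosers at random.
import numpy as np

rng = np.random.default_rng(0)
key = 2                                            # index of the keyed (correct) option
choices = rng.integers(0, 8, size=500)             # hypothetical option choices

counts = np.bincount(choices, minlength=8)
distractors = [o for o in range(8) if o != key]
drop = min(distractors, key=lambda o: counts[o])   # least-chosen distractor

remaining = [o for o in range(8) if o != drop]
new_choices = choices.copy()
mask = new_choices == drop
new_choices[mask] = rng.choice(remaining, size=mask.sum())

print("difficulty with 8 options:", np.mean(choices == key))
print("difficulty with 7 options:", np.mean(new_choices == key))
```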
Ezechukwu, Roseline Ifeoma; Chinecherem, Basil; Oguguo, E.; Ene, Catherine U.; Ugorji, Clifford O. – World Journal of Education, 2020
This study determined the psychometric properties of the Economics Achievement Test (EAT) using Item Response Theory (IRT). Two popular IRT models, namely the one-parameter logistic (1PL) and two-parameter logistic (2PL) models, were used. The researcher adopted an instrumentation research design. Four research questions and two hypotheses were…
Descriptors: Economics Education, Economics, Achievement Tests, Psychometrics
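The two IRT models named here differ only in whether an item-specific discrimination parameter is estimated. A minimal sketch of their item response functions, with invented parameter values, is shown below; it is not the EAT calibration itself.

```python
# Hedged sketch of the 1PL and 2PL item response functions.
import numpy as np

def p_1pl(theta, b):
    """One-parameter logistic (Rasch-type): P(correct | ability theta, difficulty b)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def p_2pl(theta, a, b):
    """Two-parameter logistic: adds an item discrimination parameter a."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)
print(p_1pl(theta, b=0.5).round(3))           # hypothetical difficulty b = 0.5
print(p_2pl(theta, a=1.4, b=0.5).round(3))    # hypothetical discrimination a = 1.4
```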
Atalmis, Erkan Hasan – International Journal of Assessment Tools in Education, 2018
Although multiple-choice items (MCIs) are widely used for classroom assessment, designing MCIs with a sufficient number of plausible distracters is very challenging for teachers. In this regard, previous empirical studies reveal that three-option MCIs offer various advantages over four-option MCIs due to less preparation and…
Descriptors: Multiple Choice Tests, Test Items, Difficulty Level, Test Reliability
Lahner, Felicitas-Maria; Lörwald, Andrea Carolin; Bauer, Daniel; Nouns, Zineb Miriam; Krebs, René; Guttormsen, Sissel; Fischer, Martin R.; Huwendiek, Sören – Advances in Health Sciences Education, 2018
Multiple true-false (MTF) items are a widely used supplement to the commonly used single-best answer (Type A) multiple choice format. However, an optimal scoring algorithm for MTF items has not yet been established, as existing studies yielded conflicting results. Therefore, this study analyzes two questions: What is the optimal scoring algorithm…
Descriptors: Scoring Formulas, Scoring Rubrics, Objective Tests, Multiple Choice Tests
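Two scoring rules that often appear in discussions of multiple true-false items are dichotomous scoring (credit only when every true/false judgment is correct) and partial-credit scoring (the proportion of correct judgments). The sketch below illustrates both on a single invented item; it is not necessarily one of the algorithms compared in the study.

```python
# Hedged sketch of two common MTF scoring rules on one hypothetical item.
key      = [True, False, False, True]    # hypothetical answer key (four true/false statements)
response = [True, False, True,  True]    # hypothetical examinee response

matches = [r == k for r, k in zip(response, key)]
dichotomous = 1.0 if all(matches) else 0.0     # all-or-nothing credit
partial     = sum(matches) / len(matches)      # proportion of correct judgments

print("dichotomous:", dichotomous)   # 0.0
print("partial credit:", partial)    # 0.75
```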
Relkin, Emily; de Ruiter, Laura; Bers, Marina Umaschi – Journal of Science Education and Technology, 2020
There is a need for developmentally appropriate Computational Thinking (CT) assessments that can be implemented in early childhood classrooms. We developed a new instrument called "TechCheck" for assessing CT skills in young children that does not require prior knowledge of computer programming. "TechCheck" is based on…
Descriptors: Developmentally Appropriate Practices, Computation, Thinking Skills, Early Childhood Education
Gierl, Mark J.; Bulut, Okan; Guo, Qi; Zhang, Xinxin – Review of Educational Research, 2017
Multiple-choice testing is considered one of the most effective and enduring forms of educational assessment in practice today. This study presents a comprehensive review of the literature on multiple-choice testing in education, focused specifically on the development, analysis, and use of the incorrect options, which are also…
Descriptors: Multiple Choice Tests, Difficulty Level, Accuracy, Error Patterns
Szulewski, Adam; Gegenfurtner, Andreas; Howes, Daniel W.; Sivilotti, Marco L. A.; van Merriënboer, Jeroen J. G. – Advances in Health Sciences Education, 2017
In general, researchers attempt to quantify cognitive load using physiologic and psychometric measures. Although the construct measured by both of these metrics is thought to represent overall cognitive load, there is a paucity of studies that compare these techniques to one another. The authors compared data obtained from one physiologic tool…
Descriptors: Physicians, Cognitive Processes, Difficulty Level, Physiology
