Publication Date
| Date Range | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 1 |
| Since 2022 (last 5 years) | 9 |
| Since 2017 (last 10 years) | 13 |
| Since 2007 (last 20 years) | 19 |
Descriptor
| Descriptor | Count |
| --- | --- |
| Item Analysis | 26 |
| Mathematics Tests | 26 |
| Multiple Choice Tests | 26 |
| Test Items | 23 |
| Foreign Countries | 10 |
| Test Format | 10 |
| Test Construction | 8 |
| Comparative Analysis | 6 |
| Difficulty Level | 6 |
| Scores | 6 |
| Standardized Tests | 6 |
Publication Type
| Publication Type | Count |
| --- | --- |
| Reports - Research | 21 |
| Journal Articles | 16 |
| Speeches/Meeting Papers | 4 |
| Reports - Evaluative | 2 |
| Dissertations/Theses -… | 1 |
| Reports - Descriptive | 1 |
| Tests/Questionnaires | 1 |
Education Level
| Education Level | Count |
| --- | --- |
| Secondary Education | 9 |
| Elementary Education | 5 |
| High Schools | 4 |
| Higher Education | 4 |
| Elementary Secondary Education | 3 |
| Grade 12 | 3 |
| Grade 4 | 3 |
| Grade 8 | 3 |
| Postsecondary Education | 3 |
| Grade 10 | 2 |
| Intermediate Grades | 2 |
Audience
| Audience | Count |
| --- | --- |
| Practitioners | 1 |
| Researchers | 1 |
| Teachers | 1 |
Assessments and Surveys
| Assessment | Count |
| --- | --- |
| ACT Assessment | 2 |
| National Assessment of… | 2 |
| SAT (College Admission Test) | 2 |
| Praxis Series | 1 |
| Pre Professional Skills Tests | 1 |
| Program for International… | 1 |
Katrin Klingbeil; Fabian Rösken; Bärbel Barzel; Florian Schacht; Kaye Stacey; Vicki Steinle; Daniel Thurm – ZDM: Mathematics Education, 2024
Assessing students' (mis)conceptions is a challenging task for teachers as well as for researchers. While individual assessment, for example through interviews, can provide deep insights into students' thinking, this is very time-consuming and therefore not feasible for whole classes or even larger settings. For those settings, automatically…
Descriptors: Multiple Choice Tests, Formative Evaluation, Mathematics Tests, Misconceptions
Alicia A. Stoltenberg – ProQuest LLC, 2024
Multiple-select multiple-choice items, or multiple-choice items with more than one correct answer, are used to quickly assess content on standardized assessments. Because there are multiple keys to these item types, there are also multiple ways to score student responses to these items. The purpose of this study was to investigate how changing the…
Descriptors: Scoring, Evaluation Methods, Multiple Choice Tests, Standardized Tests
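As a hypothetical illustration of the scoring question this abstract raises (not the study's actual procedure), two common rules for scoring a multiple-select item can be sketched as:

```python
# Sketch only: two common ways to score a multiple-select item whose
# key is a set of correct options. The item content here is invented.

def score_all_or_nothing(key: set, response: set) -> float:
    """Full credit only when the response matches the key exactly."""
    return 1.0 if response == key else 0.0

def score_partial_credit(key: set, response: set, options: set) -> float:
    """Fraction of options classified correctly (selected iff keyed)."""
    correct = sum(1 for opt in options if (opt in key) == (opt in response))
    return correct / len(options)

key = {"A", "C"}
options = {"A", "B", "C", "D"}
print(score_all_or_nothing(key, {"A", "C"}))      # 1.0
print(score_all_or_nothing(key, {"A"}))           # 0.0
print(score_partial_credit(key, {"A"}, options))  # 0.75: A, B, D classified right
```

All-or-nothing scoring treats any omission or extra selection as a zero, while partial credit rewards each correctly classified option; which rule is used can change both item statistics and examinee rankings.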
Laeli, Cos Ma'arif Hidayatul; Gunarhadi; Muzzazinah – Pegem Journal of Education and Instruction, 2023
The goal of this study was to determine how the three-tier multiple-choice diagnostic test developed primary students' scientific understanding. This research and development study included fundamental studies, model development, and model testing. A total of 161 fourth-graders served as research subjects. Tests, surveys, and observations are all used to collect…
Descriptors: Multiple Choice Tests, Elementary School Students, Science Instruction, Misconceptions
Congning Ni; Bhashithe Abeysinghe; Juanita Hicks – International Electronic Journal of Elementary Education, 2025
The National Assessment of Educational Progress (NAEP), often referred to as The Nation's Report Card, offers a window into the state of the U.S. K-12 education system. Since 2017, NAEP has transitioned to digital assessments, opening new research opportunities that were previously impossible. Process data tracks students' interactions with the…
Descriptors: Reaction Time, Multiple Choice Tests, Behavior Change, National Competency Tests
Musa Adekunle Ayanwale – Discover Education, 2023
Examination scores obtained by students from the West African Examinations Council (WAEC), and National Business and Technical Examinations Board (NABTEB) may not be directly comparable due to differences in examination administration, item characteristics of the subject in question, and student abilities. For more accurate comparisons, scores…
Descriptors: Equated Scores, Mathematics Tests, Test Items, Test Format
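For illustration only, a minimal mean-sigma linear equating of two hypothetical score distributions (toy numbers, not WAEC or NABTEB data) looks like:

```python
# Sketch of mean-sigma linear equating between two test forms,
# assuming equivalent groups; all scores below are invented.
import statistics

def linear_equate(x, scores_x, scores_y):
    """Map a score x on form X to the form-Y scale by matching the mean
    and standard deviation of the two observed score distributions."""
    mx, my = statistics.mean(scores_x), statistics.mean(scores_y)
    sx, sy = statistics.pstdev(scores_x), statistics.pstdev(scores_y)
    return my + (sy / sx) * (x - mx)

form_x = [40, 50, 60, 70, 80]  # hypothetical form-X scores
form_y = [45, 55, 65, 75, 85]  # hypothetical form-Y scores
print(linear_equate(60, form_x, form_y))  # 65.0: means differ by 5, sds equal
```

Linear equating is only one of several designs; equipercentile and IRT-based methods relax its assumption that the two score distributions differ only in location and scale.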
Shear, Benjamin R. – Journal of Educational Measurement, 2023
Large-scale standardized tests are regularly used to measure student achievement overall and for student subgroups. These uses assume tests provide comparable measures of outcomes across student subgroups, but prior research suggests score comparisons across gender groups may be complicated by the type of test items used. This paper presents…
Descriptors: Gender Bias, Item Analysis, Test Items, Achievement Tests
Paaßen, Benjamin; Dywel, Malwina; Fleckenstein, Melanie; Pinkwart, Niels – International Educational Data Mining Society, 2022
Item response theory (IRT) is a popular method to infer student abilities and item difficulties from observed test responses. However, IRT struggles with two challenges: How to map items to skills if multiple skills are present? And how to infer the ability of new students that have not been part of the training data? Inspired by recent advances…
Descriptors: Item Response Theory, Test Items, Item Analysis, Inferences
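The Rasch flavor of IRT this abstract builds on can be sketched with a toy ability estimate (grid-search maximum likelihood standing in for the Newton-Raphson or EM routines operational IRT software uses; the item difficulties below are made up):

```python
# Sketch of the Rasch (1PL) model: the probability of a correct response
# depends only on ability theta minus item difficulty b.
import math

def p_correct(theta: float, b: float) -> float:
    """Rasch probability of answering an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(responses, difficulties):
    """Crude maximum-likelihood ability estimate by grid search
    (illustration only, not an operational estimation routine)."""
    def loglik(theta):
        total = 0.0
        for r, b in zip(responses, difficulties):
            p = p_correct(theta, b)
            total += math.log(p if r else 1.0 - p)
        return total
    grid = [x / 100 for x in range(-400, 401)]
    return max(grid, key=loglik)

# A student who gets the two easy items right and the two hard items
# wrong lands, by symmetry, exactly between the difficulties:
print(estimate_ability([1, 1, 0, 0], [-1.0, -0.5, 0.5, 1.0]))  # 0.0
```

The two challenges the abstract names, mapping items to skills and scoring unseen students, are exactly the quantities this sketch hard-codes: the difficulty vector and the per-student likelihood maximization.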
Ferretti, Federica; Bolondi, Giorgio – International Journal of Mathematical Education in Science and Technology, 2021
In the frame of national Italian standardized assessments in mathematics, a didactic phenomenon has been observed concerning students' behaviour when answering an item: students are inclined not to accept an output that is not clearly recognizable as something distinct from the starting inputs. The observed effect seems to originate…
Descriptors: Standardized Tests, Mathematics Tests, Teacher Student Relationship, Foreign Countries
ALKursheh, Taha Okleh; Al-zboon, Habis Saad; AlNasraween, Mo'en Salman – International Journal of Instruction, 2022
This study aimed at comparing the effect of two test item formats (multiple-choice and completion) on estimating person ability, item parameters, and the test information function (TIF). To achieve the aim of the study, two formats of the Mathematics (1) test were created, multiple-choice and completion; in its final form, the test consisted of 31 items. The…
Descriptors: Comparative Analysis, Test Items, Item Response Theory, Test Format
Rafi, Ibnu; Retnawati, Heri; Apino, Ezi; Hadiana, Deni; Lydiati, Ida; Rosyada, Munaya Nikma – Pedagogical Research, 2023
This study describes the characteristics of the test and its items used in the national-standardized school examination by applying classical test theory and focusing on the item difficulty, item discrimination, test reliability, and distractor analysis. We analyzed response data of 191 12th graders from one of public senior high schools in…
Descriptors: Foreign Countries, National Competency Tests, Standardized Tests, Mathematics Tests
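Two of the classical-test-theory quantities this abstract names, item difficulty and KR-20 reliability, can be computed from a small made-up response matrix (not the study's 191-examinee data):

```python
# Sketch of classical item analysis on invented dichotomous data:
# item difficulty (proportion correct) and KR-20 reliability.
import statistics

data = [            # rows = examinees, columns = items (1 = correct)
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]

n_items = len(data[0])
totals = [sum(row) for row in data]

# Item difficulty: proportion of examinees answering each item correctly.
difficulty = [sum(row[j] for row in data) / len(data) for j in range(n_items)]

# KR-20 reliability for dichotomous items.
var_total = statistics.pvariance(totals)
sum_pq = sum(p * (1 - p) for p in difficulty)
kr20 = (n_items / (n_items - 1)) * (1 - sum_pq / var_total)

print([round(p, 2) for p in difficulty])  # [0.67, 0.67, 0.5, 0.33]
print(round(kr20, 2))                     # 0.66
```

Item discrimination and distractor analysis, the other statistics the study reports, extend the same matrix: discrimination correlates each item column with the rest of the total score, and distractor analysis tabulates how often each wrong option was chosen by high- and low-scoring examinees.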
Carli, Marta; Lippiello, Stefania; Pantano, Ornella; Perona, Mario; Tormen, Giuseppe – Physical Review Physics Education Research, 2020
In this article, we discuss the development and the administration of a multiple-choice test, which we named "Test of Calculus and Vectors in Mathematics and Physics" (TCV-MP), aimed at comparing students' ability to answer questions on derivatives, integrals, and vectors in a purely mathematical context and in the context of physics.…
Descriptors: Mathematics Tests, Science Tests, Multiple Choice Tests, Calculus
Büyükturan, Esin Bagcan; Sireci, Ayse – Journal of Education and Training Studies, 2018
Item discrimination index, which indicates the ability of the item to distinguish whether or not the individuals have acquired the qualities that are evaluated, is basically a validity measure and it is estimated by examining the fit between item score and the test score. Based on the definition of item discrimination index, classroom observation…
Descriptors: Foreign Countries, Classroom Observation Techniques, Scores, Test Items
National Assessment Governing Board, 2017
The National Assessment of Educational Progress (NAEP) is the only continuing and nationally representative measure of trends in academic achievement of U.S. elementary and secondary school students in various subjects. For more than four decades, NAEP assessments have been conducted periodically in reading, mathematics, science, writing, U.S.…
Descriptors: Mathematics Achievement, Multiple Choice Tests, National Competency Tests, Educational Trends
Taylor, Catherine S.; Lee, Yoonsun – Applied Measurement in Education, 2012
This was a study of differential item functioning (DIF) for grades 4, 7, and 10 reading and mathematics items from state criterion-referenced tests. The tests were composed of multiple-choice and constructed-response items. Gender DIF was investigated using POLYSIBTEST and a Rasch procedure. The Rasch procedure flagged more items for DIF than did…
Descriptors: Test Bias, Gender Differences, Reading Tests, Mathematics Tests
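A common screening statistic for DIF, the Mantel-Haenszel common odds ratio (a simpler alternative to the POLYSIBTEST and Rasch procedures the study actually compares), can be sketched with invented counts:

```python
# Sketch of Mantel-Haenszel DIF screening: examinees are stratified by
# total score, and the common odds ratio compares the reference and
# focal groups' odds of success on the studied item. Counts are toy data.

# Each stratum: (ref_correct, ref_wrong, focal_correct, focal_wrong)
strata = [
    (30, 10, 25, 15),
    (40, 5, 35, 10),
    (20, 20, 15, 25),
]

num = sum(rc * fw / (rc + rw + fc + fw) for rc, rw, fc, fw in strata)
den = sum(rw * fc / (rc + rw + fc + fw) for rc, rw, fc, fw in strata)
alpha_mh = num / den  # MH common odds ratio; 1.0 means no DIF
print(round(alpha_mh, 2))  # ≈ 1.85: the item favors the reference group
```

Stratifying by total score is what separates DIF (a group difference at equal ability) from mere impact (a group difference in the ability distribution itself).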
Moses, Tim; Liu, Jinghua; Tan, Adele; Deng, Weiling; Dorans, Neil J. – ETS Research Report Series, 2013
In this study, differential item functioning (DIF) methods utilizing 14 different matching variables were applied to assess DIF in the constructed-response (CR) items from 6 forms of 3 mixed-format tests. Results suggested that the methods might produce distinct patterns of DIF results for different tests and testing programs, in that the DIF…
Descriptors: Test Construction, Multiple Choice Tests, Test Items, Item Analysis