Publication Date
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 1 |
| Since 2007 (last 20 years) | 2 |
Descriptor
| Difficulty Level | 8 |
| Educational Testing | 8 |
| Multiple Choice Tests | 8 |
| Test Items | 5 |
| Educational Assessment | 2 |
| Goodness of Fit | 2 |
| Guessing (Tests) | 2 |
| Higher Education | 2 |
| Item Response Theory | 2 |
| Statistical Analysis | 2 |
| Test Construction | 2 |
Source
| Cogent Education | 1 |
| Contemporary Educational Psychology | 1 |
| Journal of Experimental Education | 1 |
| Practical Assessment, Research & Evaluation | 1 |
| Psychometrika | 1 |
Author
| Arhin, Ato Kwamina | 1 |
| Gilmer, Jerry S. | 1 |
| Han, Kyung T. | 1 |
| Kelderman, Henk | 1 |
| Kumar, V. K. | 1 |
| Nickerson, Raymond S. | 1 |
| Quaigrain, Kennedy | 1 |
| Roberts, Sarah Jane | 1 |
| Weiten, Wayne | 1 |
| Westers, Paul | 1 |
Publication Type
| Reports - Research | 6 |
| Journal Articles | 5 |
| Guides - Non-Classroom | 1 |
| Opinion Papers | 1 |
| Reports - Evaluative | 1 |
| Speeches/Meeting Papers | 1 |
Education Level
| Higher Education | 1 |
Location
| Ghana | 1 |
Laws, Policies, & Programs
| Elementary and Secondary Education Act | 1 |
Assessments and Surveys
| National Assessment of Educational Progress | 1 |
Quaigrain, Kennedy; Arhin, Ato Kwamina – Cogent Education, 2017
Item analysis is essential for improving items that will be reused in later tests; it can also be used to eliminate misleading items from a test. The study focused on item and test quality and explored the relationship of the difficulty index (p-value) and the discrimination index (DI) with distractor efficiency (DE). The study was conducted among…
Descriptors: Item Analysis, Teacher Developed Materials, Test Reliability, Educational Assessment
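As background for the indices this abstract names (a generic sketch, not the authors' procedure): the classical difficulty index p is the proportion of examinees answering an item correctly, and a common discrimination index is the difference in p between upper and lower scoring groups. A minimal illustration in Python, assuming a 0/1-scored response matrix and a 27% upper/lower split; the function name and the split fraction are hypothetical choices, not taken from the study:

```python
import numpy as np

def item_analysis(responses: np.ndarray, tail: float = 0.27):
    """Classical item statistics for a 0/1-scored response matrix.

    responses: shape (n_examinees, n_items); 1 = correct, 0 = incorrect.
    Returns (p, di): the difficulty index and the upper-lower
    discrimination index for each item.
    """
    totals = responses.sum(axis=1)                # total score per examinee
    order = np.argsort(totals)                    # low scorers first
    k = max(1, int(tail * len(totals)))           # size of each tail group
    lower, upper = responses[order[:k]], responses[order[-k:]]

    p = responses.mean(axis=0)                    # proportion correct per item
    di = upper.mean(axis=0) - lower.mean(axis=0)  # upper minus lower group
    return p, di

# Toy example: 6 examinees, 3 items
data = np.array([[1, 0, 1],
                 [1, 1, 1],
                 [0, 0, 1],
                 [1, 1, 0],
                 [0, 0, 1],
                 [1, 1, 1]])
p, di = item_analysis(data)
print(p, di)  # p is roughly [0.67 0.50 0.83]
```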
Han, Kyung T. – Practical Assessment, Research & Evaluation, 2012
For several decades, the "three-parameter logistic model" (3PLM) has been the dominant choice among practitioners in educational measurement for modeling examinees' response data from multiple-choice (MC) items. Past studies, however, have pointed out that the c-parameter of the 3PLM should not be interpreted as a guessing…
Descriptors: Statistical Analysis, Models, Multiple Choice Tests, Guessing (Tests)
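As standard background on the model under discussion (this equation is textbook item response theory, not quoted from the article): the 3PLM gives the probability of a correct response to item $i$ as a function of examinee ability $\theta$, with discrimination $a_i$, difficulty $b_i$, and lower asymptote $c_i$, the parameter often labeled "guessing" (the interpretation the abstract questions):

$$P_i(\theta) = c_i + (1 - c_i)\,\frac{1}{1 + e^{-a_i(\theta - b_i)}}$$

Because $c_i$ is estimated as a lower asymptote rather than derived from the number of answer options, it need not equal the chance-level probability of a correct guess.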
Weiten, Wayne – Journal of Experimental Education, 1982 (peer reviewed)
A comparison of double versus single multiple-choice questions yielded significant differences in item difficulty, item discrimination, and internal reliability, but not in concurrent validity. (Author/PN)
Descriptors: Difficulty Level, Educational Testing, Higher Education, Multiple Choice Tests
Nickerson, Raymond S. – 1986
A number of higher order cognitive skills are used in the task of evaluating arguments. Such skills should be assessed because the ability to evaluate arguments is an important one in all subject areas. In addition, it seems reasonable to assume that these evaluative skills will be representative of those required by other cognitively demanding…
Descriptors: Cognitive Tests, Critical Thinking, Difficulty Level, Discourse Analysis
Roberts, Sarah Jane – 1978
This guide explains the concept of out-of-level testing and suggests a formula for estimating the occurrence of floor and ceiling effects, within the context of models for evaluating Elementary and Secondary Education Act (ESEA) Title I programs. An analogy explains floor and ceiling effects as if test items are stored in different levels in a…
Descriptors: Achievement Tests, Difficulty Level, Educational Testing, Elementary Education
Kumar, V. K.; And Others – Contemporary Educational Psychology, 1979 (peer reviewed)
Ninth-graders read a passage for a test to be taken the next day, anticipating a recall test, a multiple-choice test, and a retention test. Half received either a recall or a recognition test regardless of prior instructions. Subjects did better on the recognition tests in all conditions. (Author/RD)
Descriptors: Difficulty Level, Educational Testing, Expectation, Junior High Schools
Westers, Paul; Kelderman, Henk – Psychometrika, 1992 (peer reviewed)
A method for analyzing test-item responses is proposed to examine differential item functioning (DIF) in multiple-choice items within the latent class framework. Different models for detection of DIF are formulated, defining the subgroup as a latent variable. An efficient estimation method is described and illustrated. (SLD)
Descriptors: Chi Square, Difficulty Level, Educational Testing, Equations (Mathematics)
Gilmer, Jerry S. – 1979
Sixty college students from classes in educational measurement were divided into two groups. Each group was administered the same criterion test, except that one group received feedback after every item and the other received no feedback. The students were also divided into three ability levels. Each test item was classified in two ways: by item…
Descriptors: Academic Ability, Answer Keys, Answer Sheets, College Students

