| Publication Date | Results |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 1 |
| Since 2017 (last 10 years) | 4 |
| Since 2007 (last 20 years) | 5 |
| Descriptor | Results |
| --- | --- |
| Difficulty Level | 6 |
| Item Response Theory | 6 |
| Multiple Choice Tests | 6 |
| National Competency Tests | 6 |
| Test Items | 6 |
| Foreign Countries | 5 |
| Accuracy | 2 |
| Reading Tests | 2 |
| Standardized Tests | 2 |
| Statistical Analysis | 2 |
| Cognitive Processes | 1 |
| Source | Results |
| --- | --- |
| International Journal of… | 2 |
| Educational and Psychological… | 1 |
| National Center for Education… | 1 |
| Pedagogical Research | 1 |
| School Psychology | 1 |
| Author | Results |
| --- | --- |
| Andrich, David | 1 |
| Apino, Ezi | 1 |
| Ehrich, John | 1 |
| Garavaglia, Diane R. | 1 |
| Hadiana, Deni | 1 |
| Howard, Steven J. | 1 |
| Humphry, Stephen Mark | 1 |
| Kalkan, Ömür Kaya | 1 |
| Kara, Yusuf | 1 |
| Kelecioglu, Hülya | 1 |
| Lydiati, Ida | 1 |
| Publication Type | Results |
| --- | --- |
| Journal Articles | 5 |
| Reports - Research | 5 |
| Information Analyses | 1 |
| Reports - Evaluative | 1 |
| Education Level | Results |
| --- | --- |
| Early Childhood Education | 1 |
| Elementary Education | 1 |
| Elementary Secondary Education | 1 |
| Grade 12 | 1 |
| Grade 3 | 1 |
| High Schools | 1 |
| Primary Education | 1 |
| Secondary Education | 1 |
| Assessments and Surveys | Results |
| --- | --- |
| National Assessment Program… | 1 |
| National Assessment of… | 1 |
Rafi, Ibnu; Retnawati, Heri; Apino, Ezi; Hadiana, Deni; Lydiati, Ida; Rosyada, Munaya Nikma – Pedagogical Research, 2023
This study describes the characteristics of the test and its items used in the national standardized school examination by applying classical test theory, focusing on item difficulty, item discrimination, test reliability, and distractor analysis. We analyzed the response data of 191 12th graders from one of the public senior high schools in…
Descriptors: Foreign Countries, National Competency Tests, Standardized Tests, Mathematics Tests
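The classical test theory indices named in this abstract (item difficulty, item discrimination, test reliability) can be computed directly from a scored 0/1 response matrix. The sketch below uses hypothetical data and standard formulas (proportion correct, corrected item-total correlation, Cronbach's alpha); it illustrates those indices only and is not the authors' analysis. Distractor analysis would additionally require the raw option choices.

```python
# Minimal CTT sketch on hypothetical 0/1 data: item difficulty (proportion
# correct), corrected item-total discrimination, and Cronbach's alpha.
import numpy as np

rng = np.random.default_rng(0)
ability = rng.normal(0, 1, 191)                 # hypothetical examinee abilities
item_locations = rng.uniform(-1, 1, 40)         # hypothetical item locations
p = 1 / (1 + np.exp(-(ability[:, None] - item_locations[None, :])))
scored = (rng.random((191, 40)) < p).astype(int)  # examinees x items, 0/1 scored

total = scored.sum(axis=1)

# Item difficulty: proportion of examinees answering each item correctly.
difficulty = scored.mean(axis=0)

# Item discrimination: correlation of each item with the rest-of-test score.
discrimination = np.array([
    np.corrcoef(scored[:, j], total - scored[:, j])[0, 1]
    for j in range(scored.shape[1])
])

# Reliability: Cronbach's alpha = k/(k-1) * (1 - sum of item variances / total variance).
k = scored.shape[1]
alpha = k / (k - 1) * (1 - scored.var(axis=0, ddof=1).sum() / total.var(ddof=1))

print(difficulty.round(2), discrimination.round(2), round(alpha, 3))
```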
Woodcock, Stuart; Howard, Steven J.; Ehrich, John – School Psychology, 2020
Standardized testing is ubiquitous in educational assessment, but questions have been raised about the extent to which these test scores accurately reflect students' genuine knowledge and skills. To more rigorously investigate this issue, the current study employed a within-subject experimental design to examine item format effects on primary…
Descriptors: Elementary School Students, Grade 3, Test Items, Test Format
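The abstract describes a within-subject comparison of item formats. As a rough illustration of that kind of paired analysis (hypothetical scores and an assumed paired t-test, not the study's actual procedure), the sketch below tests whether the same students score differently under two formats.

```python
# Illustrative paired comparison for a within-subject item-format design:
# each student is scored on equivalent content in two formats, and the
# format effect is tested on the paired difference. Hypothetical data only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_students = 120
ability = rng.normal(0, 1, n_students)

# Hypothetical 0-10 scores for the same students under two item formats.
open_ended = np.clip(5.0 + 2 * ability + rng.normal(0, 1, n_students), 0, 10)
multiple_choice = np.clip(5.8 + 2 * ability + rng.normal(0, 1, n_students), 0, 10)

diff = multiple_choice - open_ended
t, p = stats.ttest_rel(multiple_choice, open_ended)
print(f"mean format effect = {diff.mean():.2f}, t = {t:.2f}, p = {p:.4f}")
```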
Yalçin, Seher – International Journal of Assessment Tools in Education, 2018
The purpose of this study is to determine the best-fitting IRT model [Rasch, 2PL, 3PL, 4PL, and mixed IRT (2PL and 3PL)] for the science and technology subtest of the Transition from Basic Education to Secondary Education (TEOG) exam, which is administered at the national level; the study also aims to estimate the item parameters under the best-fitting model. This study is a…
Descriptors: Item Response Theory, Models, Goodness of Fit, Multiple Choice Tests
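For orientation, the IRT models compared in this abstract nest inside the four-parameter logistic model; the notation below is the standard textbook form, not reproduced from the article.

```latex
% Four-parameter logistic (4PL) response function for item i at ability \theta:
\[
  P(X_i = 1 \mid \theta) \;=\; c_i + (d_i - c_i)\,
    \frac{\exp\!\left[a_i(\theta - b_i)\right]}{1 + \exp\!\left[a_i(\theta - b_i)\right]}
\]
% Special cases: the 3PL fixes the upper asymptote d_i = 1; the 2PL additionally
% fixes the lower asymptote (guessing) c_i = 0; the Rasch/1PL model further
% constrains all discriminations a_i to a common value.
```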
Kalkan, Ömür Kaya; Kara, Yusuf; Kelecioglu, Hülya – International Journal of Assessment Tools in Education, 2018
Missing data are a common problem in datasets obtained by administering educational and psychological tests. It is widely known that the presence of missing observations can lead to serious problems such as biased parameter estimates and inflated standard errors. Most missing data imputation methods focus on…
Descriptors: Item Response Theory, Statistical Analysis, Data, Test Items
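To see why missing responses matter for item statistics, the hypothetical sketch below shows that three common treatments of the same incomplete response matrix (available-case analysis, scoring missing as incorrect, listwise deletion) yield different item-difficulty estimates. This only illustrates the general problem the abstract raises; it is not the paper's imputation method.

```python
# Hypothetical illustration: the same incomplete 0/1 response matrix gives
# different item-difficulty estimates under three missing-data treatments.
import numpy as np

rng = np.random.default_rng(2)
resp = (rng.random((500, 20)) < 0.65).astype(float)   # complete 0/1 data (made up)

# Make skipping more likely for lower-scoring examinees, then blank out cells.
totals = resp.sum(axis=1)
miss_prob = np.clip(0.10 - 0.01 * (totals - totals.mean()), 0.02, 0.30)
resp[rng.random(resp.shape) < miss_prob[:, None]] = np.nan

item = 0
observed = resp[:, item]
p_available = np.nanmean(observed)                     # ignore missing cells
p_as_wrong = np.nan_to_num(observed, nan=0.0).mean()   # score missing as incorrect
complete_rows = ~np.isnan(resp).any(axis=1)
p_listwise = resp[complete_rows, item].mean()          # drop examinees with any gap

print(round(p_available, 3), round(p_as_wrong, 3), round(p_listwise, 3))
```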
Andrich, David; Marais, Ida; Humphry, Stephen Mark – Educational and Psychological Measurement, 2016
Recent research has shown how the statistical bias in Rasch model difficulty estimates induced by guessing in multiple-choice items can be eliminated. Using vertical scaling of a high-profile national reading test, it is shown that the dominant effect of removing such bias is a nonlinear change in the unit of scale across the continuum. The…
Descriptors: Guessing (Tests), Statistical Bias, Item Response Theory, Multiple Choice Tests
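A minimal numerical sketch (not taken from the article) of the bias this abstract refers to: if responses follow a guessing-augmented model but difficulties are read off a guessing-free Rasch item characteristic curve, hard items appear easier than they really are.

```python
# Illustrative only: invert the guessing-free Rasch item characteristic curve
# at a fixed ability, using success probabilities generated with and without
# a guessing floor, to show the direction of the difficulty bias.
import numpy as np

def rasch_p(beta, delta):
    """Rasch probability of a correct response: ability beta, difficulty delta."""
    return 1.0 / (1.0 + np.exp(-(beta - delta)))

def implied_difficulty(p, beta=0.0):
    """Difficulty obtained by solving the Rasch ICC for delta at ability beta."""
    return beta - np.log(p / (1.0 - p))

beta = 0.0          # examinee ability (hypothetical)
true_delta = 2.0    # a genuinely hard item
c = 0.20            # guessing floor for a five-option multiple-choice item

p_no_guess = rasch_p(beta, true_delta)
p_with_guess = c + (1 - c) * p_no_guess   # guessing inflates the success rate

print(implied_difficulty(p_no_guess))     # ~2.0: difficulty recovered
print(implied_difficulty(p_with_guess))   # ~0.87: the hard item looks much easier
```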
Pearson, P. David; Garavaglia, Diane R. – National Center for Education Statistics, 2003
The purpose of this essay is to explore both what is known and what needs to be learned about the information value of performance items "when they are used in large scale assessments." Within the context of the National Assessment of Educational Progress (NAEP), there is substantial motivation for answering these questions. Over the…
Descriptors: Measurement, National Competency Tests, Test Items, Performance

