Publication Date
| Period | Records |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 4 |
| Since 2007 (last 20 years) | 8 |
Descriptor
| Descriptor | Records |
| --- | --- |
| Comparative Analysis | 8 |
| Computation | 8 |
| Item Response Theory | 5 |
| Foreign Countries | 4 |
| Test Items | 4 |
| Identification | 3 |
| Test Bias | 3 |
| Achievement Tests | 2 |
| Educational Assessment | 2 |
| Evaluation Methods | 2 |
| Goodness of Fit | 2 |
Source
| Source | Records |
| --- | --- |
| International Journal of Testing | 8 |
Author
| Author | Records |
| --- | --- |
| Arce-Ferrer, Alvaro J. | 1 |
| Béland, Sébastien | 1 |
| Bulut, Okan | 1 |
| Cui, Ying | 1 |
| Gérard, Paul | 1 |
| Guo, Hongwen | 1 |
| Liu, Ou Lydia | 1 |
| Maeda, Hotaka | 1 |
| Magis, David | 1 |
| Mao, Liyang | 1 |
| Mapuranga, Raymond | 1 |
Publication Type
| Publication Type | Records |
| --- | --- |
| Journal Articles | 8 |
| Reports - Research | 6 |
| Reports - Descriptive | 1 |
| Reports - Evaluative | 1 |
Education Level
| Education Level | Records |
| --- | --- |
| Grade 4 | 1 |
| Grade 9 | 1 |
| High Schools | 1 |
| Higher Education | 1 |
| Junior High Schools | 1 |
| Middle Schools | 1 |
| Secondary Education | 1 |
Assessments and Surveys
| Assessment | Records |
| --- | --- |
| Program for International Student Assessment | 1 |
| Progress in International Reading Literacy Study | 1 |
| Trends in International Mathematics and Science Study | 1 |
Rios, Joseph A.; Guo, Hongwen; Mao, Liyang; Liu, Ou Lydia – International Journal of Testing, 2017
When examinees' test-taking motivation is questionable, practitioners must determine whether careless responding is of practical concern and if so, decide on the best approach to filter such responses. As there has been insufficient research on these topics, the objectives of this study were to: a) evaluate the degree of underestimation in the…
Descriptors: Response Style (Tests), Scores, Motivation, Computation
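The filtering approaches this truncated abstract refers to are not spelled out here; as a hedged illustration of the general idea, the sketch below implements simple response-time-based motivation filtering. The fixed 3-second rapid-guessing threshold and the rescoring rule are assumptions for the example, not the authors' procedure.

```python
import numpy as np

def filter_careless(scores, response_times, threshold=3.0):
    """Flag responses faster than `threshold` seconds as likely rapid
    guesses and rescore with those responses treated as missing.

    scores: (n_examinees, n_items) 0/1 matrix of item scores
    response_times: same shape, seconds spent per item
    threshold: rapid-guessing cutoff; a fixed value is an assumption
               (thresholds are often derived per item in practice)
    """
    effortful = response_times >= threshold
    raw = scores.mean(axis=1)                        # unfiltered proportion correct
    filtered = np.where(effortful, scores, np.nan)   # drop rapid guesses
    motivated = np.nanmean(filtered, axis=1)         # rescored proportion correct
    return raw, motivated, 1 - effortful.mean(axis=1)

# Comparing raw vs. motivated group means gives a rough sense of how
# much careless responding depresses (underestimates) observed scores.
rng = np.random.default_rng(0)
scores = rng.integers(0, 2, size=(100, 40))
rts = rng.gamma(shape=2.0, scale=4.0, size=(100, 40))
raw, motivated, rg_rate = filter_careless(scores, rts)
print(raw.mean(), np.nanmean(motivated), rg_rate.mean())
```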
Arce-Ferrer, Alvaro J.; Bulut, Okan – International Journal of Testing, 2017
This study examines separate and concurrent approaches to combining the detection of item parameter drift (IPD) with the estimation of scale transformation coefficients in the context of the common-item nonequivalent groups design with three-parameter item response theory equating. The study uses real and synthetic data sets to compare the two…
Descriptors: Item Response Theory, Equated Scores, Identification, Computation
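As background for the scale-linking step this abstract refers to, the sketch below shows the mean-sigma method, one standard way to estimate the scale transformation coefficients from common-item difficulty estimates. Assuming it matches the study's procedure would go beyond the abstract; the IPD screening step is only indicated in a comment.

```python
import numpy as np

def mean_sigma(b_old, b_new):
    """Mean-sigma linking: find A, B with theta_old = A * theta_new + B,
    using the common items' difficulty estimates on each form.
    Items flagged for parameter drift (IPD) would typically be removed
    from b_old/b_new before (or iteratively during) this step.
    """
    A = np.std(b_old, ddof=1) / np.std(b_new, ddof=1)
    B = np.mean(b_old) - A * np.mean(b_new)
    return A, B

def transform_3pl(a, b, c, A, B):
    """Place new-form 3PL item parameters on the old scale:
    discrimination rescales by 1/A, difficulty by A*b + B, and the
    guessing parameter c is invariant under linear scale changes."""
    return a / A, A * b + B, c
```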
Maeda, Hotaka; Zhang, Bo – International Journal of Testing, 2017
The omega (ω) statistic is reputed to be one of the best indices for detecting answer copying on multiple-choice tests, but its performance relies on the accurate estimation of copier ability, which is challenging because responses from the copiers may have been contaminated. We propose an algorithm that aims to identify and delete the suspected…
Descriptors: Cheating, Test Items, Mathematics, Statistics
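For context, the omega statistic (Wollack, 1997) compares the observed number of answer matches between a suspected copier and a source to the number expected under an IRT model at the copier's estimated ability. A minimal sketch, assuming the per-item probabilities that the copier would select the source's chosen options have already been computed from a fitted model; the purification algorithm the abstract proposes is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def omega(copier_resp, source_resp, p_match):
    """Answer-copying omega statistic.

    copier_resp, source_resp: selected option per item for each examinee
    p_match: P(copier selects the source's option | copier ability),
             per item, from a fitted IRT model (assumed given here)
    Returns omega (approximately standard normal under no copying)
    and its upper-tail p-value.
    """
    h = np.sum(copier_resp == source_resp)          # observed matches
    expected = p_match.sum()                        # model-implied matches
    sd = np.sqrt(np.sum(p_match * (1 - p_match)))
    w = (h - expected) / sd
    return w, norm.sf(w)
```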
Sen, Sedat – International Journal of Testing, 2018
Recent research has shown that over-extraction of latent classes can be observed in the Bayesian estimation of the mixed Rasch model when the distribution of ability is non-normal. This study examined the effect of non-normal ability distributions on the number of latent classes in the mixed Rasch model when estimated with maximum likelihood…
Descriptors: Item Response Theory, Comparative Analysis, Computation, Maximum Likelihood Statistics
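For reference, the mixture Rasch model at issue lets item difficulty vary by latent class, with class weights mixing the likelihood; when the ability distribution is non-normal, spurious extra classes can absorb the non-normality, which is the over-extraction the study examines.

```latex
% Mixture Rasch model: person i in latent class g answers item j
% correctly with probability
P(X_{ij} = 1 \mid \theta_i, g) =
  \frac{\exp(\theta_i - b_{jg})}{1 + \exp(\theta_i - b_{jg})},
\qquad \sum_{g=1}^{G} \pi_g = 1,
% where b_{jg} is the class-specific difficulty and \pi_g the
% proportion of examinees in class g.
```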
Oliveri, Maria Elena; von Davier, Matthias – International Journal of Testing, 2014
In this article, we investigate the creation of comparable score scales across countries in international assessments. We examine potential improvements to current score scale calibration procedures used in international large-scale assessments. Our approach seeks to improve fairness in scoring international large-scale assessments, which often…
Descriptors: Test Bias, Scores, International Programs, Educational Assessment
Cui, Ying; Mousavi, Amin – International Journal of Testing, 2015
The current study applied the person-fit statistic, l_z, to data from a Canadian provincial achievement test to explore the usefulness of conducting person-fit analysis on large-scale assessments. Item parameter estimates were compared before and after the misfitting student responses, as identified by l_z, were removed. The…
Descriptors: Measurement, Achievement Tests, Comparative Analysis, Test Items
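The l_z here is the standardized log-likelihood person-fit statistic of Drasgow, Levine, and Williams (1985). A minimal sketch, assuming the model-implied probabilities of a correct response at the examinee's ability estimate are already available:

```python
import numpy as np

def lz(x, p):
    """Standardized log-likelihood person-fit statistic l_z.

    x: 0/1 response vector for one examinee
    p: model-implied P(correct) per item at the examinee's ability
       estimate (assumed already computed from the fitted model)
    Large negative values flag misfitting response patterns.
    """
    l0 = np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))
    e = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
    v = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
    return (l0 - e) / np.sqrt(v)
```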
Magis, David; Raîche, Gilles; Béland, Sébastien; Gérard, Paul – International Journal of Testing, 2011
We present an extension of the logistic regression procedure to identify dichotomous differential item functioning (DIF) in the presence of more than two groups of respondents. Starting from the usual framework of a single focal group, we propose a general approach to estimate the item response functions in each group and to test for the presence…
Descriptors: Language Skills, Identification, Foreign Countries, Evaluation Methods
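A hedged sketch of the general idea: logistic regression DIF with several focal groups, tested by a likelihood-ratio test of a model with group main effects and group-by-score interactions against a score-only null. The column names and simulated data are illustrative, and this is not necessarily the authors' exact parameterization.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

def lr_dif_test(df):
    """LR test for DIF on one item across multiple groups.
    df columns: 'y' (0/1 item response), 'total' (matching score),
    'group' (reference plus one or more focal groups)."""
    null = smf.logit("y ~ total", data=df).fit(disp=0)
    full = smf.logit("y ~ total * C(group)", data=df).fit(disp=0)
    lr = 2 * (full.llf - null.llf)
    dof = full.df_model - null.df_model   # 2 * (number of focal groups)
    return lr, chi2.sf(lr, dof)

# Illustrative usage with simulated data (focal2 shows uniform DIF).
rng = np.random.default_rng(1)
n = 900
df = pd.DataFrame({"total": rng.normal(size=n),
                   "group": rng.choice(["ref", "focal1", "focal2"], size=n)})
logit_p = df["total"] + np.where(df["group"] == "focal2", 0.8, 0.0)
df["y"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
print(lr_dif_test(df))
```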
Wyse, Adam E.; Mapuranga, Raymond – International Journal of Testing, 2009
Differential item functioning (DIF) analysis is a statistical technique used for ensuring the equity and fairness of educational assessments. This study formulates a new DIF analysis method using the information similarity index (ISI). ISI compares item information functions when data fits the Rasch model. Through simulations and an international…
Descriptors: Test Bias, Evaluation Methods, Test Items, Educational Assessment
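The exact form of the information similarity index is not given in this truncated abstract and is not reproduced here. As a hedged illustration, the sketch below computes Rasch item information, I(θ) = P(θ)(1 − P(θ)), for one item under reference- and focal-group difficulty estimates, and uses a simple curve-overlap ratio as an illustrative stand-in for a similarity index.

```python
import numpy as np

def rasch_info(theta, b):
    """Item information for a Rasch item: I(theta) = P * (1 - P)."""
    p = 1 / (1 + np.exp(-(theta - b)))
    return p * (1 - p)

# Compare one item's information curves under the two groups'
# difficulty estimates; divergent curves suggest DIF. The overlap
# ratio is an illustrative stand-in, not the paper's ISI formula.
theta = np.linspace(-4, 4, 161)
info_ref = rasch_info(theta, b=0.0)    # reference-group estimate (assumed)
info_foc = rasch_info(theta, b=0.6)    # focal-group estimate (assumed)
overlap = (np.trapz(np.minimum(info_ref, info_foc), theta)
           / np.trapz(np.maximum(info_ref, info_foc), theta))
print(f"information overlap: {overlap:.3f}")  # 1.0 = identical curves
```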
