Guo, Wenjing; Choi, Youn-Jeng – Educational and Psychological Measurement, 2023
Determining the number of dimensions is extremely important in applying item response theory (IRT) models to data. Traditional and revised parallel analyses have been proposed within the factor analysis framework, and both have shown some promise in assessing dimensionality. However, their performance in the IRT framework has not been…
Descriptors: Item Response Theory, Evaluation Methods, Factor Analysis, Guidelines
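The abstract's reference to parallel analysis can be made concrete with a short sketch. Below is a minimal Horn-style parallel analysis in Python, assuming continuous responses and Pearson correlations (dichotomous IRT data would typically call for tetrachoric or polychoric correlations instead); the function name and parameters are illustrative and not taken from the article.

```python
import numpy as np

def parallel_analysis(data, n_sims=100, percentile=95, seed=0):
    """Horn's parallel analysis: keep leading components whose observed
    eigenvalue exceeds the chosen percentile of eigenvalues obtained
    from random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    # Eigenvalues of the observed correlation matrix, descending.
    obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    # Reference eigenvalues from independent standard-normal data.
    sim_eig = np.empty((n_sims, p))
    for s in range(n_sims):
        random_data = rng.standard_normal((n, p))
        sim_eig[s] = np.linalg.eigvalsh(
            np.corrcoef(random_data, rowvar=False))[::-1]
    threshold = np.percentile(sim_eig, percentile, axis=0)
    # Count leading eigenvalues until the first one that fails the test.
    keep = obs_eig > threshold
    return int(np.argmin(keep)) if not keep.all() else p

# Example: pure-noise data, so the retained count should be near zero.
X = np.random.default_rng(1).standard_normal((500, 20))
print(parallel_analysis(X))
```

Revised variants of parallel analysis discussed in the literature differ mainly in how the reference eigenvalues are generated (for example, by resampling the observed data rather than simulating independent normals).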
Wang, Shaojie; Zhang, Minqiang; Lee, Won-Chan; Huang, Feifei; Li, Zonglong; Li, Yixing; Yu, Sufang – Journal of Educational Measurement, 2022
Traditional IRT characteristic curve linking methods ignore parameter estimation errors, which may undermine the accuracy of estimated linking constants. Two new linking methods are proposed that take parameter estimation errors into account. The item-information-weighted (IWCC) and test-information-weighted (TWCC) characteristic curve methods employ weighting…
Descriptors: Item Response Theory, Error of Measurement, Accuracy, Monte Carlo Methods
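For context, traditional characteristic curve linking (e.g., Stocking-Lord) chooses constants A and B to minimize the squared difference between the two forms' test characteristic curves; the IWCC and TWCC methods described in the abstract additionally weight this criterion by item or test information. The sketch below shows only the unweighted Stocking-Lord criterion for a 2PL model on hypothetical anchor-item parameters; it is not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def p_2pl(theta, a, b):
    """2PL item response function over a theta grid."""
    return 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))

def stocking_lord_loss(params, theta, a_new, b_new, a_old, b_old):
    """Squared TCC difference after transforming new-form parameters
    onto the old-form scale: a* = a/A, b* = A*b + B."""
    A, B = params
    tcc_old = p_2pl(theta, a_old, b_old).sum(axis=1)
    tcc_new = p_2pl(theta, a_new / A, A * b_new + B).sum(axis=1)
    return np.mean((tcc_old - tcc_new) ** 2)

# Hypothetical anchor items whose new-form scale differs by A=1.1, B=0.2.
theta = np.linspace(-4, 4, 81)
a_old = np.array([1.0, 1.2, 0.8]); b_old = np.array([-0.5, 0.0, 0.7])
a_new = a_old * 1.1; b_new = (b_old - 0.2) / 1.1
res = minimize(stocking_lord_loss, x0=[1.0, 0.0],
               args=(theta, a_new, b_new, a_old, b_old))
print(res.x)  # should recover the linking constants (A, B) ~ (1.1, 0.2)
```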
Koziol, Natalie A.; Goodrich, J. Marc; Yoon, HyeonJin – Educational and Psychological Measurement, 2022
Differential item functioning (DIF) is often used to examine validity evidence of alternate form test accommodations. Unfortunately, traditional approaches for evaluating DIF are prone to selection bias. This article proposes a novel DIF framework that capitalizes on regression discontinuity design analysis to control for selection bias. A…
Descriptors: Regression (Statistics), Item Analysis, Validity, Testing Accommodations
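A regression discontinuity design exploits a cutoff rule on a running variable: units on one side of the cutoff receive the treatment (here, an accommodation), and the jump in the outcome at the cutoff estimates the treatment effect without the selection bias that plagues naive group comparisons. The sketch below illustrates this general RD logic on simulated data with a local linear fit; the variable names, the data-generating process, and the 0.15 jump are all hypothetical, not the authors' framework.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 2000
screen = rng.uniform(-2, 2, n)             # running variable (screening score)
accommodated = (screen < 0).astype(float)  # accommodation assigned below cutoff
# Simulated item score: smooth in the running variable plus a 0.15 jump.
item = 0.4 * screen + 0.15 * accommodated + rng.normal(0, 0.5, n)

# Local linear RD: restrict to a bandwidth around the cutoff and allow
# separate slopes on each side via the interaction term.
bw = 0.5
mask = np.abs(screen) <= bw
X = np.column_stack([accommodated, screen, accommodated * screen])[mask]
model = sm.OLS(item[mask], sm.add_constant(X)).fit()
print(model.params[1])  # discontinuity at the cutoff, ~0.15 plus noise
```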