Raykov, Tenko; Dimitrov, Dimiter M.; Marcoulides, George A.; Li, Tatyana; Menold, Natalja – Educational and Psychological Measurement, 2018
A latent variable modeling method is outlined for studying measurement invariance when evaluating latent constructs with multiple binary or binary-scored items with no guessing. The approach extends the continuous-indicator procedure described by Raykov and colleagues, similarly utilizes the false discovery rate approach to multiple testing, and…
Descriptors: Models, Statistical Analysis, Error of Measurement, Test Bias
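The false discovery rate (FDR) approach to multiple testing mentioned in this abstract is most commonly operationalized as the Benjamini–Hochberg step-up procedure. The sketch below is a generic illustration with made-up p-values, not the authors' code or their invariance tests.

```python
# Benjamini-Hochberg FDR procedure: a generic sketch, not the paper's method.
# The p-values below are hypothetical, purely for illustration.

def benjamini_hochberg(p_values, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha."""
    m = len(p_values)
    # Sort p-values while remembering their original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    # Find the largest rank k with p_(k) <= (k / m) * alpha.
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank / m * alpha:
            k_max = rank
    # Reject all hypotheses at ranks 1..k_max.
    return sorted(order[:k_max])

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
rejected = benjamini_hochberg(pvals)  # → [0, 1]
```

Note that the step-up rule rejects everything up to the largest rank passing its threshold, even if an intermediate p-value fails its own threshold.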
Raykov, Tenko; Dimitrov, Dimiter M.; Marcoulides, George A.; Harrison, Michael – Educational and Psychological Measurement, 2019
Building on prior research on the relationships between key concepts in item response theory and classical test theory, this note contributes to highlighting their important and useful links. A readily and widely applicable latent variable modeling procedure is discussed that can be used for point and interval estimation of the individual person…
Descriptors: True Scores, Item Response Theory, Test Items, Test Theory
Dimitrov, Dimiter M. – Measurement and Evaluation in Counseling and Development, 2017
This article offers an approach to examining differential item functioning (DIF) under its item response theory (IRT) treatment in the framework of confirmatory factor analysis (CFA). The approach is based on integrating IRT- and CFA-based testing of DIF and using bias-corrected bootstrap confidence intervals with a syntax code in Mplus.
Descriptors: Test Bias, Item Response Theory, Factor Analysis, Evaluation Methods
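The bias-corrected (BC) bootstrap confidence intervals named in this abstract can be sketched in generic form as below. This is the standard BC interval applied to an illustrative sample and statistic, not the article's Mplus syntax; the data are made up.

```python
# Bias-corrected (BC) bootstrap confidence interval: a generic stdlib sketch.
import random
from statistics import NormalDist, mean

def bc_bootstrap_ci(data, stat=mean, n_boot=2000, alpha=0.05, seed=1):
    """BC bootstrap CI for stat(data); assumes the bootstrap distribution
    is not degenerate (proportion below the estimate strictly in (0, 1))."""
    rng = random.Random(seed)
    theta_hat = stat(data)
    # Bootstrap replicates of the statistic, sorted for percentile lookup.
    boots = sorted(
        stat([rng.choice(data) for _ in range(len(data))])
        for _ in range(n_boot)
    )
    nd = NormalDist()
    # Bias-correction factor z0 from the share of replicates below theta_hat.
    z0 = nd.inv_cdf(sum(b < theta_hat for b in boots) / n_boot)
    z_lo, z_hi = nd.inv_cdf(alpha / 2), nd.inv_cdf(1 - alpha / 2)
    # Shift the nominal percentiles by twice the bias correction.
    a_lo, a_hi = nd.cdf(2 * z0 + z_lo), nd.cdf(2 * z0 + z_hi)
    lo = boots[min(n_boot - 1, max(0, int(a_lo * n_boot)))]
    hi = boots[min(n_boot - 1, max(0, int(a_hi * n_boot)))]
    return lo, hi

data = [2.1, 3.4, 1.9, 4.2, 2.8, 3.1, 2.5, 3.9, 2.2, 3.6]  # illustrative
lo, hi = bc_bootstrap_ci(data)
```

Unlike the plain percentile interval, the BC interval shifts the cutoff percentiles by the bias-correction factor `z0` when the bootstrap distribution is not centered on the sample estimate.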
Dimitrov, Dimiter M. – 1994
An approach is described that reveals the hierarchical test structure (HTS) based on the cognitive demands of the test items and conducts linear trait modeling using the HTS elements as item difficulty components. This approach, referred to as the Hierarchical Latent Trait Approach (HLTA), employs an algorithm that allows all test items to…
Descriptors: Algorithms, Cognitive Processes, Difficulty Level, Higher Education
Dimitrov, Dimiter M. – Mid-Western Educational Researcher, 1999 (peer reviewed)
Combines item response theory (IRT) and statistical methods to analyze California Achievement Test-Mathematics (CAT-M) results for 4,135 seventh graders in northeast Ohio. Provides information to educational analysts about which IRT model fits CAT-M data for the target population, test accuracy in estimating students' abilities at different…
Descriptors: Achievement Tests, Evaluation Research, Grade 7, Item Response Theory

