Showing all 5 results
Peer reviewed
Direct link
Raykov, Tenko; Dimitrov, Dimiter M.; Marcoulides, George A.; Li, Tatyana; Menold, Natalja – Educational and Psychological Measurement, 2018
A latent variable modeling method is outlined for studying measurement invariance when evaluating latent constructs with multiple binary or binary-scored items with no guessing. The approach extends the continuous indicator procedure described by Raykov and colleagues, similarly utilizes the false discovery rate approach to multiple testing, and…
Descriptors: Models, Statistical Analysis, Error of Measurement, Test Bias
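The false discovery rate correction named in this abstract is conventionally the Benjamini-Hochberg step-up procedure; a minimal sketch of that standard procedure (not the authors' latent variable implementation, and with made-up p-values) might look like:

```python
import numpy as np

def benjamini_hochberg(p_values, q=0.05):
    """Benjamini-Hochberg step-up procedure: returns a boolean
    mask of which hypotheses are rejected at FDR level q."""
    p = np.asarray(p_values)
    m = len(p)
    order = np.argsort(p)                        # p-values, ascending
    thresholds = q * np.arange(1, m + 1) / m     # k/m * q for k = 1..m
    below = p[order] <= thresholds
    rejected = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()           # largest passing rank
        rejected[order[:k + 1]] = True           # reject all up to rank k
    return rejected

# Example: hypothetical per-item invariance test p-values
p_vals = [0.001, 0.008, 0.039, 0.041, 0.27]
print(benjamini_hochberg(p_vals, q=0.05))
```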
Peer reviewed
Direct link
Raykov, Tenko; Dimitrov, Dimiter M.; Marcoulides, George A.; Harrison, Michael – Educational and Psychological Measurement, 2019
Building on prior research on the relationships between key concepts in item response theory and classical test theory, this note highlights their important and useful links. A readily and widely applicable latent variable modeling procedure is discussed that can be used for point and interval estimation of the individual person…
Descriptors: True Scores, Item Response Theory, Test Items, Test Theory
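The note's own latent variable procedure is not spelled out in this snippet; purely as a classical test theory analogue, here is a sketch of Kelley's regressed true-score estimate with an interval based on the standard error of estimation (all numbers hypothetical):

```python
import numpy as np

def kelley_true_score(x, reliability, mean_score):
    """Kelley's regressed true-score estimate:
    T_hat = rho * X + (1 - rho) * mean  (classical test theory)."""
    return reliability * x + (1 - reliability) * mean_score

def true_score_ci(x, reliability, sd_score, mean_score, z=1.96):
    """Approximate 95% interval around the Kelley estimate using the
    standard error of estimation: SE = SD * sqrt(rho * (1 - rho))."""
    t_hat = kelley_true_score(x, reliability, mean_score)
    se = sd_score * np.sqrt(reliability * (1 - reliability))
    return t_hat - z * se, t_hat + z * se

# Hypothetical observed score 32 on a test with reliability .85
print(true_score_ci(x=32, reliability=0.85, sd_score=6.0, mean_score=25.0))
```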
Peer reviewed
Direct link
Dimitrov, Dimiter M. – Measurement and Evaluation in Counseling and Development, 2017
This article offers an approach to examining differential item functioning (DIF) under its item response theory (IRT) treatment in the framework of confirmatory factor analysis (CFA). The approach is based on integrating IRT- and CFA-based testing of DIF and using bias-corrected bootstrap confidence intervals, with syntax code provided in Mplus.
Descriptors: Test Bias, Item Response Theory, Factor Analysis, Evaluation Methods
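The article computes its bias-corrected bootstrap intervals in Mplus; a language-agnostic sketch of the generic bias-corrected (BC) percentile bootstrap for a user-supplied statistic could be:

```python
import numpy as np
from scipy.stats import norm

def bc_bootstrap_ci(data, stat_fn, n_boot=2000, alpha=0.05, seed=0):
    """Bias-corrected (BC) percentile bootstrap confidence interval
    for an arbitrary statistic stat_fn of the data."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data)
    theta_hat = stat_fn(data)
    boot = np.array([stat_fn(rng.choice(data, size=len(data), replace=True))
                     for _ in range(n_boot)])
    # Bias-correction constant: how far the bootstrap distribution
    # is shifted relative to the observed estimate.
    z0 = norm.ppf(np.mean(boot < theta_hat))
    lo = norm.cdf(2 * z0 + norm.ppf(alpha / 2))
    hi = norm.cdf(2 * z0 + norm.ppf(1 - alpha / 2))
    return np.quantile(boot, [lo, hi])

# Example with simulated data and the mean as the statistic
sample = np.random.default_rng(1).normal(loc=0.4, size=200)
print(bc_bootstrap_ci(sample, np.mean))
```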
Dimitrov, Dimiter M. – 1994
An approach is described that reveals the hierarchical test structure (HTS) based on the cognitive demands of the test items and conducts linear trait modeling by using the HTS elements as item difficulty components. This approach, referred to as the Hierarchical Latent Trait Approach (HLTA), employs an algorithm that allows all test items to…
Descriptors: Algorithms, Cognitive Processes, Difficulty Level, Higher Education
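Treating cognitive demands as item difficulty components is the idea behind linear logistic test models; a toy least-squares decomposition of item difficulties onto a hypothetical Q-matrix (not Dimitrov's HLTA algorithm itself) illustrates the linear trait modeling step:

```python
import numpy as np

# Hypothetical Q-matrix: rows are items, columns are cognitive
# operations each item demands (1 = required, 0 = not).
Q = np.array([[1, 0, 0],
              [1, 1, 0],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]], dtype=float)

# Hypothetical calibrated item difficulties (e.g., Rasch b-parameters).
b = np.array([-0.8, 0.1, 0.9, 0.4, 1.3])

# Least-squares estimate of per-operation difficulty components eta,
# so that b ~= Q @ eta (an LLTM-style linear decomposition).
eta, *_ = np.linalg.lstsq(Q, b, rcond=None)
print("operation difficulty components:", eta)
print("reconstructed item difficulties:", Q @ eta)
```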
Peer reviewed
Dimitrov, Dimiter M. – Mid-Western Educational Researcher, 1999
Combines item response theory (IRT) and statistical methods to analyze California Achievement Test-Mathematics (CAT-M) results for 4,135 seventh graders in northeast Ohio. Provides information to educational analysts about which IRT model fits CAT-M data for the target population, test accuracy in estimating students' abilities at different…
Descriptors: Achievement Tests, Evaluation Research, Grade 7, Item Response Theory
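Test accuracy at different ability levels, as examined in this study, is commonly summarized by the test information function; a sketch under the 2PL model, with hypothetical item parameters rather than the CAT-M calibration, shows how the standard error of measurement varies across ability:

```python
import numpy as np

def p_2pl(theta, a, b):
    """2PL item response function: P(correct | theta)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def test_information(thetas, a, b):
    """Fisher information of the whole test at each ability level:
    I(theta) = sum_i a_i^2 * P_i * (1 - P_i) under the 2PL model."""
    p = p_2pl(thetas[:, None], a, b)
    return np.sum(a**2 * p * (1 - p), axis=1)

# Hypothetical discrimination (a) and difficulty (b) parameters.
a = np.array([1.2, 0.8, 1.5, 1.0])
b = np.array([-1.0, 0.0, 0.5, 1.5])
thetas = np.linspace(-3, 3, 7)

# SEM(theta) = 1/sqrt(I(theta)): smaller means more accurate estimation.
info = test_information(thetas, a, b)
for t, i in zip(thetas, info):
    print(f"theta={t:+.1f}  info={i:.2f}  SEM={1/np.sqrt(i):.2f}")
```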