Peer reviewed
ERIC Number: ED640042
Record Type: Non-Journal
Publication Date: 2023
Pages: 50
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
Dynamic Fit Index Cutoffs for Categorical Factor Analysis with Likert-Type, Ordinal, or Binary Responses
Daniel McNeish
Grantee Submission
Scale validation is vital to psychological research because it ensures that scores from measurement scales represent the intended construct. Factor analysis fit indices are commonly used to provide quantitative evidence that a proposed factor structure is plausible. However, there is a mismatch between guidelines for evaluating the fit of factor models and the data that most researchers have. Namely, fit guidelines are based on simulations assuming item responses are collected on a continuous scale, whereas most researchers collect discrete responses such as with a Likert-type scale. In this paper, we show that common guidelines derived from assuming continuous responses (e.g., RMSEA < 0.06, CFI > 0.95) do not generalize to factor models applied to discrete responses. Specifically, discrete responses provide less information than continuous responses, so less information about misfit is passed to fit indices. Traditional guidelines therefore end up being too lenient and lose their ability to identify that a model may have poor fit. We provide one possible solution by extending the recently developed Dynamic Fit Index framework to accommodate the discrete responses common in psychology. We conduct a simulation study to provide evidence that the proposed method consistently distinguishes between well-fitting and poorly fitting models. Results showed that our proposed cutoffs maintained at least 90% sensitivity to misspecification across the studied conditions, whereas traditional cutoffs were highly inconsistent and frequently exhibited sensitivity below 50%. The proposed method is included in the dynamic R package and in a web-based Shiny application to make it easily accessible to psychologists. [This paper was published in "American Psychologist" v79 n9 p1061-1075 2023.]
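As an illustration of the workflow the abstract describes, the sketch below fits a one-factor model to ordinal items with lavaan and then requests simulation-based DFI cutoffs from the dynamic package. This is a minimal sketch, not the paper's own analysis: the illustrative data (Holzinger-Swineford items cut into 5-point categories), the one-factor structure, and the categorical DFI function name catOne() are assumptions made here for demonstration; consult the dynamic package documentation or the companion Shiny application for the exact interface.

```r
library(lavaan)
library(dynamic)   # assumed to provide the categorical DFI routine used below

# Illustrative data: cut six continuous Holzinger-Swineford items into
# 5-point ordinal scales to mimic Likert-type responses
data("HolzingerSwineford1939", package = "lavaan")
items <- paste0("x", 1:6)
ord <- as.data.frame(lapply(HolzingerSwineford1939[items],
                            function(v) cut(v, breaks = 5, labels = FALSE)))

# One-factor model for the six discretized items (illustrative only)
model <- 'f1 =~ x1 + x2 + x3 + x4 + x5 + x6'

# Declaring the items as ordered triggers lavaan's categorical (WLSMV-type) estimation
fit <- cfa(model, data = ord, ordered = items, estimator = "WLSMV")

# Traditional, continuous-derived indices; the cutoffs RMSEA < .06 / CFI > .95
# are the ones the paper argues are too lenient for discrete responses
fitMeasures(fit, c("rmsea.scaled", "cfi.scaled", "srmr"))

# Simulation-based DFI cutoffs tailored to this categorical model
# (function name assumed; check the dynamic package documentation)
catOne(fit)
```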
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: Institute of Education Sciences (ED)
Authoring Institution: N/A
IES Funded: Yes
Grant or Contract Numbers: R305D220003
Author Affiliations: N/A