ERIC Number: ED614019
Record Type: Non-Journal
Publication Date: 2018
Pages: 16
Abstractor: As Provided
ISBN: N/A
ISSN: 0027-3171
EISSN: N/A
Available Date: N/A
Model Fit and Item Factor Analysis: Overfactoring, Underfactoring, and a Program to Guide Interpretation
Clark, D. Angus; Bowles, Ryan P.
Grantee Submission, Multivariate Behavioral Research v53 n4 p544-558 2018
In exploratory item factor analysis (IFA), researchers may use model fit statistics and commonly invoked fit thresholds to help determine the dimensionality of an assessment. However, these indices and thresholds may mislead, as they were developed in a confirmatory framework for models with continuous, not categorical, indicators. The present study used Monte Carlo simulation methods to investigate the ability of popular model fit statistics (chi-square, root mean square error of approximation, the comparative fit index, and the Tucker-Lewis index) and their standard cutoff values to detect the optimal number of latent dimensions underlying sets of dichotomous items. Models were fit to data generated from three-factor population structures that varied in factor loading magnitude, factor intercorrelation magnitude, number of indicators, and whether cross loadings or minor factors were included. The effectiveness of the thresholds varied across fit statistics and was conditional on many features of the underlying model. Together, the results suggest that conventional fit thresholds offer questionable utility in the context of IFA.
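The fit indices named in the abstract (RMSEA, CFI, TLI) have standard closed-form definitions in terms of the fitted model's and a baseline (null) model's chi-square statistics and degrees of freedom. A minimal sketch of those definitions follows; the numeric inputs in the usage note are illustrative only and are not taken from the study.

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation for a model with
    chi-square `chi2` on `df` degrees of freedom, fit to `n` observations."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2_m, df_m, chi2_b, df_b):
    """Comparative fit index: improvement of the target model (subscript m)
    over the baseline/independence model (subscript b)."""
    d_m = max(chi2_m - df_m, 0.0)
    d_b = max(chi2_b - df_b, d_m)
    return 1.0 - d_m / d_b if d_b > 0 else 1.0

def tli(chi2_m, df_m, chi2_b, df_b):
    """Tucker-Lewis index, based on the chi-square/df ratios of the
    target and baseline models."""
    r_b = chi2_b / df_b
    r_m = chi2_m / df_m
    return (r_b - r_m) / (r_b - 1.0)
```

For example, with hypothetical values chi2_m=100, df_m=50, n=500, chi2_b=1000, df_b=66, these give RMSEA ≈ 0.045, CFI ≈ 0.946, and TLI ≈ 0.929, which would then be compared against commonly invoked cutoffs (e.g., RMSEA ≤ .06, CFI/TLI ≥ .95); the study's point is that such comparisons can misclassify dimensionality for categorical-item models.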
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: Institute of Education Sciences (ED)
Authoring Institution: N/A
IES Funded: Yes
Grant or Contract Numbers: R305A110293; R324A150063
Author Affiliations: N/A