Showing 1 to 15 of 21 results
Peer reviewed
Cho, Sun-Joo; Bottge, Brian A.; Cohen, Allan S.; Kim, Seock-Ho – Journal of Special Education, 2011
Current methods for detecting growth of students' problem-solving skills in math focus mainly on analyzing changes in test scores. Score-level analysis, however, may fail to reflect subtle changes that might be evident at the item level. This article demonstrates a method for studying item-level changes using data from a multiwave experiment with…
Descriptors: Test Bias, Group Membership, Mathematics Skills, Ability
Peer reviewed
Kim, Seock-Ho; Cohen, Allan S.; Alagoz, Cigdem; Kim, Sukwoo – Journal of Educational Measurement, 2007
Data from a large-scale performance assessment (N = 105,731) were analyzed with five differential item functioning (DIF) detection methods for polytomous items to examine the congruence among the DIF detection methods. Two different versions of the item response theory (IRT) model-based likelihood ratio test, the logistic regression likelihood…
Descriptors: Performance Based Assessment, Performance Tests, Item Response Theory, Test Bias
Kim, Seock-Ho; Cohen, Allan S.; DiStefano, Christine A.; Kim, Sooyeon – 1998
Type I error rates of the likelihood ratio test for the detection of differential item functioning (DIF) in the partial credit model were investigated using simulated data. The partial credit model with four ordered performance levels was used to generate data sets of a 30-item test for samples of 300 and 1,000 simulated examinees. Three different…
Descriptors: Item Bias, Simulation, Test Items
De Ayala, R. J.; Kim, Seock-Ho; Stapleton, Laura M.; Dayton, C. Mitchell – 1999
Differential item functioning (DIF) may be defined as an item that displays different statistical properties for different groups after the groups are matched on an ability measure. For instance, with binary data, DIF exists when there is a difference in the conditional probabilities of a correct response for two manifest groups. This paper…
Descriptors: Item Bias, Monte Carlo Methods, Test Items
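The abstract's working definition of DIF (a difference in the conditional probabilities of a correct response for two manifest groups, after matching on ability) can be illustrated with a minimal sketch under the two-parameter logistic (2PL) model. All parameter values below are hypothetical, chosen only to show uniform DIF:

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function: P(correct | theta)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)  # matched ability levels

# Hypothetical item parameters for two manifest groups (illustrative only):
# same discrimination, but the item is harder for the focal group.
p_ref = p_correct(theta, a=1.2, b=0.0)    # reference group
p_focal = p_correct(theta, a=1.2, b=0.5)  # focal group

dif = p_ref - p_focal
print(np.round(dif, 3))  # nonzero at matched theta, so the item shows DIF
```

Because only the difficulty differs, the probability gap is positive at every matched ability level (uniform DIF); a discrimination difference would instead make the curves cross (nonuniform DIF).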
Peer reviewed
De Ayala, Ralph J.; Kim, Seock-Ho; Stapleton, Laura M.; Dayton, C. Mitchell – International Journal of Testing, 2002
Conducted a Monte Carlo study to compare various approaches to detecting differential item functioning (DIF) under a conceptualization of DIF that recognizes that observed data are a mixture of data from multiple latent populations or classes. Demonstrated the usefulness of the approach. (SLD)
Descriptors: Data Analysis, Item Bias, Monte Carlo Methods, Simulation
Peer reviewed
Kim, Seock-Ho; Cohen, Allan S. – Applied Psychological Measurement, 1991
The exact and closed-interval area measures for detecting differential item functioning are compared for actual data from 1,000 African-American and 1,000 white college students taking a vocabulary test with items intentionally constructed to favor 1 set of examinees. No real differences in detection of biased items were found. (SLD)
Descriptors: Black Students, College Students, Comparative Testing, Equations (Mathematics)
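As a rough illustration of the area measures compared in this study, the signed and unsigned areas between two groups' item characteristic curves can be approximated numerically over a closed interval. The 2PL parameter values below are hypothetical; with equal discriminations the exact signed area over the whole real line reduces to the difficulty difference b2 - b1:

```python
import numpy as np

def icc(theta, a, b):
    """2PL item characteristic curve."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical parameter estimates for the same item in two groups
a1, b1 = 1.0, 0.0   # reference group
a2, b2 = 1.0, 0.4   # focal group

theta = np.linspace(-6, 6, 2001)          # closed interval of integration
diff = icc(theta, a1, b1) - icc(theta, a2, b2)

# Trapezoidal approximations of the area between the curves
w = np.diff(theta)
signed_area = np.sum((diff[1:] + diff[:-1]) / 2.0 * w)
unsigned_area = np.sum((np.abs(diff[1:]) + np.abs(diff[:-1])) / 2.0 * w)

print(round(signed_area, 3), round(unsigned_area, 3))  # both near b2 - b1 = 0.4
```

Since the discriminations are equal, the curves never cross, the signed and unsigned areas coincide, and the closed-interval value falls just short of the exact value 0.4 only because of the truncated tails.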
Cohen, Allan S.; Kim, Seock-Ho; Wollack, James A. – 1998
This paper provides a review of procedures for detection of differential item functioning (DIF) for item response theory (IRT) and observed score methods for the graded response model. In addition, data from a test anxiety scale were analyzed to examine the congruence among these procedures. Data from Nasser, Takahashi, and Benson (1997) were…
Descriptors: Identification, Item Bias, Item Response Theory, Scores
Peer reviewed
Kim, Seock-Ho; Cohen, Allan S. – Applied Psychological Measurement, 1998
Investigated Type I error rates of the likelihood-ratio test for the detection of differential item functioning (DIF) using Monte Carlo simulations under the graded-response model. Type I error rates were within theoretically expected values for all six combinations of sample sizes and ability-matching conditions at each of the nominal alpha…
Descriptors: Ability, Item Bias, Item Response Theory, Monte Carlo Methods
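The likelihood-ratio DIF test studied here compares a compact model, which constrains the studied item's parameters to be equal across groups, against an augmented model that frees them. A minimal sketch of the test statistic, using hypothetical maximized log-likelihoods rather than values from the study:

```python
from scipy.stats import chi2

# Hypothetical maximized log-likelihoods (illustrative values only)
loglik_compact = -4321.7    # item parameters constrained equal across groups
loglik_augmented = -4316.2  # item parameters free in each group

# Likelihood ratio statistic, asymptotically chi-square under no DIF
g2 = -2.0 * (loglik_compact - loglik_augmented)
df = 2  # e.g., discrimination and difficulty freed in the augmented model
p = chi2.sf(g2, df)

print(round(g2, 1), round(p, 4))  # reject "no DIF" when p < alpha
```

Comparing the empirical rejection rate of this test against the nominal alpha when the generating parameters are truly equal is exactly the Type I error question the simulations above address.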
Kim, Seock-Ho – 1997
Hierarchical Bayes procedures for the two-parameter logistic item response model were compared for estimating item parameters. Simulated data sets were analyzed using two different Bayes estimation procedures, the two-stage hierarchical Bayes estimation (HB2) and the marginal Bayesian with known hyperparameters (MB), and marginal maximum…
Descriptors: Bayesian Statistics, Difficulty Level, Estimation (Mathematics), Item Bias
Kim, Seock-Ho – 2000
This paper is concerned with statistical issues in differential item functioning (DIF). Four subsets of large scale performance assessment data from the Georgia Kindergarten Assessment Program-Revised (N=105,731; N=10,000; N=1,000; and N=100) were analyzed using three DIF detection methods for polytomous items to examine the congruence among the…
Descriptors: Item Bias, Item Response Theory, Kindergarten, Performance Based Assessment
Peer reviewed
Cohen, Allan S.; Kim, Seock-Ho – Applied Psychological Measurement, 1993
The effectiveness of two statistical tests of the area between item response functions (exact signed area and exact unsigned area) estimated in different samples, a measure of differential item functioning (DIF), was compared with Lord's chi square. Lord's chi square was found the most effective in determining DIF. (SLD)
Descriptors: Chi Square, Comparative Analysis, Equations (Mathematics), Estimation (Mathematics)
Kim, Seock-Ho; Cohen, Allan S. – 1997
Applications of item response theory to practical testing problems including equating, differential item functioning, and computerized adaptive testing, require that item parameter estimates be placed onto a common metric. In this study, two methods for developing a common metric for the graded response model under item response theory were…
Descriptors: Adaptive Testing, Comparative Analysis, Computer Assisted Testing, Equated Scores
Peer reviewed
Kim, Seock-Ho; And Others – Journal of Educational Measurement, 1995
A method is presented for detection of differential item functioning in multiple groups. This method is closely related to F. M. Lord's chi square for comparing vectors of item parameters estimated in two groups. An example is provided using data from 600 college students taking a mathematics test with and without calculators. (SLD)
Descriptors: Chi Square, College Students, Comparative Analysis, Estimation (Mathematics)
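Lord's chi-square, to which the multiple-group method above is closely related, is a Wald-type statistic on the difference between the item parameter vectors estimated in two groups. A minimal two-group sketch, with hypothetical estimates and error covariance matrices (not values from the study):

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical (a, b) estimates for one item in two groups, on a common
# metric, with their estimation error covariance matrices (illustrative)
v1 = np.array([1.10, 0.20])
v2 = np.array([0.95, 0.55])
cov1 = np.array([[0.010, 0.001], [0.001, 0.008]])
cov2 = np.array([[0.012, 0.001], [0.001, 0.009]])

d = v1 - v2
stat = d @ np.linalg.inv(cov1 + cov2) @ d  # Lord's chi-square for this item
df = len(d)                                # parameters compared per item
p = chi2.sf(stat, df)

print(round(stat, 2), round(p, 4))  # small p flags the item as showing DIF
```

The multiple-group extension generalizes this quadratic form to a simultaneous comparison of more than two parameter vectors, with degrees of freedom growing accordingly.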
Kim, Seock-Ho; Cohen, Allan S. – 1997
Type I error rates of the likelihood ratio test for the detection of differential item functioning (DIF) were investigated using Monte Carlo simulations. The graded response model with five ordered categories was used to generate data sets of a 30-item test for samples of 300 and 1,000 simulated examinees. All DIF comparisons were simulated by…
Descriptors: Ability, Classification, Computer Simulation, Estimation (Mathematics)
Peer reviewed
Kim, Seock-Ho; Cohen, Allan S. – Applied Measurement in Education, 1995
Three procedures for the detection of differential item functioning under item response theory were compared. Data for 2 forms of a mathematics test taken by 1,490 college students were analyzed through F. M. Lord's chi-square, N. S. Raju's area measures, and the likelihood ratio test. (SLD)
Descriptors: Chi Square, College Students, Comparative Analysis, Higher Education