| Publication Date | Records |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 1 |
| Since 2022 (last 5 years) | 3 |
| Since 2017 (last 10 years) | 5 |
| Since 2007 (last 20 years) | 12 |
| Descriptor | Records |
| --- | --- |
| Item Response Theory | 19 |
| Matrices | 19 |
| Simulation | 19 |
| Goodness of Fit | 7 |
| Models | 7 |
| Sample Size | 6 |
| Test Items | 6 |
| Computation | 5 |
| Q Methodology | 4 |
| Statistical Analysis | 4 |
| Test Length | 4 |
| Publication Type | Records |
| --- | --- |
| Journal Articles | 11 |
| Reports - Research | 10 |
| Reports - Evaluative | 7 |
| Speeches/Meeting Papers | 2 |
| Collected Works - Proceedings | 1 |
| Dissertations/Theses -… | 1 |
| Education Level | Records |
| --- | --- |
| Elementary Education | 3 |
| Grade 6 | 1 |
| Higher Education | 1 |
| Intermediate Grades | 1 |
| Junior High Schools | 1 |
| Middle Schools | 1 |
| Postsecondary Education | 1 |
| Secondary Education | 1 |
| Assessments and Surveys | Records |
| --- | --- |
| Law School Admission Test | 4 |
| Program for International… | 2 |
Marcelo Andrade da Silva; A. Corinne Huggins-Manley; Jorge Luis Bazán; Amber Benedict – Grantee Submission, 2024
A Q-matrix is a binary matrix that defines the relationship between items and latent variables; it is widely used in diagnostic classification models (DCMs) and can also be adopted in multidimensional item response theory (MIRT) models. The construction process of the Q-matrix is typically carried out by experts in the subject area of the items…
Descriptors: Q Methodology, Matrices, Item Response Theory, Educational Assessment
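For orientation, a minimal hypothetical Q-matrix of the kind described in the entry above might relate three items to two latent attributes as follows (illustrative only, not taken from the article):

```latex
% Illustrative Q-matrix: rows are items, columns are latent attributes.
% q_{jk} = 1 means item j requires (or loads on) attribute k.
\[
\mathbf{Q} =
\begin{pmatrix}
1 & 0 \\  % item 1 measures attribute 1 only
0 & 1 \\  % item 2 measures attribute 2 only
1 & 1     % item 3 requires both attributes
\end{pmatrix}
\]
```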
Marcelo Andrade da Silva; A. Corinne Huggins-Manley; Jorge Luis Bazán; Amber Benedict – Applied Measurement in Education, 2024
A Q-matrix is a binary matrix that defines the relationship between items and latent variables; it is widely used in diagnostic classification models (DCMs) and can also be adopted in multidimensional item response theory (MIRT) models. The construction process of the Q-matrix is typically carried out by experts in the subject area of the items…
Descriptors: Q Methodology, Matrices, Item Response Theory, Educational Assessment
Boris Forthmann; Benjamin Goecke; Roger E. Beaty – Creativity Research Journal, 2025
Human ratings are ubiquitous in creativity research. Yet, the process of rating responses to creativity tasks -- typically several hundred or thousands of responses per rater -- is often time-consuming and expensive. Planned missing data designs, where raters only rate a subset of the total number of responses, have been recently proposed as one…
Descriptors: Creativity, Research, Researchers, Research Methodology
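As a rough sketch of the kind of planned missing data design mentioned above (the function name and the simple random assignment are illustrative assumptions, not the design evaluated in the study):

```python
import numpy as np

def planned_missing_design(n_responses, n_raters, ratings_per_response, seed=0):
    """Assign each response to a subset of raters (1 = rated, 0 = planned missing).

    Illustrative only: each response is randomly assigned to a fixed number of
    raters, so no rater has to score every response.
    """
    rng = np.random.default_rng(seed)
    design = np.zeros((n_responses, n_raters), dtype=int)
    for i in range(n_responses):
        raters = rng.choice(n_raters, size=ratings_per_response, replace=False)
        design[i, raters] = 1
    return design

# Example: 1000 responses, 6 raters, each response rated by 2 raters, so each
# rater scores roughly a third of what a fully crossed design would require.
design = planned_missing_design(1000, 6, 2)
print(design.sum(axis=0))  # approximate number of responses per rater
```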
Harel, Daphna; Steele, Russell J. – Journal of Educational and Behavioral Statistics, 2018
Collapsing categories is a commonly used data reduction technique; however, to date there do not exist principled methods to determine whether collapsing categories is appropriate in practice. With ordinal responses under the partial credit model, when collapsing categories, the true model for the collapsed data is no longer a partial credit…
Descriptors: Matrices, Models, Item Response Theory, Research Methodology
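For context, the partial credit model referenced above assigns category probabilities of the form below (standard PCM notation, not quoted from the article); summing adjacent category probabilities after collapsing does not, in general, return an expression of the same form, which is the misspecification the entry is concerned with.

```latex
% Partial credit model for an item with categories x = 0, 1, ..., m
% (theta is the latent trait; delta_j are step parameters; the sum for x = 0
% is defined to be zero).
\[
P(X = x \mid \theta) =
\frac{\exp\!\Big(\sum_{j=0}^{x} (\theta - \delta_j)\Big)}
     {\sum_{r=0}^{m} \exp\!\Big(\sum_{j=0}^{r} (\theta - \delta_j)\Big)} .
\]
```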
Sinharay, Sandip – Grantee Submission, 2018
Tatsuoka (1984) suggested several extended caution indices and their standardized versions that have been used as person-fit statistics by researchers such as Drasgow, Levine, and McLaughlin (1987), Glas and Meijer (2003), and Molenaar and Hoijtink (1990). However, these indices are only defined for tests with dichotomous items. This paper extends…
Descriptors: Test Format, Goodness of Fit, Item Response Theory, Error Patterns
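The extended caution indices themselves are not reproduced here; as a hedged illustration of the general idea of a standardized person-fit statistic for dichotomous items, the following sketch computes the lz statistic of Drasgow, Levine, and Williams (1985), a related but different index.

```python
import numpy as np

def lz_person_fit(x, p):
    """Standardized log-likelihood person-fit statistic (lz) for dichotomous items.

    x : 0/1 item responses for one examinee
    p : model-implied probabilities of a correct response at the examinee's
        ability estimate (e.g., from a fitted 2PL model)
    Large negative values flag response patterns that are unlikely under the model.
    """
    x = np.asarray(x, dtype=float)
    p = np.asarray(p, dtype=float)
    logit = np.log(p / (1.0 - p))
    l0 = np.sum(x * np.log(p) + (1.0 - x) * np.log(1.0 - p))    # observed log-likelihood
    e_l0 = np.sum(p * np.log(p) + (1.0 - p) * np.log(1.0 - p))  # its expectation
    v_l0 = np.sum(p * (1.0 - p) * logit ** 2)                   # its variance
    return (l0 - e_l0) / np.sqrt(v_l0)
```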
Romero, Sonia J.; Ordoñez, Xavier G.; Ponsoda, Vincente; Revuelta, Javier – Psicologica: International Journal of Methodology and Experimental Psychology, 2014
Cognitive Diagnostic Models (CDMs) aim to provide information about the degree to which individuals have mastered specific attributes that underlie the success of these individuals on test items. The Q-matrix is a key element in the application of CDMs, because it contains the item-attribute links representing the cognitive structure proposed for solving…
Descriptors: Evaluation Methods, Q Methodology, Matrices, Sampling
Xiang, Rui – ProQuest LLC, 2013
A key issue of cognitive diagnostic models (CDMs) is the correct identification of the Q-matrix, which indicates the relationship between attributes and test items. Previous CDMs typically assumed a known Q-matrix provided by domain experts such as those who developed the questions. However, misspecifications of the Q-matrix have been discovered in the past…
Descriptors: Diagnostic Tests, Cognitive Processes, Matrices, Test Items
Ranger, Jochen; Kuhn, Jorg-Tobias – Journal of Educational Measurement, 2012
The information matrix can equivalently be determined via the expectation of the Hessian matrix or the expectation of the outer product of the score vector. The identity of these two matrices, however, is valid only in the case of a correctly specified model. Therefore, differences between the two versions of the observed information matrix indicate…
Descriptors: Goodness of Fit, Item Response Theory, Models, Matrices
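The equality referred to above is the standard information matrix identity: under a correctly specified model with log-likelihood ell(theta), both expressions yield the same matrix, and their difference under misspecification is the basis of information-matrix-type fit tests (cf. White, 1982). In standard notation (not quoted from the article):

```latex
\[
\mathcal{I}(\theta)
= \mathbb{E}\!\left[-\frac{\partial^2 \ell(\theta)}{\partial \theta\, \partial \theta^{\top}}\right]
= \mathbb{E}\!\left[\frac{\partial \ell(\theta)}{\partial \theta}\,
                    \frac{\partial \ell(\theta)}{\partial \theta}^{\top}\right].
\]
```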
Tian, Wei; Cai, Li; Thissen, David; Xin, Tao – Educational and Psychological Measurement, 2013
In item response theory (IRT) modeling, the item parameter error covariance matrix plays a critical role in statistical inference procedures. When item parameters are estimated using the EM algorithm, the parameter error covariance matrix is not an automatic by-product of item calibration. Cai proposed the use of the Supplemented EM algorithm for…
Descriptors: Item Response Theory, Computation, Matrices, Statistical Inference
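For orientation, one common way to state the Supplemented EM idea (Meng and Rubin, 1991), shown here up to transposition conventions and not quoted from the article: if DM is the Jacobian of the EM mapping at convergence and I_oc is the complete-data information, the error covariance matrix can be recovered from quantities available within the EM iterations themselves.

```latex
\[
\mathcal{I}_{\mathrm{obs}} = \mathcal{I}_{oc}\,(I - DM),
\qquad
\widehat{\operatorname{Var}}(\hat{\theta})
= \mathcal{I}_{\mathrm{obs}}^{-1}
= (I - DM)^{-1}\,\mathcal{I}_{oc}^{-1}.
\]
```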
Rupp, Andre A.; Templin, Jonathan – Educational and Psychological Measurement, 2008
This article reports a study that investigated the effects of Q-matrix misspecifications on parameter estimates and misclassification rates for the deterministic-input, noisy "and" gate (DINA) model, which is a restricted latent class model for multiple classifications of respondents that can be useful for cognitively motivated diagnostic…
Descriptors: Program Effectiveness, Item Response Theory, Computation, Classification
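For reference, the DINA model mentioned above defines, for respondent i with attribute vector alpha_i and item j with Q-matrix row q_j, slip parameter s_j, and guessing parameter g_j (standard DINA notation, not quoted from the article):

```latex
\[
\eta_{ij} = \prod_{k} \alpha_{ik}^{\,q_{jk}},
\qquad
P(X_{ij} = 1 \mid \alpha_i) = (1 - s_j)^{\eta_{ij}}\, g_j^{\,1 - \eta_{ij}} .
\]
% A misspecified q_{jk} changes which respondents are treated as masters of
% item j (eta_{ij} = 1), which is why slip/guessing estimates and
% classifications degrade.
```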
De Champlain, Andre; Gessaroli, Marc E. – Applied Measurement in Education, 1998
Type I error rates and rejection rates for three dimensionality assessment procedures were studied with data sets simulated to reflect short tests and small samples. Results show that the G-squared difference test (D. Bock, R. Gibbons, and E. Muraki, 1988) suffered from a severely inflated Type I error rate at all conditions simulated. (SLD)
Descriptors: Item Response Theory, Matrices, Sample Size, Simulation
Finch, Holmes – Journal of Educational Measurement, 2006
Nonlinear factor analysis is a tool commonly used by measurement specialists to identify both the presence and nature of multidimensionality in a set of test items, an important issue given that standard Item Response Theory models assume a unidimensional latent structure. Results from most factor-analytic algorithms include loading matrices,…
Descriptors: Test Items, Simulation, Factor Structure, Factor Analysis
De Champlain, Andre – 1996
The usefulness of a goodness-of-fit index proposed by R. P. McDonald (1989) was investigated with regard to assessing the dimensionality of item response matrices. The m_k index, which is based on an estimate of the noncentrality parameter of the noncentral chi-square distribution, possesses several advantages over traditional tests of…
Descriptors: Chi Square, Cutting Scores, Goodness of Fit, Item Response Theory
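For context, the noncentrality parameter underlying indices of this family is usually estimated as below (standard notation, not quoted from the report); the index itself rescales this quantity by sample size so that values near the maximum indicate close fit.

```latex
\[
\hat{\lambda} = \max\!\left(\chi^{2} - df,\; 0\right).
\]
```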
De Champlain, Andre F. – 1999
The purpose of this study was to examine empirical Type I error rates and rejection rates for three dimensionality assessment procedures with data sets simulated to reflect short tests and small samples. The TESTFACT G² difference test suffered from an inflated Type I error rate with unidimensional data sets, while the approximate chi…
Descriptors: Admission (School), College Entrance Examinations, Item Response Theory, Law Schools
Baker, Frank B. – Applied Psychological Measurement, 1993
Using simulation, the effect that misspecification of elements in the weight matrix has on estimates of basic parameters of the linear logistic test model was studied. Results indicate that, because specifying elements of the weight matrix is a subjective process, it must be done with great care. (SLD)
Descriptors: Error Patterns, Estimation (Mathematics), Item Response Theory, Matrices
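For reference, in the linear logistic test model discussed above, each item difficulty is decomposed through the weight matrix into a weighted sum of basic (operation) parameters (standard LLTM notation, not quoted from the article), which is why an error in a single weight propagates directly into the basic parameter estimates.

```latex
\[
P(X_{vi} = 1 \mid \theta_v) = \frac{\exp(\theta_v - \beta_i)}{1 + \exp(\theta_v - \beta_i)},
\qquad
\beta_i = \sum_{j} w_{ij}\,\eta_j + c .
\]
% beta_i: difficulty of item i; w_{ij}: entry of the weight matrix; eta_j: basic
% parameter for cognitive operation j; c: normalization constant.
```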
