Publication Date
| Date Range | Records |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 2 |
| Since 2022 (last 5 years) | 15 |
| Since 2017 (last 10 years) | 70 |
| Since 2007 (last 20 years) | 227 |
Descriptor
| Descriptor | Records |
| --- | --- |
| Correlation | 265 |
| Factor Analysis | 265 |
| Foreign Countries | 107 |
| Item Response Theory | 102 |
| Emotional Response | 71 |
| Statistical Analysis | 54 |
| Feedback (Response) | 51 |
| Questionnaires | 49 |
| Measures (Individuals) | 47 |
| Models | 44 |
| Scores | 44 |
Author
| Author | Records |
| --- | --- |
| Chan, Jason C. | 3 |
| Cole, Rachel | 3 |
| Kemple, James J. | 3 |
| Lent, Jessica | 3 |
| McCormick, Meghan | 3 |
| Nathanson, Lori | 3 |
| Segeritz, Micha | 3 |
| Adkins, Dorothy C. | 2 |
| Blömeke, Sigrid | 2 |
| Brown, Gavin T. L. | 2 |
| Coppola, Gabrielle | 2 |
Audience
| Audience | Records |
| --- | --- |
| Researchers | 2 |
| Practitioners | 1 |
| Students | 1 |
Location
| Location | Records |
| --- | --- |
| Turkey | 14 |
| China | 10 |
| Canada | 8 |
| South Korea | 6 |
| United Kingdom | 6 |
| Australia | 5 |
| Finland | 5 |
| Greece | 5 |
| Japan | 5 |
| California | 4 |
| Germany | 4 |
Laws, Policies, & Programs
| Law/Policy/Program | Records |
| --- | --- |
| No Child Left Behind Act 2001 | 1 |
Hoang V. Nguyen; Niels G. Waller – Educational and Psychological Measurement, 2024
We conducted an extensive Monte Carlo study of factor-rotation local solutions (LS) in multidimensional, two-parameter logistic (M2PL) item response models. In this study, we simulated more than 19,200 data sets that were drawn from 96 model conditions and performed more than 7.6 million rotations to examine the influence of (a) slope parameter…
Descriptors: Monte Carlo Methods, Item Response Theory, Correlation, Error of Measurement
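The truncated abstract describes simulating response data from an M2PL model before rotating the estimated loadings. As a minimal sketch of that data-generating step only (the person/item counts, slope pattern, and seed below are illustrative, not the authors' 96 model conditions), one such data set can be drawn in NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
n_persons, n_items, n_dims = 500, 20, 2

# Illustrative M2PL slope matrix: each item loads mainly on one dimension.
slopes = np.zeros((n_items, n_dims))
slopes[:10, 0] = rng.uniform(0.8, 2.0, 10)
slopes[10:, 1] = rng.uniform(0.8, 2.0, 10)
slopes += rng.uniform(0.0, 0.3, slopes.shape)   # small noise creates modest cross-loadings
intercepts = rng.normal(0.0, 1.0, n_items)

# Latent traits drawn from a standard bivariate normal.
theta = rng.multivariate_normal(np.zeros(n_dims), np.eye(n_dims), n_persons)

# M2PL: P(X_ij = 1) = logistic(theta_i . a_j + d_j)
logits = theta @ slopes.T + intercepts
probs = 1.0 / (1.0 + np.exp(-logits))
responses = (rng.uniform(size=probs.shape) < probs).astype(int)

print(responses.shape, responses.mean(axis=0)[:5])  # item endorsement rates
```

A matrix like this would then be factor-analyzed and rotated from many random starting positions to check whether the rotation lands in different local solutions, which is the phenomenon the study examines.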
Seiyon M. Lee; Sami Baral; Hongming Chip Li; Li Cheng; Shan Zhang; Carly S. Thorp; Jennifer St. John; Tamisha Thompson; Neil Heffernan; Anthony F. Botelho – Journal of Educational Data Mining, 2025
Teachers often use open-ended questions to promote students' deeper understanding of the content. These questions are particularly useful in K-12 mathematics education, as they provide richer insights into students' problem-solving processes compared to closed-ended questions. However, they are also challenging to implement in educational…
Descriptors: Feedback (Response), Taxonomy, Data Analysis, Middle School Mathematics
Guo, Wenjing; Choi, Youn-Jeng – Educational and Psychological Measurement, 2023
Determining the number of dimensions is extremely important in applying item response theory (IRT) models to data. Traditional and revised parallel analyses have been proposed within the factor analysis framework, and both have shown some promise in assessing dimensionality. However, their performance in the IRT framework has not been…
Descriptors: Item Response Theory, Evaluation Methods, Factor Analysis, Guidelines
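Parallel analysis, the method under evaluation here, retains as many dimensions as there are leading observed eigenvalues exceeding those of comparable random data. Below is a minimal NumPy sketch of the traditional (linear) version on illustrative data; it does not reproduce the revised or IRT-specific variants the article examines.

```python
import numpy as np

def parallel_analysis(data, n_reps=100, seed=0):
    """Compare observed correlation-matrix eigenvalues with those from random data."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs_eigs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand_eigs = np.empty((n_reps, p))
    for r in range(n_reps):
        rand = rng.standard_normal((n, p))
        rand_eigs[r] = np.linalg.eigvalsh(np.corrcoef(rand, rowvar=False))[::-1]
    threshold = rand_eigs.mean(axis=0)
    # Retain dimensions until the first observed eigenvalue drops below the random mean.
    n_keep = 0
    for obs, thr in zip(obs_eigs, threshold):
        if obs <= thr:
            break
        n_keep += 1
    return n_keep, obs_eigs, threshold

# Illustrative two-factor data set (10 items, 400 respondents).
rng = np.random.default_rng(1)
loadings = np.vstack([np.repeat([[0.7, 0.0]], 5, axis=0),
                      np.repeat([[0.0, 0.7]], 5, axis=0)])
X = rng.standard_normal((400, 2)) @ loadings.T + 0.6 * rng.standard_normal((400, 10))

n_dims, _, _ = parallel_analysis(X)
print("suggested number of dimensions:", n_dims)
```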
Yuanfang Liu; Mark H. C. Lai; Ben Kelcey – Structural Equation Modeling: A Multidisciplinary Journal, 2024
Measurement invariance holds when a latent construct is measured in the same way across different levels of background variables (continuous or categorical) while controlling for the true value of that construct. Using Monte Carlo simulation, this paper compares the multiple indicators, multiple causes (MIMIC) model and MIMIC-interaction to a…
Descriptors: Classification, Accuracy, Error of Measurement, Correlation
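As a hedged illustration of the data structure a MIMIC model is fit to (indicators of a latent construct plus a background covariate, possibly with a direct covariate-to-item effect that breaks invariance), the NumPy sketch below generates one such data set. All parameter values are illustrative and unrelated to the paper's simulation design, and no MIMIC model is actually estimated here.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000

# Background covariate (could be a grouping indicator or a continuous variable).
x = rng.standard_normal(n)

# Structural part: the latent construct is regressed on the covariate.
eta = 0.4 * x + rng.standard_normal(n)

# Measurement part: four indicators load on the latent construct.
loadings = np.array([0.8, 0.7, 0.6, 0.75])
Y = eta[:, None] * loadings + 0.5 * rng.standard_normal((n, 4))

# A direct effect of the covariate on indicator 4 (uniform DIF), the kind of
# invariance violation MIMIC-style checks are meant to flag.
Y[:, 3] += 0.3 * x

print(np.corrcoef(np.column_stack([x, Y]), rowvar=False).round(2))
```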
Celen, Umit; Aybek, Eren Can – International Journal of Assessment Tools in Education, 2022
Item analysis is performed by scale developers as an integral part of the scale development process, and items are excluded from the scale on the basis of the item analysis prior to the factor analysis. Existing item discrimination indices are calculated from correlations, yet items with different response patterns are likely to have a similar item…
Descriptors: Likert Scales, Factor Analysis, Item Analysis, Correlation
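The correlation-based discrimination indices referred to here are commonly computed as corrected item-total correlations; the NumPy sketch below shows that calculation on illustrative Likert-type data (the function name and data are not from the article).

```python
import numpy as np

def corrected_item_total(scores):
    """Correlation of each item with the total of the remaining items."""
    scores = np.asarray(scores, dtype=float)
    total = scores.sum(axis=1)
    out = np.empty(scores.shape[1])
    for j in range(scores.shape[1]):
        rest = total - scores[:, j]                  # total score excluding item j
        out[j] = np.corrcoef(scores[:, j], rest)[0, 1]
    return out

# Illustrative 5-point Likert responses: 300 respondents x 8 items.
rng = np.random.default_rng(0)
trait = rng.standard_normal(300)
items = np.clip(np.rint(3 + trait[:, None] + rng.normal(0, 1.2, (300, 8))), 1, 5)

print(corrected_item_total(items).round(2))
```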
Yongbi Zhi; Ali Derakhshan – Asia-Pacific Education Researcher, 2025
Considering the key role of organizational commitment in teachers' professional performance, many researchers have studied the predictors of this variable in different educational institutions. However, a brief look at the pertinent literature shows that most previous studies on this variable have been conducted in general education…
Descriptors: Language Teachers, Self Control, Factor Analysis, Measurement Techniques
D'Urso, E. Damiano; Tijmstra, Jesper; Vermunt, Jeroen K.; De Roover, Kim – Educational and Psychological Measurement, 2023
Assessing the measurement model (MM) of self-report scales is crucial to obtain valid measurements of individuals' latent psychological constructs. This entails evaluating the number of measured constructs and determining which construct is measured by which item. Exploratory factor analysis (EFA) is the most-used method to evaluate these…
Descriptors: Factor Analysis, Measurement Techniques, Self Evaluation (Individuals), Psychological Patterns
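For the EFA step described here, a minimal sketch using the third-party factor_analyzer package (assumed installed; its FactorAnalyzer class with varimax rotation) is shown below on illustrative two-construct data, purely to demonstrate reading rotated loadings to see which item measures which construct.

```python
import numpy as np
from factor_analyzer import FactorAnalyzer  # assumes the factor_analyzer package is installed

# Illustrative data: two constructs, five items each.
rng = np.random.default_rng(3)
true_loadings = np.vstack([np.repeat([[0.8, 0.0]], 5, axis=0),
                           np.repeat([[0.0, 0.8]], 5, axis=0)])
X = rng.standard_normal((500, 2)) @ true_loadings.T + 0.5 * rng.standard_normal((500, 10))

fa = FactorAnalyzer(n_factors=2, rotation="varimax")
fa.fit(X)
# Rotated loadings indicate which item measures which construct.
print(np.round(fa.loadings_, 2))
```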
Chylíková, Johana – International Journal of Social Research Methodology, 2020
This study explores the acquiescent response style (ARS) among respondents in the Czech Republic. To analyse ARS, confirmatory factor analysis (CFA) was employed and the response style (RS) was modelled as a latent variable. The RS factor in the CFA model must be validated by its relationship to education and age, i.e. proxies of cognitive…
Descriptors: Foreign Countries, Response Style (Tests), Age Differences, Educational Attainment
Hartono, Wahyu; Hadi, Samsul; Rosnawati, Raden; Retnawati, Heri – Pegem Journal of Education and Instruction, 2023
Researchers design diagnostic assessments to measure students' knowledge structures and processing skills and to provide information about their cognitive attributes. The purpose of this study is to determine the instrument's validity and score reliability, as well as to investigate the use of classical test theory to identify item characteristics. The…
Descriptors: Diagnostic Tests, Test Validity, Item Response Theory, Content Validity
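The classical test theory item characteristics mentioned in the abstract usually come down to item difficulty (proportion correct), item discrimination (corrected item-total correlation), and score reliability (Cronbach's alpha). The NumPy sketch below computes these on illustrative dichotomous data, not the authors' instrument.

```python
import numpy as np

def ctt_item_stats(responses):
    """Classical test theory summaries for a 0/1 response matrix (persons x items)."""
    R = np.asarray(responses, dtype=float)
    k = R.shape[1]
    difficulty = R.mean(axis=0)                      # proportion correct per item
    total = R.sum(axis=1)
    discrimination = np.array([np.corrcoef(R[:, j], total - R[:, j])[0, 1]
                               for j in range(k)])   # corrected item-total correlation
    alpha = (k / (k - 1)) * (1 - R.var(axis=0, ddof=1).sum() / total.var(ddof=1))
    return difficulty, discrimination, alpha

# Illustrative dichotomous test data: 400 examinees x 12 items.
rng = np.random.default_rng(4)
ability = rng.standard_normal(400)
b = np.linspace(-1.5, 1.5, 12)
p = 1 / (1 + np.exp(-(ability[:, None] - b)))
data = (rng.uniform(size=p.shape) < p).astype(int)

difficulty, discrimination, alpha = ctt_item_stats(data)
print(difficulty.round(2), discrimination.round(2), round(alpha, 2))
```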
Factor Structure and Psychometric Properties of the Digital Stress Scale in a Chinese College Sample
Chunlei Gao; Mingqing Jian; Ailin Yuan – SAGE Open, 2024
The Digital Stress Scale (DSS) is used to measure digital stress, which is the perceived stress and anxiety associated with social media use. In this study, the Chinese version of the DSS was validated using a sample of 721 Chinese college students, 321 males and 400 females (KMO = 0.923; Bartlett = 5,058.492, p < 0.001). Confirmatory factor…
Descriptors: Factor Structure, Factor Analysis, Psychometrics, Anxiety
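The KMO and Bartlett statistics quoted in the abstract are standard factorability checks run before factor analysis. A minimal sketch using the third-party factor_analyzer helpers calculate_kmo and calculate_bartlett_sphericity (assumed installed) is shown below on illustrative data, not the DSS sample.

```python
import numpy as np
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

# Illustrative one-factor item data standing in for a validation sample.
rng = np.random.default_rng(5)
latent = rng.standard_normal((721, 1))
items = pd.DataFrame(latent @ np.full((1, 10), 0.7) + 0.6 * rng.standard_normal((721, 10)))

kmo_per_item, kmo_total = calculate_kmo(items)
chi_square, p_value = calculate_bartlett_sphericity(items)
print(f"KMO = {kmo_total:.3f}, Bartlett chi-square = {chi_square:.1f}, p = {p_value:.3g}")
```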
Chung, Seungwon; Houts, Carrie – Measurement: Interdisciplinary Research and Perspectives, 2020
Advanced modeling of item response data through the item response theory (IRT) or item factor analysis frameworks is becoming increasingly popular. In the social and behavioral sciences, the underlying structure of tests/assessments is often multidimensional (i.e., more than 1 latent variable/construct is represented in the items). This review…
Descriptors: Item Response Theory, Evaluation Methods, Models, Factor Analysis
Paaßen, Benjamin; Dywel, Malwina; Fleckenstein, Melanie; Pinkwart, Niels – International Educational Data Mining Society, 2022
Item response theory (IRT) is a popular method to infer student abilities and item difficulties from observed test responses. However, IRT struggles with two challenges: How to map items to skills if multiple skills are present? And how to infer the ability of new students that have not been part of the training data? Inspired by recent advances…
Descriptors: Item Response Theory, Test Items, Item Analysis, Inferences
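The "new students" problem mentioned here reduces, in standard IRT practice, to scoring a fresh response vector against already-calibrated item parameters. The NumPy sketch below shows a generic grid-based expected a posteriori (EAP) ability estimate under a 2PL model with illustrative parameters; it is not the authors' proposed method.

```python
import numpy as np

def eap_ability(responses, slopes, difficulties, grid=None):
    """Expected a posteriori ability estimate for one response vector under a 2PL model."""
    if grid is None:
        grid = np.linspace(-4, 4, 161)               # quadrature points for theta
    prior = np.exp(-0.5 * grid**2)                   # standard normal prior (unnormalized)
    p = 1 / (1 + np.exp(-slopes[:, None] * (grid[None, :] - difficulties[:, None])))
    resp = np.asarray(responses)[:, None]
    likelihood = np.prod(np.where(resp == 1, p, 1 - p), axis=0)
    posterior = likelihood * prior
    posterior /= posterior.sum()
    return float(np.sum(grid * posterior))

# Illustrative calibrated item parameters and one new student's responses.
slopes = np.array([1.2, 0.8, 1.5, 1.0, 0.9])
difficulties = np.array([-1.0, -0.3, 0.0, 0.5, 1.2])
new_student = [1, 1, 1, 0, 0]

print(round(eap_ability(new_student, slopes, difficulties), 2))
```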
Baris Pekmezci, Fulya; Gulleroglu, H. Deniz – Eurasian Journal of Educational Research, 2019
Purpose: This study aims to investigate, under different conditions, the orthogonality assumption that restricts the use of bifactor item response theory. Method: The study data were generated by simulation in accordance with the bifactor model, under two different models (Model 1 and Model 2).…
Descriptors: Item Response Theory, Accuracy, Item Analysis, Correlation
Park, Sung Eun; Ahn, Soyeon; Zopluoglu, Cengiz – Educational and Psychological Measurement, 2021
This study presents a new approach to synthesizing differential item functioning (DIF) effect size: First, using correlation matrices from each study, we perform a multigroup confirmatory factor analysis (MGCFA) that examines measurement invariance of a test item between two subgroups (i.e., focal and reference groups). Then we synthesize, across…
Descriptors: Item Analysis, Effect Size, Difficulty Level, Monte Carlo Methods
Harder, Joseph; Abuhamdieh, Ayman H.; Weber, Peter – Journal of Educators Online, 2021
The present study operationalizes perceived positive regard in the form of a practical measure that can be applied in distance delivery settings. We collected data by surveying distance students at our university. The questions pertained to the quality of learning and the positive regard of the instructor as perceived by the students. Analytical…
Descriptors: Distance Education, College Students, Student Attitudes, Likert Scales

