Source: Educational and Psychological Measurement (153)
Audience: Researchers (1)
Laws, Policies, & Programs: No Child Left Behind Act 2001 (2)
Showing 1 to 15 of 153 results
Peer reviewed
Schweizer, Karl; Gold, Andreas; Krampen, Dorothea – Educational and Psychological Measurement, 2023
In modeling missing data, the missing data latent variable of the confirmatory factor model accounts for systematic variation associated with missing data so that replacement of what is missing is not required. This study aimed at extending the modeling missing data approach to tetrachoric correlations as input and at exploring the consequences of…
Descriptors: Data, Models, Factor Analysis, Correlation
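The abstract above takes tetrachoric correlations as input to the factor model. For readers unfamiliar with the quantity, a minimal sketch follows; it uses the classic cosine-pi approximation from a 2x2 table of dichotomized scores, which is only an illustration, not the estimation method used in the article (production work typically uses maximum-likelihood estimation):

```python
import math

def tetrachoric_approx(a, b, c, d):
    """Cosine-pi approximation to the tetrachoric correlation for a
    2x2 table of counts [[a, b], [c, d]]. Illustrative helper only;
    ML estimation is preferred in practice."""
    if b == 0 or c == 0:
        return 1.0   # degenerate table: perfect positive association
    if a == 0 or d == 0:
        return -1.0  # degenerate table: perfect negative association
    odds_ratio = (a * d) / (b * c)
    return math.cos(math.pi / (1 + math.sqrt(odds_ratio)))
```

Under independence (equal cell counts) the approximation returns 0, and it approaches plus or minus 1 as the table concentrates on a diagonal.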
Peer reviewed
Yan Xia; Selim Havan – Educational and Psychological Measurement, 2024
Although parallel analysis has been found to be an accurate method for determining the number of factors in many conditions with complete data, its application under missing data is limited. The existing literature recommends that, after using an appropriate multiple imputation method, researchers either apply parallel analysis to every imputed…
Descriptors: Data Interpretation, Factor Analysis, Statistical Inference, Research Problems
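The abstract above presumes familiarity with Horn's parallel analysis. A minimal complete-data sketch, assuming normal random comparison data and a mean-eigenvalue criterion (the article's missing-data extensions are not reproduced here):

```python
import numpy as np

def parallel_analysis(data, n_sims=100, seed=0):
    """Horn's parallel analysis: retain components whose observed
    correlation-matrix eigenvalues exceed the mean eigenvalues of
    random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    # eigvalsh returns ascending order; reverse to descending
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.zeros((n_sims, p))
    for i in range(n_sims):
        sim = rng.standard_normal((n, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False))[::-1]
    return int(np.sum(obs > rand.mean(axis=0)))
```

With multiply imputed data, the recommendation the abstract discusses amounts to running this routine on each imputed data set (or on pooled eigenvalues) and reconciling the results.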
Peer reviewed
Engelhard, George – Educational and Psychological Measurement, 2023
The purpose of this study is to introduce a functional approach for modeling unfolding response data. Functional data analysis (FDA) has been used for examining cumulative item response data, but a functional approach has not been systematically used with unfolding response processes. A brief overview of FDA is presented and illustrated within the…
Descriptors: Data Analysis, Models, Responses, Test Items
Peer reviewed
Su, Hsu-Lin; Chen, Po-Hsi – Educational and Psychological Measurement, 2023
The multidimensional mixture data structure exists in many test (or inventory) conditions. Heterogeneity also relatively exists in populations. Still, some researchers are interested in deciding to which subpopulation a participant belongs according to the participant's factor pattern. Thus, in this study, we proposed three analysis procedures…
Descriptors: Data Analysis, Correlation, Classification, Factor Structure
Peer reviewed
Wu, Tong; Kim, Stella Y.; Westine, Carl – Educational and Psychological Measurement, 2023
For large-scale assessments, data are often collected with missing responses. Despite the wide use of item response theory (IRT) in many testing programs, however, the existing literature offers little insight into the effectiveness of various approaches to handling missing responses in the context of scale linking. Scale linking is commonly used…
Descriptors: Data Analysis, Responses, Statistical Analysis, Measurement
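The abstract above concerns scale linking under missing responses. As background, a minimal sketch of one common linking method, mean-mean linking with the slope fixed at 1 (the Rasch case); the anchor-item difficulties below are invented for illustration:

```python
import numpy as np

# Anchor-item difficulties calibrated separately on Form X and Form Y
# (hypothetical values for illustration).
b_anchor_x = np.array([-1.2, -0.4, 0.3, 1.1])
b_anchor_y = np.array([-0.9, -0.1, 0.6, 1.4])

A = 1.0                                        # slope fixed at 1 (Rasch)
B = b_anchor_x.mean() - A * b_anchor_y.mean()  # intercept from anchor means
b_y_on_x = A * b_anchor_y + B                  # Form Y difficulties on X's scale
```

The study's question is how different missing-response treatments (e.g., incorrect scoring vs. omitted-as-not-presented) perturb these anchor estimates, and hence the linking constants A and B.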
Ziying Li; A. Corinne Huggins-Manley; Walter L. Leite; M. David Miller; Eric A. Wright – Educational and Psychological Measurement, 2022
The unstructured multiple-attempt (MA) item response data in virtual learning environments (VLEs) are often from student-selected assessment data sets, which include missing data, single-attempt responses, multiple-attempt responses, and unknown growth ability across attempts, leading to a complex and complicated scenario for using this kind of…
Descriptors: Sequential Approach, Item Response Theory, Data, Simulation
Peer reviewed
Agley, Jon; Tidd, David; Jun, Mikyoung; Eldridge, Lori; Xiao, Yunyu; Sussman, Steve; Jayawardene, Wasantha; Agley, Daniel; Gassman, Ruth; Dickinson, Stephanie L. – Educational and Psychological Measurement, 2021
Prospective longitudinal data collection is an important way for researchers and evaluators to assess change. In school-based settings, for low-risk and/or likely-beneficial interventions or surveys, data quality and ethical standards are both arguably stronger when using a waiver of parental consent--but doing so often requires the use of…
Descriptors: Data Analysis, Longitudinal Studies, Data Collection, Intervention
Peer reviewed
Cheng, Ying; Shao, Can – Educational and Psychological Measurement, 2022
Computer-based and web-based testing have become increasingly popular in recent years. Their popularity has dramatically expanded the availability of response time data. Compared to the conventional item response data that are often dichotomous or polytomous, response time has the advantage of being continuous and can be collected in an…
Descriptors: Reaction Time, Test Wiseness, Computer Assisted Testing, Simulation
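Continuous response times are commonly modeled with a lognormal model in this literature. A minimal simulation sketch under that assumption (the model and all parameter values here are illustrative, not the article's specification):

```python
import numpy as np

# Lognormal response-time sketch: log T_ij = beta_j - tau_i + error,
# with person speed tau and item time intensity beta (log-seconds).
rng = np.random.default_rng(0)
n_persons, n_items = 1000, 20
tau = rng.normal(0.0, 0.3, n_persons)    # person speed (mean 0 by design)
beta = rng.normal(4.0, 0.4, n_items)     # item time intensity
sigma = 0.5                              # residual SD on the log scale
log_t = beta[None, :] - tau[:, None] + rng.normal(0.0, sigma, (n_persons, n_items))

# Naive moment estimate of item time intensity: the column mean of log times.
# (Valid here because mean speed is ~0; real applications estimate speed jointly.)
beta_hat = log_t.mean(axis=0)
```

Because response time is continuous, item parameters like beta can be recovered precisely even from moderate samples, which is part of the appeal the abstract notes.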
Peer reviewed
Goretzko, David; Heumann, Christian; Bühner, Markus – Educational and Psychological Measurement, 2020
Exploratory factor analysis is a statistical method commonly used in psychological research to investigate latent variables and to develop questionnaires. Although such self-report questionnaires are prone to missing values, there is not much literature on this topic with regard to exploratory factor analysis--and especially the process of factor…
Descriptors: Factor Analysis, Data Analysis, Research Methodology, Psychological Studies
Peer reviewed
Zopluoglu, Cengiz – Educational and Psychological Measurement, 2020
A mixture extension of Samejima's continuous response model for continuous measurement outcomes and its estimation through a heuristic approach based on limited-information factor analysis is introduced. Using an empirical data set, it is shown that two groups of respondents that differ both qualitatively and quantitatively in their response…
Descriptors: Item Response Theory, Measurement, Models, Heuristics
Peer reviewed
van Dijk, Wilhelmina; Schatschneider, Christopher; Al Otaiba, Stephanie; Hart, Sara A. – Educational and Psychological Measurement, 2022
Complex research questions often need large samples to obtain accurate estimates of parameters and adequate power. Combining extant data sets into a large, pooled data set is one way this can be accomplished without expending resources. Measurement invariance (MI) modeling is an established approach to ensure participant scores are on the same…
Descriptors: Sample Size, Data Analysis, Goodness of Fit, Measurement
Peer reviewed
Mansolf, Maxwell; Vreeker, Annabel; Reise, Steven P.; Freimer, Nelson B.; Glahn, David C.; Gur, Raquel E.; Moore, Tyler M.; Pato, Carlos N.; Pato, Michele T.; Palotie, Aarno; Holm, Minna; Suvisaari, Jaana; Partonen, Timo; Kieseppä, Tuula; Paunio, Tiina; Boks, Marco; Kahn, René; Ophoff, Roel A.; Bearden, Carrie E.; Loohuis, Loes Olde; Teshiba, Terri; deGeorge, Daniella; Bilder, Robert M. – Educational and Psychological Measurement, 2020
Large-scale studies spanning diverse project sites, populations, languages, and measurements are increasingly important to relate psychological to biological variables. National and international consortia already are collecting and executing mega-analyses on aggregated data from individuals, with different measures on each person. In this…
Descriptors: Item Response Theory, Data Analysis, Measurement, Validity
Peer reviewed
Montoya, Amanda K.; Edwards, Michael C. – Educational and Psychological Measurement, 2021
Model fit indices are being increasingly recommended and used to select the number of factors in an exploratory factor analysis. Growing evidence suggests that the recommended cutoff values for common model fit indices are not appropriate for use in an exploratory factor analysis context. A particularly prominent problem in scale evaluation is the…
Descriptors: Goodness of Fit, Factor Analysis, Cutting Scores, Correlation
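The abstract above turns on fit-index cutoffs. For concreteness, a sketch of one such index, RMSEA, computed from a model chi-square; this is a standard textbook formula, not a result from the article:

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation from a model chi-square,
    its degrees of freedom, and sample size n (illustrative helper)."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
```

The cutoff problem the abstract raises is that thresholds like RMSEA < 0.05, calibrated for confirmatory models, may select the wrong number of factors when applied mechanically in exploratory factor analysis.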
Xue, Kang; Huggins-Manley, Anne Corinne; Leite, Walter – Educational and Psychological Measurement, 2022
In data collected from virtual learning environments (VLEs), item response theory (IRT) models can be used to guide the ongoing measurement of student ability. However, such applications of IRT rely on unbiased item parameter estimates associated with test items in the VLE. Without formal piloting of the items, one can expect a large amount of…
Descriptors: Virtual Classrooms, Artificial Intelligence, Item Response Theory, Item Analysis
Peer reviewed
Spratto, Elisabeth M.; Leventhal, Brian C.; Bandalos, Deborah L. – Educational and Psychological Measurement, 2021
In this study, we examined the results and interpretations produced from two different IRTree models--one using paths consisting of only dichotomous decisions, and one using paths consisting of both dichotomous and polytomous decisions. We used data from two versions of an impulsivity measure. In the first version, all the response options had…
Descriptors: Comparative Analysis, Item Response Theory, Decision Making, Data Analysis
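The abstract above compares IRTree models built from dichotomous decision paths. A minimal sketch of the idea for a 4-category item, assuming a simple two-node tree (direction first, then extremity); the structure and parameter names are illustrative, not the article's models:

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def irtree_probs(theta_dir, theta_ext, b_dir, b_ext):
    """Category probabilities under a hypothetical two-node IRTree:
    node 1 decides agree vs. disagree, node 2 decides extreme vs.
    moderate within the chosen side."""
    p_agree = logistic(theta_dir - b_dir)
    p_extreme = logistic(theta_ext - b_ext)
    return {
        "strongly disagree": (1 - p_agree) * p_extreme,
        "disagree": (1 - p_agree) * (1 - p_extreme),
        "agree": p_agree * (1 - p_extreme),
        "strongly agree": p_agree * p_extreme,
    }
```

Each response category's probability is the product of the node probabilities along its path, which is what lets the tree separate trait content (direction) from response style (extremity).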