Showing 1 to 15 of 35 results
Peer reviewed
Wu, Tong; Kim, Stella Y.; Westine, Carl – Educational and Psychological Measurement, 2023
For large-scale assessments, data are often collected with missing responses. Despite the wide use of item response theory (IRT) in many testing programs, the existing literature offers little insight into the effectiveness of various approaches to handling missing responses in the context of scale linking. Scale linking is commonly used…
Descriptors: Data Analysis, Responses, Statistical Analysis, Measurement
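As background for the linking setting this abstract describes, the classic mean-sigma transformation places parameter estimates from one test form onto the scale of another by matching the mean and standard deviation of common-item difficulties. A minimal sketch in Python, assuming two arrays of anchor-item difficulty estimates; all names and numbers are illustrative, not taken from the article:

```python
import numpy as np

def mean_sigma_link(b_source, b_target):
    """Mean-sigma linking: find A, B so that A*b_source + B
    matches the scale of b_target (common-item difficulties)."""
    b_source, b_target = np.asarray(b_source), np.asarray(b_target)
    A = b_target.std(ddof=1) / b_source.std(ddof=1)
    B = b_target.mean() - A * b_source.mean()
    return A, B

# Hypothetical difficulty estimates for five common items on two forms
b_form_x = [-1.2, -0.4, 0.1, 0.8, 1.5]
b_form_y = [-1.0, -0.2, 0.3, 1.1, 1.9]

A, B = mean_sigma_link(b_form_x, b_form_y)
theta_x = 0.5                 # an ability estimate on form X's scale
theta_on_y = A * theta_x + B  # the same ability on form Y's scale
print(A, B, theta_on_y)
```

Because the linking constants are computed from item parameter estimates, any bias introduced by the missing-response treatment propagates directly into A and B, which is why the handling method matters here.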
Peer reviewed
Sünbül, Seçil Ömür – International Journal of Evaluation and Research in Education, 2018
This study aimed to investigate the impact of different missing data handling methods on DINA model parameter estimation and classification accuracy. Simulated data were used, generated by manipulating the number of items and sample size. In the generated data, two different missing data mechanisms…
Descriptors: Data, Test Items, Sample Size, Statistical Analysis
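For readers unfamiliar with the DINA (deterministic inputs, noisy "and" gate) model named in this abstract, its response rule is simple: an examinee answers item j correctly with probability 1 − s_j (one minus the slip parameter) when all attributes the Q-matrix requires for that item are mastered, and with probability g_j (the guessing parameter) otherwise. A small simulation sketch; every parameter value below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Q-matrix: rows = items, cols = attributes (1 = attribute required)
Q = np.array([[1, 0], [0, 1], [1, 1]])
slip  = np.array([0.10, 0.15, 0.20])   # s_j: slip probabilities
guess = np.array([0.20, 0.25, 0.15])   # g_j: guessing probabilities

# Attribute mastery patterns for four hypothetical examinees
alpha = np.array([[0, 0], [1, 0], [0, 1], [1, 1]])

# eta = 1 iff the examinee masters every attribute the item requires
eta = (alpha[:, None, :] >= Q[None, :, :]).all(axis=2).astype(int)

# DINA response rule: P(X=1) = (1 - s)^eta * g^(1 - eta)
p_correct = np.where(eta == 1, 1 - slip, guess)
responses = rng.binomial(1, p_correct)
print(responses)
```

A missingness mechanism of the kind the study manipulates would then be imposed on `responses` before estimation.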
Peer reviewed
Harring, Jeffrey R.; Johnson, Tessa L. – Educational Measurement: Issues and Practice, 2020
In this digital ITEMS module, Dr. Jeffrey Harring and Ms. Tessa Johnson introduce the linear mixed effects (LME) model as a flexible general framework for simultaneously modeling continuous repeated measures data with a scientifically defensible function that adequately summarizes both individual change and the average response. The module…
Descriptors: Educational Assessment, Data Analysis, Longitudinal Studies, Case Studies
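As context for the module's topic, an LME growth model combines a fixed average trajectory with person-specific random intercepts and slopes. A minimal sketch using Python's statsmodels on simulated repeated measures (the module itself may use other software; the data and names here are invented):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subjects, n_waves = 100, 4

# Simulate repeated measures: each subject gets a random intercept and slope
subj = np.repeat(np.arange(n_subjects), n_waves)
time = np.tile(np.arange(n_waves), n_subjects)
u0 = rng.normal(0, 1.0, n_subjects)[subj]   # random intercepts
u1 = rng.normal(0, 0.3, n_subjects)[subj]   # random slopes
score = 50 + 2.0 * time + u0 + u1 * time + rng.normal(0, 1.0, subj.size)

data = pd.DataFrame({"subject": subj, "time": time, "score": score})

# LME: fixed effect of time, random intercept and slope per subject
model = smf.mixedlm("score ~ time", data, groups=data["subject"],
                    re_formula="~time")
result = model.fit()
print(result.summary())
```

The fixed `time` coefficient summarizes the average response, while the estimated random-effect variances capture individual change, mirroring the two goals the abstract names.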
Peer reviewed
Paul J. Walter; Edward Nuhfer; Crisel Suarez – Numeracy, 2021
We introduce an approach for making a quantitative comparison of the item response curves (IRCs) of any two populations on a multiple-choice test instrument. In this study, we employ simulated and actual data. We apply our approach to a dataset of 12,187 participants on the 25-item Science Literacy Concept Inventory (SLCI), which includes ample…
Descriptors: Item Analysis, Multiple Choice Tests, Simulation, Data Analysis
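An empirical item response curve (IRC) of the kind compared in this study plots, for one item, the proportion of correct answers at each total-score level. A generic sketch of tabulating and contrasting two populations' IRCs; this illustrates the general idea only, not the authors' specific comparison statistic:

```python
import numpy as np

def empirical_irc(responses, item, n_items):
    """Proportion correct on `item` at each total-score level."""
    total = responses.sum(axis=1)
    return np.array([responses[total == t, item].mean()
                     if (total == t).any() else np.nan
                     for t in range(n_items + 1)])

rng = np.random.default_rng(1)
n_items = 25
pop_a = (rng.random((2000, n_items)) < 0.6).astype(int)  # toy data
pop_b = (rng.random((2000, n_items)) < 0.5).astype(int)

irc_a = empirical_irc(pop_a, item=0, n_items=n_items)
irc_b = empirical_irc(pop_b, item=0, n_items=n_items)

# One simple contrast: mean absolute gap where both curves are defined
mask = ~np.isnan(irc_a) & ~np.isnan(irc_b)
print(np.abs(irc_a[mask] - irc_b[mask]).mean())
```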
Peer reviewed
Schweizer, Karl; Troche, Stefan – Educational and Psychological Measurement, 2018
In confirmatory factor analysis, quite similar measurement models serve to detect the difficulty factor and the factor due to the item-position effect. The item-position effect refers to the increasing dependency among the responses to successively presented items of a test, whereas the difficulty factor is ascribed to the wide range of…
Descriptors: Investigations, Difficulty Level, Factor Analysis, Models
Peer reviewed
Pichette, François; Béland, Sébastien; Jolani, Shahab; Lesniewska, Justyna – Studies in Second Language Learning and Teaching, 2015
Researchers are frequently confronted with unanswered questions or items on their questionnaires and tests, due to factors such as item difficulty, lack of testing time, or participant distraction. This paper first presents results from a poll confirming previous claims (Rietveld & van Hout, 2006; Schafer & Graham, 2002) that data…
Descriptors: Language Research, Data Analysis, Simulation, Item Analysis
Peer reviewed
Kalkan, Ömür Kaya; Kara, Yusuf; Kelecioglu, Hülya – International Journal of Assessment Tools in Education, 2018
Missing data are a common problem in datasets obtained by administering educational and psychological tests. It is widely known that the existence of missing observations can lead to serious problems such as biased parameter estimates and inflated standard errors. Most missing data imputation methods are focused on…
Descriptors: Item Response Theory, Statistical Analysis, Data, Test Items
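Two of the simplest treatments compared in this literature are scoring missing responses as incorrect versus imputing them, for example with person means. A small sketch of both on a toy response matrix, using NaN for missing entries; the choice of methods here is illustrative, not the article's full set:

```python
import numpy as np

rng = np.random.default_rng(7)
resp = rng.binomial(1, 0.6, size=(6, 5)).astype(float)
resp[rng.random(resp.shape) < 0.2] = np.nan   # inject ~20% missingness

# Method 1: treat missing as incorrect (simple, but can bias estimates)
as_incorrect = np.nan_to_num(resp, nan=0.0)

# Method 2: person-mean imputation (fill with examinee's observed mean)
person_means = np.nanmean(resp, axis=1, keepdims=True)
person_mean_imputed = np.where(np.isnan(resp), person_means, resp)

print(as_incorrect)
print(person_mean_imputed.round(2))
```

Downstream IRT estimation would then be run on each filled-in matrix, which is where the bias and standard-error effects the abstract mentions show up.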
Peer reviewed
Yalcin, Seher – Eurasian Journal of Educational Research, 2018
Purpose: Studies in the literature have generally demonstrated that the causes of differential item functioning (DIF) are complex and not directly related to defined groups. The purpose of this study is to determine the DIF according to the mixture item response theory (MixIRT) model, based on the latent group approach, as well as the…
Descriptors: Item Response Theory, Test Items, Test Bias, Error of Measurement
Peer reviewed
Murawska, Jaclyn M.; Walker, David A. – Mid-Western Educational Researcher, 2017
In this commentary, we offer a set of visual tools that can assist education researchers, especially those in the field of mathematics, in developing cohesiveness from a mixed methods perspective, commencing at a study's research questions and literature review, through its data collection and analysis, and finally to its results. This expounds…
Descriptors: Mixed Methods Research, Research Methodology, Visual Aids, Research Tools
Peer reviewed
Gómez-Benito, Juana; Hidalgo, Maria Dolores; Zumbo, Bruno D. – Educational and Psychological Measurement, 2013
The objective of this article was to find an optimal decision rule for identifying polytomous items with large or moderate amounts of differential functioning. The effectiveness of combining statistical tests with effect size measures was assessed using logistic discriminant function analysis and two effect size measures: R² and…
Descriptors: Item Analysis, Test Items, Effect Size, Statistical Analysis
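The article applies logistic discriminant function analysis to polytomous items; a closely related and widely taught procedure for the dichotomous case is logistic regression DIF, where the gain in pseudo-R² from adding group terms serves as the effect size. The sketch below shows that substitute procedure with statsmodels on simulated data; all names, values, and the use of McFadden's pseudo-R² are assumptions for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2000
theta = rng.normal(size=n)             # matching variable (e.g., total score)
group = rng.integers(0, 2, size=n)     # 0 = reference, 1 = focal
# Simulate uniform DIF: the focal group finds the item harder
p = 1 / (1 + np.exp(-(1.2 * theta - 0.5 * group)))
y = rng.binomial(1, p)
data = pd.DataFrame({"y": y, "theta": theta, "group": group})

# Compare nested models: ability only vs. ability + group + interaction
m0 = smf.logit("y ~ theta", data).fit(disp=0)
m1 = smf.logit("y ~ theta + group + theta:group", data).fit(disp=0)

# Pseudo-R^2 gain as a DIF effect size; pair it with the significance test
r2_gain = m1.prsquared - m0.prsquared
print(f"pseudo-R^2 gain: {r2_gain:.4f}")
```

Combining the significance test with an effect-size threshold, as the abstract describes, guards against flagging trivially small DIF in large samples.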
Livingston, Samuel A. – Educational Testing Service, 2014
This booklet grew out of a half-day class on equating that author Samuel Livingston teaches for new statistical staff at Educational Testing Service (ETS). The class is a nonmathematical introduction to the topic, emphasizing conceptual understanding and practical applications. The class consists of illustrated lectures, interspersed with…
Descriptors: Equated Scores, Scoring, Self Evaluation (Individuals), Scores
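One conceptual building block such an introduction typically covers is linear equating, which matches score means and standard deviations across forms: a score x on form X maps to μ_Y + (σ_Y/σ_X)(x − μ_X) on form Y. A tiny worked sketch, with all numbers invented; the booklet itself is nonmathematical, so this is only an aid for this listing:

```python
import numpy as np

def linear_equate(x, scores_x, scores_y):
    """Map form-X score x onto form Y's scale by matching means and SDs."""
    mu_x, sd_x = np.mean(scores_x), np.std(scores_x, ddof=1)
    mu_y, sd_y = np.mean(scores_y), np.std(scores_y, ddof=1)
    return mu_y + (sd_y / sd_x) * (x - mu_x)

# Hypothetical score distributions from two forms given to equivalent groups
scores_x = np.array([12, 15, 18, 20, 22, 25, 28])
scores_y = np.array([10, 14, 16, 19, 21, 24, 30])

print(linear_equate(20, scores_x, scores_y))  # form-X score 20 on Y's scale
```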
Peer reviewed
Long, Caroline; Wendt, Heike – African Journal of Research in Mathematics, Science and Technology Education, 2017
South Africa participated in TIMSS from 1995 to 2015. Over these two decades, some positive changes have been reported on the aggregated mathematics performance patterns of South African learners. This paper focuses on the achievement patterns of South Africa's high-performing Grade 9 learners (n = 3378) in comparison with similar subsamples of…
Descriptors: Foreign Countries, Comparative Analysis, Multiplication, Comparative Education
Peer reviewed
Svetina, Dubravka – Educational and Psychological Measurement, 2013
The purpose of this study was to investigate the effect of complex structure on dimensionality assessment in noncompensatory multidimensional item response models using dimensionality assessment procedures based on DETECT (dimensionality evaluation to enumerate contributing traits) and NOHARM (normal ogive harmonic analysis robust method). Five…
Descriptors: Item Response Theory, Statistical Analysis, Computation, Test Length
Peer reviewed
Olson, Carol H.; Henry, Diana A.; Kliner, Ashley Peck; Kyllo, Alissa; Richter, Chelsea Munson; Charley, Jane; Whitcher, Meagan Chapman; Reinke, Katherine Roth; Tysver, Chelsay Horner; Wagner, Lacey; Walworth, Jessica – Journal of Occupational Therapy, Schools & Early Intervention, 2016
This pre- and posttest multiple-case study examined the effectiveness and usability of the Sensory Processing Measure-Preschool Quick Tips (SPM-P QT) by key stakeholders (parents and teachers) for implementing data-driven intervention to address sensory processing challenges. The Sensory Processing Measure-Preschool (SPM-P) was administered as an…
Descriptors: Preschool Education, Preschool Children, Early Childhood Education, Data
Li, Dan; Benton, Stephen L.; Brown, Ron; Sullivan, Patricia; Ryalls, Kenneth R. – IDEA Center, Inc., 2016
This report describes statistical analyses performed on data collected in the Spring of 2015 from the pilot study of proposed revised and new items in the IDEA Student Ratings of Instruction (SRI) system. Described are the methods employed, results obtained, and decisions made in selecting items for the updated instruments. The procedures occurred…
Descriptors: Statistical Analysis, Rating Scales, Student Evaluation of Teacher Performance, Course Evaluation