Showing 1 to 15 of 27 results
Peer reviewed
Direct link
Zeynep Uzun; Tuncay Ögretmen – Large-scale Assessments in Education, 2025
This study aimed to evaluate item-model fit by equating forms of the PISA 2018 mathematics subtest through concurrent common-item equating in samples from Türkiye, the UK, and Italy. Responses to mathematics subtest Forms 2, 8, and 12 were used. Analyses were performed using the Dichotomous Rasch Model in the WINSTEPS…
Descriptors: Item Response Theory, Test Items, Foreign Countries, Mathematics Tests
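As a rough sketch of the concurrent common-item calibration this abstract describes, the Python snippet below simulates two overlapping forms and estimates all difficulties on one scale. The simulated data and the joint-maximum-likelihood estimator are illustrative assumptions, not the study's WINSTEPS procedure.

```python
# Sketch: concurrent common-item Rasch calibration on simulated data.
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, b):
    """Sample dichotomous Rasch responses for abilities theta, difficulties b."""
    p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
    return (rng.random(p.shape) < p).astype(float)

# Form A covers items 0-9, Form B covers items 5-14; items 5-9 are common.
b_true = np.linspace(-2, 2, 15)
X = np.full((1000, 15), np.nan)
X[:500, :10] = simulate(rng.normal(0.0, 1.0, 500), b_true[:10])
X[500:, 5:] = simulate(rng.normal(0.5, 1.0, 500), b_true[5:])   # abler group

# Concurrent calibration by joint maximum likelihood: alternate Newton steps
# for abilities and difficulties, skipping structurally missing cells.
theta, b = np.zeros(1000), np.zeros(15)
mask, Xf = ~np.isnan(X), np.nan_to_num(X)
for _ in range(50):
    p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
    theta += ((Xf - p) * mask).sum(1) / np.maximum((p * (1 - p) * mask).sum(1), 1e-9)
    theta = np.clip(theta, -6, 6)              # tame perfect/zero scores
    p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
    b -= ((Xf - p) * mask).sum(0) / np.maximum((p * (1 - p) * mask).sum(0), 1e-9)
    b -= b.mean()                              # identification: mean difficulty 0

print("corr(b_hat, b_true):", round(float(np.corrcoef(b, b_true)[0, 1]), 3))
```

Because the common items anchor both forms, the recovered difficulties land on a single scale despite neither group having taken the full item set.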
Peer reviewed
Direct link
Lu, Jing; Wang, Chun – Journal of Educational Measurement, 2020
Item nonresponses are prevalent in standardized testing. They happen either when students fail to reach the end of a test due to a time limit or quitting, or when students choose to omit some items strategically. Oftentimes, item nonresponses are nonrandom, and hence, the missing data mechanism needs to be properly modeled. In this paper, we…
Descriptors: Item Response Theory, Test Items, Standardized Tests, Responses
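A hedged illustration of why nonrandom omissions matter (the simulated data and mechanism below are assumptions, not the authors' model): when the probability of omitting an item depends on ability, different naive scoring rules distort the score-ability relationship to different degrees.

```python
# Sketch: nonignorable (MNAR) item omissions and two naive scoring rules.
import numpy as np

rng = np.random.default_rng(1)
n, k = 5000, 20
theta = rng.normal(0, 1, n)
b = np.linspace(-1.5, 1.5, k)
P = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
resp = (rng.random((n, k)) < P).astype(float)

# MNAR mechanism: lower ability -> higher probability of omitting an item.
p_omit = (1 / (1 + np.exp(2.0 + theta)))[:, None]
resp[rng.random((n, k)) < p_omit] = np.nan

ok = ~np.isnan(resp).all(axis=1)                           # drop all-omit rows
as_wrong = np.nan_to_num(resp[ok], nan=0.0).mean(axis=1)   # omits scored incorrect
ignored = np.nanmean(resp[ok], axis=1)                     # omits treated as not given
print("corr with theta, omits-as-wrong:", round(float(np.corrcoef(as_wrong, theta[ok])[0, 1]), 3))
print("corr with theta, omits-ignored :", round(float(np.corrcoef(ignored, theta[ok])[0, 1]), 3))
```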
Peer reviewed
PDF on ERIC Download full text
Mehmet Fatih Doguyurt; Seref Tan – International Journal of Assessment Tools in Education, 2025
This study investigates how violating the local item independence assumption, by loading certain items onto a second dimension, affects test equating errors in unidimensional, dichotomous tests. The research was designed as a simulation study using data generated based on the PISA 2018 mathematics exam. Analyses were conducted under 36…
Descriptors: Equated Scores, Test Items, Mathematics Tests, International Assessment
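A minimal sketch of the kind of data generation such a simulation involves, under assumed parameter values: a nominally unidimensional test in which one block of items also loads on a nuisance dimension, violating local item independence.

```python
# Sketch: a second dimension contaminating part of a unidimensional test.
import numpy as np

rng = np.random.default_rng(2)
n, k = 2000, 30
theta1 = rng.normal(0, 1, n)              # intended dimension
theta2 = rng.normal(0, 1, n)              # nuisance dimension
b = rng.uniform(-2, 2, k)
a2 = np.zeros(k)
a2[:10] = 0.8                             # items 0-9 also load on the nuisance dimension

logit = theta1[:, None] + a2[None, :] * theta2[:, None] - b[None, :]
X = (rng.random((n, k)) < 1 / (1 + np.exp(-logit))).astype(int)

# The shared nuisance dimension inflates inter-item correlations within
# the contaminated block relative to the locally independent items.
R = np.corrcoef(X, rowvar=False)
iu10, iu20 = np.triu_indices(10, 1), np.triu_indices(20, 1)
print("mean r, contaminated block:", round(float(R[:10, :10][iu10].mean()), 3))
print("mean r, clean block       :", round(float(R[10:, 10:][iu20].mean()), 3))
```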
Peer reviewed
PDF on ERIC Download full text
Kartianom Kartianom; Heri Retnawati; Kana Hidayati – Journal of Pedagogical Research, 2024
Conducting a fair test is important for educational research. Unfair assessments can lead to gender disparities in academic achievement, ultimately resulting in disparities in opportunities, wages, and career choice. Differential Item Functioning (DIF) analysis is presented to provide evidence of whether a test is truly fair, where it does not harm…
Descriptors: Foreign Countries, Test Bias, Item Response Theory, Test Theory
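One standard DIF procedure, shown here only as a generic sketch on simulated data (the paper's own data and method may differ), is the Mantel-Haenszel common odds ratio pooled across score strata.

```python
# Sketch: Mantel-Haenszel DIF for one studied item on simulated data.
import numpy as np

rng = np.random.default_rng(3)
n = 4000
group = rng.integers(0, 2, n)                # 0 = reference, 1 = focal
theta = rng.normal(0, 1, n)
diffs = np.linspace(-1.5, 1.5, 10)
logits = theta[:, None] - diffs[None, :]
logits[:, 0] -= 0.6 * group                  # uniform DIF against the focal group
X = (rng.random((n, 10)) < 1 / (1 + np.exp(-logits))).astype(int)

item = 0
rest = X.sum(axis=1) - X[:, item]            # stratify on the rest score
num = den = 0.0
for s in np.unique(rest):
    m = rest == s
    a = np.sum((group[m] == 0) & (X[m, item] == 1))   # reference, correct
    b = np.sum((group[m] == 0) & (X[m, item] == 0))   # reference, incorrect
    c = np.sum((group[m] == 1) & (X[m, item] == 1))   # focal, correct
    d = np.sum((group[m] == 1) & (X[m, item] == 0))   # focal, incorrect
    t = a + b + c + d
    num += a * d / t
    den += b * c / t
alpha = num / den                             # >1: item favors the reference group
print("MH odds ratio:", round(float(alpha), 2), " ETS delta:", round(float(-2.35 * np.log(alpha)), 2))
```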
Peer reviewed
PDF on ERIC Download full text
Soysal, Sümeyra – Participatory Educational Research, 2023
Applying a measurement instrument developed in one country to other countries raises a critical question of interest, especially in cross-cultural studies. Confirmatory factor analysis (CFA) is the most widely used method for examining the cross-cultural applicability of measurement tools. Although CFA is a sophisticated…
Descriptors: Generalization, Cross Cultural Studies, Measurement Techniques, Factor Analysis
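As a generic illustration of the CFA machinery the abstract refers to, here is a single-group, one-factor model fitted by maximum likelihood with scipy; the actual cross-cultural application involves multi-group models and invariance constraints beyond this sketch, and all parameter values here are assumed.

```python
# Sketch: one-factor CFA fitted by maximum likelihood (single group).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n, p = 1000, 6
lam_true = np.array([0.8, 0.7, 0.6, 0.75, 0.65, 0.7])
eta = rng.normal(0, 1, n)                          # latent factor
Y = eta[:, None] * lam_true[None, :] + rng.normal(0, 0.5, (n, p))
S = np.cov(Y, rowvar=False)                        # sample covariance

def ml_discrepancy(params):
    lam, psi = params[:p], np.exp(params[p:])      # uniquenesses kept positive
    Sigma = np.outer(lam, lam) + np.diag(psi)      # model-implied covariance
    _, logdet = np.linalg.slogdet(Sigma)
    return logdet + np.trace(S @ np.linalg.inv(Sigma))

x0 = np.concatenate([np.full(p, 0.5), np.log(np.full(p, 0.3))])
res = minimize(ml_discrepancy, x0, method="L-BFGS-B")
print("estimated loadings:", np.round(res.x[:p], 2))
```

A cross-cultural check would fit this model per country and compare fit under increasingly strict equality constraints on loadings and intercepts.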
Peer reviewed
Direct link
Marcq, Kseniia; Braeken, Johan – Assessment in Education: Principles, Policy & Practice, 2022
Communication of International Large-Scale Assessment (ILSA) results is dominated by reporting average country achievement scores that conceal individual differences between pupils, schools, and items. Educational research primarily focuses on examining differences between pupils and schools, while differences between items are overlooked. Using a…
Descriptors: Foreign Countries, International Assessment, Achievement Tests, Secondary School Students
Peer reviewed
Direct link
Karadavut, Tugba; Cohen, Allan S.; Kim, Seock-Ho – Measurement: Interdisciplinary Research and Perspectives, 2020
Mixture Rasch (MixRasch) models conventionally assume normal distributions for latent ability. Previous research has shown that the assumption of normality is often unmet in educational and psychological measurement. When normality is assumed, asymmetry in the actual latent ability distribution has been shown to result in extraction of spurious…
Descriptors: Item Response Theory, Ability, Statistical Distributions, Sample Size
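A data-generation sketch only (assumed parameters, not the study's design): drawing latent ability from a skewed rather than normal distribution before generating Rasch responses, the condition under which spurious latent classes have been reported.

```python
# Sketch: Rasch data generated under a skewed latent ability distribution.
import numpy as np
from scipy.stats import skewnorm, skew

rng = np.random.default_rng(5)
n, k = 2000, 25
theta = skewnorm.rvs(a=5, size=n, random_state=rng)   # right-skewed ability
theta = (theta - theta.mean()) / theta.std()          # rescale; skewness preserved
b = np.linspace(-2, 2, k)
X = (rng.random((n, k)) < 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))).astype(int)
print("sample skewness of theta:", round(float(skew(theta)), 2))
```

Fitting a MixRasch model to such data under a normality assumption is where the spurious-class problem arises; the fitting step itself is beyond this sketch.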
Peer reviewed
Direct link
Sahin, Füsun; Colvin, Kimberly F. – Large-scale Assessments in Education, 2020
The item responses of examinees who rapid-guess, that is, who do not spend enough time reading and engaging with an item, will not reflect their true ability on that item. Rapid disengagement refers to rapidly selecting a response to multiple-choice items (i.e., rapid-guessing), omitting items, or providing short, unrelated answers to open-ended items in an…
Descriptors: Guessing (Tests), Item Response Theory, Reaction Time, Learner Engagement
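A simple threshold-based flagging sketch with simulated response times (the paper's operationalization of rapid disengagement is richer): responses faster than a fixed fraction of an item's median time are flagged as rapid.

```python
# Sketch: flagging rapid responses with a normative response-time threshold.
import numpy as np

rng = np.random.default_rng(6)
n, k = 1000, 20
rt = rng.lognormal(mean=3.0, sigma=0.5, size=(n, k))   # seconds, solution behavior
rapid = rng.random((n, k)) < 0.05                      # 5% of responses are rapid
rt[rapid] = rng.uniform(0.5, 3.0, rapid.sum())

threshold = 0.10 * np.median(rt, axis=0)               # 10% of each item's median time
flagged = rt < threshold[None, :]
print("share flagged:", round(float(flagged.mean()), 3))
print("rapid responses caught:", round(float(flagged[rapid].mean()), 3))
```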
Peer reviewed
PDF on ERIC Download full text
Basman, Münevver; Kutlu, Ömer – International Journal of Progressive Education, 2020
Mathematical knowledge and skills are needed to solve problems encountered in daily life. Even when individuals are given equal educational opportunities, differences in achievement are observed, and individual-level factors can affect achievement. One of the most important…
Descriptors: Test Bias, Mathematics Achievement, Gender Differences, Mathematics Tests
Peer reviewed
PDF on ERIC Download full text
Ayan, Cansu; Baris Pekmezci, Fulya – International Journal of Assessment Tools in Education, 2021
Testlets have recognized advantages, such as making it possible to measure higher-order thinking skills and saving time. For this reason, they have often been preferred in many settings, from in-class assessments to large-scale assessments. Because of the increased use of testlets, the following questions are…
Descriptors: Foreign Countries, International Assessment, Secondary School Students, Achievement Tests
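A sketch, with assumed values, of the dependence structure testlets create: a person-by-testlet random effect added to the Rasch logit induces extra correlation among items within the same testlet.

```python
# Sketch: within-testlet dependence from a person-by-testlet random effect.
import numpy as np

rng = np.random.default_rng(7)
n, n_testlets, per = 1500, 5, 4
k = n_testlets * per
theta = rng.normal(0, 1, n)
gamma = rng.normal(0, 0.8, (n, n_testlets))    # person-by-testlet effects
b = rng.uniform(-1.5, 1.5, k)
tl = np.repeat(np.arange(n_testlets), per)     # item -> testlet map

logit = theta[:, None] + gamma[:, tl] - b[None, :]
X = (rng.random((n, k)) < 1 / (1 + np.exp(-logit))).astype(int)

R = np.corrcoef(X, rowvar=False)
same = (tl[:, None] == tl[None, :]) & ~np.eye(k, dtype=bool)
diff = tl[:, None] != tl[None, :]
print("mean r within testlets:", round(float(R[same].mean()), 3))
print("mean r across testlets:", round(float(R[diff].mean()), 3))
```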
Peer reviewed
Direct link
Ercikan, Kadriye; Guo, Hongwen; He, Qiwei – Educational Assessment, 2020
Comparing groups is one of the key uses of large-scale assessment results, which are used to gain insights that inform policy and practice and to examine the comparability of scores and score meaning. Such comparisons typically focus on examinees' final answers to test questions, ignoring the response process differences groups may engage…
Descriptors: Data Use, Responses, Comparative Analysis, Test Bias
Peer reviewed
Direct link
von Davier, Matthias; Yamamoto, Kentaro; Shin, Hyo Jeong; Chen, Henry; Khorramdel, Lale; Weeks, Jon; Davis, Scott; Kong, Nan; Kandathil, Mat – Assessment in Education: Principles, Policy & Practice, 2019
Based on concerns about the item response theory (IRT) linking approach used in the Programme for International Student Assessment (PISA) until 2012 as well as the desire to include new, more complex, interactive items with the introduction of computer-based assessments, alternative IRT linking methods were implemented in the 2015 PISA round. The…
Descriptors: Achievement Tests, Foreign Countries, Secondary School Students, International Assessment
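Not PISA's operational procedure, but a minimal mean/sigma linking sketch of the general IRT linking idea: difficulty estimates for common items from two separate calibrations yield the linear transformation that places one scale onto the other.

```python
# Sketch: mean/sigma linking of common-item difficulties across calibrations.
import numpy as np

rng = np.random.default_rng(8)
b_x = rng.uniform(-2, 2, 12)                   # common items, old (base) scale
A_true, B_true = 1.1, 0.4                      # hidden scale relationship
b_y = (b_x - B_true) / A_true + rng.normal(0, 0.05, 12)   # new calibration

A = b_x.std() / b_y.std()                      # mean/sigma coefficients
B = b_x.mean() - A * b_y.mean()
print("recovered A, B:", round(float(A), 2), round(float(B), 2))
print("max linking residual:", round(float(np.abs(A * b_y + B - b_x).max()), 3))
```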
Peer reviewed
Direct link
Steinmann, Isa; Braeken, Johan; Strietholt, Rolf – AERA Online Paper Repository, 2021
This study investigates consistent and inconsistent respondents to mixed-worded questionnaire scales in large-scale assessments. Mixed-worded scales contain both positively and negatively worded items and are universally applied in different survey and content areas. Due to the changing wording, these scales require a more careful reading and…
Descriptors: Questionnaires, Measurement, Test Items, Response Style (Tests)
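An illustrative consistency check, not the authors' method: after reverse-coding the negatively worded items, respondents whose positive and reversed-negative subscores disagree strongly are flagged as potentially inconsistent. All quantities below are assumed for the illustration.

```python
# Sketch: flagging inconsistent responders to a mixed-worded scale.
import numpy as np

rng = np.random.default_rng(9)
n = 1000
trait = rng.normal(0, 1, n)
pos = np.clip(np.round(3 + trait[:, None] + rng.normal(0, 0.7, (n, 4))), 1, 5)
neg = np.clip(np.round(3 - trait[:, None] + rng.normal(0, 0.7, (n, 4))), 1, 5)

# Hypothetical careless respondents answer negative items as if positively worded.
careless = rng.random(n) < 0.08
neg[careless] = pos[careless]

neg_rev = 6 - neg                              # reverse-code negative items
gap = np.abs(pos.mean(axis=1) - neg_rev.mean(axis=1))
flagged = gap > 1.5                            # large subscore disagreement
print("share flagged:", round(float(flagged.mean()), 3))
print("flag rate among careless:", round(float(flagged[careless].mean()), 3))
```

Note that this simple gap rule mainly catches inconsistency for respondents whose trait level sits away from the scale midpoint; the study's approach is more nuanced.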
Peer reviewed
Direct link
Pettersen, Andreas; Braeken, Johan – International Journal of Science and Mathematics Education, 2019
The implementation of mathematical competencies in school curricula requires assessment instruments to be aligned with this new view on mathematical mastery. However, there are concerns over whether existing assessments capture the wide variety of cognitive skills and abilities that constitute mathematical competence. The current study applied an…
Descriptors: Mathematics Instruction, Mathematics Skills, Mathematics Tests, Cognitive Ability
Peer reviewed
Direct link
Ivanova, Alina; Kardanova, Elena; Merrell, Christine; Tymms, Peter; Hawker, David – Assessment in Education: Principles, Policy & Practice, 2018
Is it possible to compare the results in assessments of mathematics across countries with different curricula, traditions and age of starting school? As part of the iPIPS project, a Russian version of the iPIPS baseline assessment was developed and trial data were available from about 300 Russian children at the start and end of their first year…
Descriptors: Mathematics Instruction, Foreign Countries, Mathematics Tests, Item Response Theory