Publication Date
| In 2026 | 0 |
| Since 2025 | 1 |
| Since 2022 (last 5 years) | 3 |
| Since 2017 (last 10 years) | 14 |
| Since 2007 (last 20 years) | 21 |
Author
| Debeer, Dries | 3 |
| Janssen, Rianne | 3 |
| Shin, Hyo Jeong | 3 |
| Yamamoto, Kentaro | 3 |
| Braeken, Johan | 2 |
| Goldhammer, Frank | 2 |
| Khorramdel, Lale | 2 |
| Sälzer, Christine | 2 |
| Zehner, Fabian | 2 |
| Ark, Tavinder K. | 1 |
| Asil, Mustafa | 1 |
Publication Type
| Reports - Research | 17 |
| Journal Articles | 16 |
| Reports - Evaluative | 4 |
| Speeches/Meeting Papers | 2 |
| Information Analyses | 1 |
Education Level
| Secondary Education | 18 |
| Elementary Education | 3 |
| Elementary Secondary Education | 3 |
| Grade 4 | 3 |
| Intermediate Grades | 3 |
| Junior High Schools | 3 |
| Middle Schools | 3 |
| Grade 9 | 2 |
| High Schools | 2 |
| Early Childhood Education | 1 |
| Grade 3 | 1 |
Location
| Australia | 4 |
| Germany | 4 |
| Netherlands | 3 |
| Finland | 2 |
| Indonesia | 2 |
| New Zealand | 2 |
| Norway | 2 |
| Azerbaijan | 1 |
| Canada | 1 |
| China | 1 |
| China (Shanghai) | 1 |
Assessments and Surveys
| Program for International Student Assessment | 21 |
| Progress in International Reading Literacy Study | 4 |
| Trends in International Mathematics and Science Study | 4 |
Militsa G. Ivanova; Hanna Eklöf; Michalis P. Michaelides – Journal of Applied Testing Technology, 2025
Digital administration of assessments allows for the collection of process data indices, such as response time, which can serve as indicators of rapid-guessing and examinee test-taking effort. Setting a time threshold is essential to distinguish effortful from effortless behavior using item response times. Threshold identification methods may…
Descriptors: Test Items, Computer Assisted Testing, Reaction Time, Achievement Tests
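A minimal sketch of the thresholding idea summarised above, assuming a long-format response-time log and a normative 10%-of-median rule; both the data layout and the rule are hypothetical choices, not the threshold identification methods the article compares:

```python
# Illustrative sketch only: flag rapid-guessing responses with a per-item
# response-time threshold. Column names and the 10%-of-median rule are
# assumptions for illustration.
import pandas as pd

def flag_rapid_guessing(log: pd.DataFrame) -> pd.DataFrame:
    """Expects columns: person, item, response_time (in seconds)."""
    thresholds = 0.10 * log.groupby("item")["response_time"].transform("median")
    out = log.copy()
    out["rapid_guess"] = out["response_time"] < thresholds
    return out

# Response-time effort (RTE) per person = share of responses not flagged:
# rte = 1 - flag_rapid_guessing(log).groupby("person")["rapid_guess"].mean()
```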
Debeer, Dries; Janssen, Rianne; De Boeck, Paul – Journal of Educational Measurement, 2017
When dealing with missing responses, two types of omissions can be discerned: items can be skipped or not reached by the test taker. When the occurrence of these omissions is related to the proficiency process, the missingness is nonignorable. The purpose of this article is to present a tree-based IRT framework for modeling responses and omissions…
Descriptors: Item Response Theory, Test Items, Responses, Testing Problems
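The tree-based idea can be pictured as a sequence of binary nodes; the display below is a generic two-node IRTree (omission node, then correctness node) in standard logistic notation, a sketch rather than the authors' exact parameterisation:

```latex
% Generic two-node IRTree sketch for omissions and responses (illustration only)
\[
P(\text{respond}_{pi}) \;=\; \frac{\exp(\theta^{m}_{p} - \beta^{m}_{i})}{1 + \exp(\theta^{m}_{p} - \beta^{m}_{i})},
\qquad
P(X_{pi} = 1 \mid \text{respond}_{pi}) \;=\; \frac{\exp(\theta_{p} - \beta_{i})}{1 + \exp(\theta_{p} - \beta_{i})}.
\]
% Allowing the missingness propensity \theta^{m}_{p} to correlate with the
% proficiency \theta_{p} is what makes the omissions nonignorable.
```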
Marcq, Kseniia; Braeken, Johan – Assessment in Education: Principles, Policy & Practice, 2022
Communication of International Large-Scale Assessment (ILSA) results is dominated by reporting average country achievement scores that conceal individual differences between pupils, schools, and items. Educational research primarily focuses on examining differences between pupils and schools, while differences between items are overlooked. Using a…
Descriptors: Foreign Countries, International Assessment, Achievement Tests, Secondary School Students
Kurniasih, Nia; Emilia, Emi; Sujatna, Eva Tuckyta Sari – International Journal of Language Testing, 2023
This study aimed to evaluate a PISA-like reading test developed by teachers participating in teacher training for teaching PISA-like reading. To serve this purpose, an experimental test was administered to 107 students aged 15-16 using a set of texts and questions constructed according to the criteria of the PISA Reading test Level 1. Item…
Descriptors: International Assessment, Foreign Countries, Achievement Tests, Secondary School Students
Trendtel, Matthias; Robitzsch, Alexander – Journal of Educational and Behavioral Statistics, 2021
A multidimensional Bayesian item response model is proposed for modeling item position effects. The first dimension corresponds to the ability that is to be measured; the second dimension represents a factor that allows for individual differences in item position effects called persistence. This model allows for nonlinear item position effects on…
Descriptors: Bayesian Statistics, Item Response Theory, Test Items, Test Format
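In generic notation, a two-dimensional position-effect model of the kind summarised above can be sketched as follows; the article's link function and position term may differ from this illustration:

```latex
% Illustrative person-specific position-effect model (not the exact specification)
\[
\operatorname{logit} P\!\left(X_{pik} = 1\right) \;=\; \theta_{p} \;-\; \beta_{i} \;+\; \gamma_{p}\, f(k),
\]
% where k is the position of item i in person p's test form, f(k) is a possibly
% nonlinear position function, and \gamma_{p} is the second ("persistence")
% dimension capturing individual differences in position effects.
```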
Yamamoto, Kentaro; Shin, Hyo Jeong; Khorramdel, Lale – OECD Publishing, 2019
This paper describes and evaluates a multistage adaptive testing (MSAT) design that was implemented for the Programme for International Student Assessment (PISA) 2018 main survey for the major domain of Reading. Through a simulation study, recovery of item response theory model parameters and measurement precision were examined. The PISA 2018 MSAT…
Descriptors: Adaptive Testing, Test Construction, Achievement Tests, Foreign Countries
von Davier, Matthias; Yamamoto, Kentaro; Shin, Hyo Jeong; Chen, Henry; Khorramdel, Lale; Weeks, Jon; Davis, Scott; Kong, Nan; Kandathil, Mat – Assessment in Education: Principles, Policy & Practice, 2019
Based on concerns about the item response theory (IRT) linking approach used in the Programme for International Student Assessment (PISA) until 2012 as well as the desire to include new, more complex, interactive items with the introduction of computer-based assessments, alternative IRT linking methods were implemented in the 2015 PISA round. The…
Descriptors: Achievement Tests, Foreign Countries, Secondary School Students, International Assessment
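As background on what an IRT linking step does, a textbook mean/sigma transformation is sketched below; this is a generic illustration of linking, not the alternative linking methods implemented for PISA 2015 that the article evaluates:

```latex
% Textbook mean/sigma linking of a new ability scale onto a reference scale
\[
\theta^{*} \;=\; A\,\theta + B, \qquad
A \;=\; \frac{\sigma\!\left(b^{\mathrm{ref}}\right)}{\sigma\!\left(b^{\mathrm{new}}\right)}, \qquad
B \;=\; \mu\!\left(b^{\mathrm{ref}}\right) - A\,\mu\!\left(b^{\mathrm{new}}\right),
\]
% where b denotes the difficulty estimates of the common (link) items on each scale.
```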
Steinmann, Isa; Braeken, Johan; Strietholt, Rolf – AERA Online Paper Repository, 2021
This study investigates consistent and inconsistent respondents to mixed-worded questionnaire scales in large-scale assessments. Mixed-worded scales contain both positively and negatively worded items and are universally applied in different survey and content areas. Due to the changing wording, these scales require a more careful reading and…
Descriptors: Questionnaires, Measurement, Test Items, Response Style (Tests)
Zehner, Fabian; Goldhammer, Frank; Sälzer, Christine – Large-scale Assessments in Education, 2018
Background: The gender gap in reading literacy is repeatedly found in large-scale assessments. This study compared girls' and boys' text responses in a reading test applying natural language processing. For this, a theoretical framework was compiled that allows mapping of response features to the preceding cognitive components such as micro- and…
Descriptors: Reading Comprehension, Gender Differences, Reader Response, Reader Text Relationship
Zehner, Fabian; Goldhammer, Frank; Lubaway, Emily; Sälzer, Christine – Education Inquiry, 2019
In 2015, the "Programme for International Student Assessment" (PISA) introduced multiple changes in its study design, the most extensive being the transition from paper- to computer-based assessment. We investigated the differences between German students' text responses to eight reading items from the paper-based study in 2012 to text…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Secondary School Students
Debeer, Dries; Janssen, Rianne – AERA Online Paper Repository, 2016
In educational assessments two types of missing responses can be discerned: items can be "not reached" or "skipped". Both types of omissions may be related to the test taker's proficiency, resulting in non-ignorable missingness. This paper proposes to model not reached and skipped items as part of the response process, using…
Descriptors: International Assessment, Foreign Countries, Achievement Tests, Secondary School Students
Yamamoto, Kentaro; He, Qiwei; Shin, Hyo Jeong; von Davier, Matthias – ETS Research Report Series, 2017
Approximately a third of the Programme for International Student Assessment (PISA) items in the core domains (math, reading, and science) are constructed-response items and require human coding (scoring). This process is time-consuming, expensive, and prone to error as often (a) humans code inconsistently, and (b) coding reliability in…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Secondary School Students
Zumbo, Bruno D.; Liu, Yan; Wu, Amery D.; Shear, Benjamin R.; Olvera Astivia, Oscar L.; Ark, Tavinder K. – Language Assessment Quarterly, 2015
Methods for detecting differential item functioning (DIF) and item bias are typically used in the process of item analysis when developing new measures; adapting existing measures for different populations, languages, or cultures; or more generally validating test score inferences. In 2007 in "Language Assessment Quarterly," Zumbo…
Descriptors: Test Bias, Test Items, Holistic Approach, Models
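For readers who want a concrete baseline, a minimal logistic-regression DIF check for a single item is sketched below; this is a standard baseline technique with a hypothetical data layout, not the ecological framework the article develops:

```python
# Illustrative logistic-regression DIF check for one item (baseline method only;
# column names are hypothetical: correct is 0/1, total_score is the matching
# criterion, group codes the two comparison groups as 0/1).
import pandas as pd
import statsmodels.formula.api as smf

def dif_logistic(df: pd.DataFrame) -> dict:
    base = smf.logit("correct ~ total_score", data=df).fit(disp=False)
    uniform = smf.logit("correct ~ total_score + group", data=df).fit(disp=False)
    nonuniform = smf.logit("correct ~ total_score * group", data=df).fit(disp=False)
    return {
        # Likelihood-ratio statistics for uniform and nonuniform DIF (1 df each).
        "uniform_dif_lr": 2 * (uniform.llf - base.llf),
        "nonuniform_dif_lr": 2 * (nonuniform.llf - uniform.llf),
    }
```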
Sekercioglu, Güçlü; Kogar, Hakan – Novitas-ROYAL (Research on Youth and Language), 2018
The aim of the present study was to examine the measurement invariance (MI) of the reading, mathematics, and science tests in terms of the commonly used languages. It also aimed to examine the differential item functioning (DIF) of the PISA test, whose original items are in English and French, in terms of the language…
Descriptors: Error of Measurement, Item Response Theory, International Assessment, Achievement Tests
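The invariance levels usually tested in such studies can be written in generic multi-group factor-model notation; the sketch below shows the common configural/metric/scalar hierarchy and is an assumption about the setup, since the study may instead rely on IRT-based procedures:

```latex
% Multi-group measurement model for group g (here, test languages)
\[
x_{g} \;=\; \tau_{g} + \Lambda_{g}\,\xi_{g} + \varepsilon_{g}
\]
% configural: same loading pattern across groups
% metric:     \Lambda_{g} = \Lambda for all g
% scalar:     \Lambda_{g} = \Lambda and \tau_{g} = \tau for all g
```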
Rutkowski, Leslie; Rutkowski, David – Scandinavian Journal of Educational Research, 2018
Over time international large-scale assessments have grown in terms of number of studies, cycles, and participating countries, many of which are a heterogeneous mix of economies, languages, cultures, and geography. This heterogeneity has meaningful consequences for comparably measuring both achievement and non-achievement constructs, such as…
Descriptors: Achievement Tests, Foreign Countries, Secondary School Students, International Assessment
