Publication Date
| In 2026 | 0 |
| Since 2025 | 2 |
| Since 2022 (last 5 years) | 5 |
| Since 2017 (last 10 years) | 10 |
| Since 2007 (last 20 years) | 12 |
Descriptor
| Achievement Tests | 12 |
| Computer Assisted Testing | 12 |
| Foreign Countries | 12 |
| International Assessment | 12 |
| Secondary School Students | 12 |
| Test Items | 12 |
| Accuracy | 4 |
| Reaction Time | 4 |
| Responses | 4 |
| Reading Tests | 3 |
| Test Format | 3 |
Author
| Goldhammer, Frank | 3 |
| Sälzer, Christine | 2 |
| Zehner, Fabian | 2 |
| Buerger, Sarah | 1 |
| Eklöf, Hanna | 1 |
| Fink, Aron | 1 |
| Frey, Andreas | 1 |
| Hahnel, Carolin | 1 |
| He, Qiwei | 1 |
| König, Christoph | 1 |
| Wang, Chun | 1 |
Publication Type
| Journal Articles | 11 |
| Reports - Research | 11 |
| Reports - Evaluative | 1 |
Education Level
| Secondary Education | 12 |
| Grade 9 | 1 |
| High Schools | 1 |
| Junior High Schools | 1 |
| Middle Schools | 1 |
Location
| Germany | 4 |
| Australia | 1 |
| China | 1 |
| China (Shanghai) | 1 |
| Denmark | 1 |
| Finland | 1 |
| France | 1 |
| Ireland | 1 |
| Japan | 1 |
| Latin America | 1 |
| Netherlands | 1 |
Assessments and Surveys
| Program for International… | 12 |
Andreas Frey; Christoph König; Aron Fink – Journal of Educational Measurement, 2025
The highly adaptive testing (HAT) design is introduced as an alternative test design for the Programme for International Student Assessment (PISA). The principle of HAT is to be as adaptive as possible when selecting items while accounting for PISA's nonstatistical constraints and addressing issues concerning PISA such as item position effects.…
Descriptors: Adaptive Testing, Test Construction, Alternative Assessment, Achievement Tests
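The HAT design itself is not detailed in this abstract. As a generic illustration of the adaptive principle it builds on, here is a minimal sketch of maximum-information item selection under a 2PL IRT model; the item bank and its parameters are hypothetical, and PISA's nonstatistical constraints (content coverage, position effects) are omitted:

```python
import math

def p_2pl(theta, a, b):
    """Probability of a correct response under the 2PL IRT model."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def fisher_info(theta, a, b):
    """Fisher information of a 2PL item at ability level theta."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

def select_item(theta, item_bank, administered):
    """Pick the not-yet-administered item with maximal information at theta."""
    candidates = [i for i in range(len(item_bank)) if i not in administered]
    return max(candidates, key=lambda i: fisher_info(theta, *item_bank[i]))

# Hypothetical item bank: (discrimination a, difficulty b) pairs.
bank = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 2.0)]
first = select_item(0.5, bank, administered=set())
```

In a real adaptive cycle, the ability estimate `theta` would be updated after each response and the selection repeated, subject to the constraint handling the paper addresses.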
Xiuxiu Tang; Yi Zheng; Tong Wu; Kit-Tai Hau; Hua-Hua Chang – Journal of Educational Measurement, 2025
Multistage adaptive testing (MST) has been recently adopted for international large-scale assessments such as Programme for International Student Assessment (PISA). MST offers improved measurement efficiency over traditional nonadaptive tests and improved practical convenience over single-item-adaptive computerized adaptive testing (CAT). As a…
Descriptors: Reaction Time, Test Items, Achievement Tests, Foreign Countries
Kuang, Huan; Sahin, Fusun – Large-scale Assessments in Education, 2023
Background: Examinees may not make enough effort when responding to test items if the assessment has no consequence for them. These disengaged responses can be problematic in low-stakes, large-scale assessments because they can bias item parameter estimates. However, the amount of bias, and whether this bias is similar across administrations, is…
Descriptors: Test Items, Comparative Analysis, Mathematics Tests, Reaction Time
A Sequential Bayesian Changepoint Detection Procedure for Aberrant Behaviors in Computerized Testing
Jing Lu; Chun Wang; Jiwei Zhang; Xue Wang – Grantee Submission, 2023
Changepoints are abrupt variations in a sequence of data in statistical inference. In educational and psychological assessments, it is pivotal to properly differentiate examinees' aberrant behaviors from solution behavior to ensure test reliability and validity. In this paper, we propose a sequential Bayesian changepoint detection algorithm to…
Descriptors: Bayesian Statistics, Behavior Patterns, Computer Assisted Testing, Accuracy
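The paper's sequential Bayesian procedure is not reproduced in this snippet. As a simple stand-in for the underlying idea, a least-squares single-changepoint detector on a response-time sequence: the split that minimizes within-segment squared error locates the shift from solution behaviour to, say, rapid guessing (the data below are invented):

```python
def best_changepoint(xs):
    """Least-squares single-changepoint search: choose the split k that
    minimizes the summed within-segment squared error of xs[:k] and xs[k:]."""
    def sse(seg):
        m = sum(seg) / len(seg)
        return sum((x - m) ** 2 for x in seg)
    best_k, best = None, float("inf")
    for k in range(1, len(xs)):
        total = sse(xs[:k]) + sse(xs[k:])
        if total < best:
            best_k, best = k, total
    return best_k, best

# Hypothetical log response times: solution behaviour, then rapid guessing.
rts = [3.1, 2.9, 3.2, 3.0, 0.4, 0.5, 0.3]
k, _ = best_changepoint(rts)
```

A Bayesian version would place a prior over `k` and report a posterior rather than a point estimate, which is closer to what the authors propose.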
Lundgren, Erik; Eklöf, Hanna – Educational Research and Evaluation, 2020
The present study used process data from a computer-based problem-solving task as indications of behavioural level of test-taking effort, and explored how behavioural item-level effort related to overall test performance and self-reported effort. Variables were extracted from raw process data and clustered. Four distinct clusters were obtained and…
Descriptors: Computer Assisted Testing, Problem Solving, Response Style (Tests), Test Items
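The clustering method used in the study is not specified in this snippet. As one plausible illustration of clustering extracted process-data variables, a deterministic one-dimensional k-means sketch with fixed initial centers (the feature values are invented):

```python
def kmeans_1d(xs, centers, iters=20):
    """Plain k-means on scalars with fixed initial centers (deterministic)."""
    for _ in range(iters):
        # Assignment step: each point joins its nearest center.
        groups = [[] for _ in centers]
        for x in xs:
            j = min(range(len(centers)), key=lambda c: abs(x - centers[c]))
            groups[j].append(x)
        # Update step: recompute each center as its group's mean.
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers, groups

# Hypothetical per-item effort feature, e.g. seconds of interaction time.
times = [2.0, 2.5, 3.0, 30.0, 28.0, 31.0]
centers, groups = kmeans_1d(times, centers=[0.0, 50.0])
```

With more variables per examinee the same scheme runs on feature vectors with Euclidean distance; the study reports four clusters rather than the two shown here.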
Rivas, Axel; Scasso, Martín Guillermo – Journal of Education Policy, 2021
Since 2000, the PISA test implemented by the OECD has become the prime benchmark for international comparisons in education. The 2015 PISA edition introduced methodological changes that altered the nature of its results. PISA no longer treated non-reached items at the end of the test as valid, assuming that those unanswered questions were more a…
Descriptors: Test Validity, Computer Assisted Testing, Foreign Countries, Achievement Tests
Kroehne, Ulf; Buerger, Sarah; Hahnel, Carolin; Goldhammer, Frank – Educational Measurement: Issues and Practice, 2019
For many years, reading comprehension in the Programme for International Student Assessment (PISA) was measured via paper-based assessment (PBA). In the 2015 cycle, computer-based assessment (CBA) was introduced, raising the question of whether central equivalence criteria required for a valid interpretation of the results are fulfilled. As an…
Descriptors: Reading Comprehension, Computer Assisted Testing, Achievement Tests, Foreign Countries
Lehane, Paula; Scully, Darina; O'Leary, Michael – Irish Educational Studies, 2022
In line with the widespread proliferation of digital technology in everyday life, many countries are now beginning to use computer-based exams (CBEs) in their post-primary education systems. To ensure that these CBEs are delivered in a manner that preserves their fairness, validity, utility and credibility, several factors pertaining to their…
Descriptors: Computer Assisted Testing, Secondary School Students, Culture Fair Tests, Test Validity
Zehner, Fabian; Goldhammer, Frank; Lubaway, Emily; Sälzer, Christine – Education Inquiry, 2019
In 2015, the "Programme for International Student Assessment" (PISA) introduced multiple changes in its study design, the most extensive being the transition from paper- to computer-based assessment. We investigated the differences between German students' text responses to eight reading items from the paper-based study in 2012 to text…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Secondary School Students
Yamamoto, Kentaro; He, Qiwei; Shin, Hyo Jeong; von Davier, Matthias – ETS Research Report Series, 2017
Approximately a third of the Programme for International Student Assessment (PISA) items in the core domains (math, reading, and science) are constructed-response items and require human coding (scoring). This process is time-consuming, expensive, and prone to error as often (a) humans code inconsistently, and (b) coding reliability in…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Secondary School Students
Jerrim, John – Assessment in Education: Principles, Policy & Practice, 2016
The Programme for International Student Assessment (PISA) is an important cross-national study of 15-year-olds' academic achievement. Although it has traditionally been conducted using paper-and-pencil tests, the vast majority of countries will use computer-based assessment from 2015. In this paper, we consider how cross-country comparisons of children's…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Secondary School Students
Zehner, Fabian; Sälzer, Christine; Goldhammer, Frank – Educational and Psychological Measurement, 2016
Automatic coding of short text responses opens new doors in assessment. We implemented and integrated baseline methods of natural language processing and statistical modelling by means of software components that are available under open licenses. The accuracy of automatic text coding is demonstrated by using data collected in the "Programme…
Descriptors: Educational Assessment, Coding, Automation, Responses
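The paper's actual NLP pipeline is not described in this snippet. As a minimal baseline in the same spirit, a nearest-centroid coder over bag-of-words vectors: average the token counts of human-coded training responses per code, then assign new responses to the most similar centroid (the training responses and code labels below are invented):

```python
from collections import Counter
import math

def bow(text):
    """Bag-of-words token counts for a short text response."""
    return Counter(text.lower().split())

def cosine(c1, c2):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(c1[t] * c2[t] for t in c1)
    n1 = math.sqrt(sum(v * v for v in c1.values()))
    n2 = math.sqrt(sum(v * v for v in c2.values()))
    return dot / (n1 * n2) if n1 and n2 else 0.0

def train_centroids(labelled):
    """Sum the bag-of-words vectors of the training responses per code."""
    centroids = {}
    for text, code in labelled:
        centroids.setdefault(code, Counter()).update(bow(text))
    return centroids

def code_response(text, centroids):
    """Assign the code whose centroid is most similar to the response."""
    return max(centroids, key=lambda c: cosine(bow(text), centroids[c]))

# Hypothetical human-coded training responses.
train = [
    ("the author wants to warn readers", "correct"),
    ("it warns about the dangers", "correct"),
    ("i do not know", "incorrect"),
    ("no idea at all", "incorrect"),
]
cents = train_centroids(train)
```

Real systems would add stemming, larger feature spaces, and proper statistical models, but the open-components, baseline-methods framing the abstract describes is compatible with this kind of pipeline.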

