Publication Date
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 1 |
| Since 2007 (last 20 years) | 1 |
Descriptor
| Accuracy | 1 |
| Achievement Tests | 1 |
| Computer Assisted Testing | 1 |
| Efficiency | 1 |
| Foreign Countries | 1 |
| International Assessment | 1 |
| Interrater Reliability | 1 |
| Reading Tests | 1 |
| Responses | 1 |
| Scoring | 1 |
| Secondary School Students | 1 |
Source
| ETS Research Report Series | 1 |
Publication Type
| Journal Articles | 1 |
| Reports - Research | 1 |
Education Level
| Secondary Education | 1 |
Location
| Australia | 1 |
| China | 1 |
| France | 1 |
| Germany | 1 |
| Japan | 1 |
| Netherlands | 1 |
| South Korea | 1 |
Assessments and Surveys
| Program for International Student Assessment (PISA) | 1 |
Peer reviewed
Yamamoto, Kentaro; He, Qiwei; Shin, Hyo Jeong; von Davier, Matthias – ETS Research Report Series, 2017
Approximately a third of the Programme for International Student Assessment (PISA) items in the core domains (math, reading, and science) are constructed-response items and require human coding (scoring). This process is time-consuming, expensive, and prone to error, as often (a) humans code inconsistently, and (b) coding reliability in…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Secondary School Students