| Publication Date | Records |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 0 |
| Since 2007 (last 20 years) | 1 |

| Descriptor | Records |
| --- | --- |
| Classification | 1 |
| Computation | 1 |
| Data Analysis | 1 |
| Decision Making | 1 |
| Elementary Education | 1 |
| Flow Charts | 1 |
| Grade 5 | 1 |
| High Stakes Tests | 1 |
| Interrater Reliability | 1 |
| Monte Carlo Methods | 1 |
| Reliability | 1 |

| Source | Records |
| --- | --- |
| Psychology in the Schools | 1 |

| Publication Type | Records |
| --- | --- |
| Journal Articles | 1 |
| Reports - Descriptive | 1 |

| Education Level | Records |
| --- | --- |
| Elementary Education | 1 |
| Grade 5 | 1 |

Smith, Stacey L.; Vannest, Kimberly J.; Davis, John L. – Psychology in the Schools, 2011
The reliability of data is a critical issue in decision-making for practitioners in schools. Percent Agreement and Cohen's kappa are the two most widely reported indices of inter-rater reliability; however, a recent Monte Carlo study on the reliability of multi-category scales found other indices to be more trustworthy given the type of data…
Descriptors: Monte Carlo Methods, Interrater Reliability, Flow Charts, Computation

Peer reviewed
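For readers unfamiliar with the two agreement indices named in the abstract, the sketch below is a purely illustrative computation (not code from the article, and the ratings are made up) of percent agreement and Cohen's kappa for two raters coding the same items.

```python
# Illustrative sketch: percent agreement and Cohen's kappa for two raters.
# The rating data are invented for demonstration only.
from collections import Counter

rater_a = ["yes", "yes", "no", "no", "yes", "no", "yes", "no", "no", "yes"]
rater_b = ["yes", "no",  "no", "no", "yes", "no", "yes", "yes", "no", "yes"]

n = len(rater_a)

# Percent agreement: proportion of items on which the two raters assign the same code.
observed_agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement: for each category, the product of the raters' marginal
# proportions, summed over all categories observed.
counts_a = Counter(rater_a)
counts_b = Counter(rater_b)
expected_agreement = sum(
    (counts_a[c] / n) * (counts_b[c] / n) for c in set(rater_a) | set(rater_b)
)

# Cohen's kappa corrects observed agreement for agreement expected by chance.
kappa = (observed_agreement - expected_agreement) / (1 - expected_agreement)

print(f"Percent agreement: {observed_agreement:.2%}")
print(f"Cohen's kappa:     {kappa:.3f}")
```

With these made-up ratings, percent agreement is 80% while kappa is 0.60, showing how kappa discounts agreement that would be expected by chance alone; that gap is the kind of discrepancy that motivates comparing different reliability indices.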
