Using Necessary Information to Identify Item Dependence in Passage-Based Reading Comprehension Tests
Baldonado, Angela Argo; Svetina, Dubravka; Gorin, Joanna – Applied Measurement in Education, 2015
Applications of traditional unidimensional item response theory models to passage-based reading comprehension assessment data have been criticized based on potential violations of local independence. However, simple rules for determining dependency, such as including all items associated with a particular passage, may overestimate the dependency…
Descriptors: Reading Tests, Reading Comprehension, Test Items, Item Response Theory
Kobrin, Jennifer L.; Young, John W. – Applied Measurement in Education, 2003
Studied the cognitive equivalence of computerized and paper-and-pencil reading comprehension tests using verbal protocol analysis. Results for 48 college students indicate that the only significant difference between the computerized and paper-and-pencil tests was in the frequency of identifying important information in the passage. (SLD)
Descriptors: Cognitive Processes, College Students, Computer Assisted Testing, Difficulty Level

