| Publication Date | Records |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 2 |
| Since 2007 (last 20 years) | 3 |
| Source | Records |
| --- | --- |
| ACT, Inc. | 2 |
| Journal of Educational Computing Research | 1 |
| Author | Records |
| --- | --- |
| Dandotkar, Srikanth | 1 |
| Gilliam, Sara | 1 |
| Harris, Deborah | 1 |
| Kurby, Christopher A. | 1 |
| Li, Dongmei | 1 |
| Magliano, Joseph P. | 1 |
| McNamara, Danielle S. | 1 |
| Steedle, Jeffrey | 1 |
| Wang, Lu | 1 |
| Woehrle, James | 1 |
| Yi, Qing | 1 |
| Publication Type | Records |
| --- | --- |
| Reports - Research | 3 |
| Journal Articles | 1 |
| Numerical/Quantitative Data | 1 |
| Education Level | Records |
| --- | --- |
| Higher Education | 3 |
| Postsecondary Education | 2 |
| Elementary Secondary Education | 1 |
| High Schools | 1 |
| Secondary Education | 1 |
| Location | Records |
| --- | --- |
| Illinois | 1 |
| Assessments and Surveys | Records |
| --- | --- |
| ACT Assessment | 3 |
| Gates MacGinitie Reading Tests | 1 |
Wang, Lu; Steedle, Jeffrey – ACT, Inc., 2020
In recent ACT mode comparability studies, students testing on laptop or desktop computers earned slightly higher scores on average than students who tested on paper, especially on the ACT® reading and English tests (Li et al., 2017). Equating procedures adjust for such "mode effects" to make ACT scores comparable regardless of testing…
Descriptors: Test Format, Reading Tests, Language Tests, English
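The Wang and Steedle abstract above mentions equating procedures that adjust for mode effects so that scores are comparable across testing modes. As an illustration only, and not ACT's operational method, the minimal sketch below applies a generic linear equating transformation that matches the mean and standard deviation of online-mode scores to the paper-mode scale; all score values and variable names are hypothetical.

```python
# Illustrative sketch of generic linear equating, assuming a randomly
# equivalent groups design; this is NOT ACT's operational equating procedure.
from statistics import mean, stdev

def linear_equate(score, online_scores, paper_scores):
    """Place an online-mode score on the paper-mode scale by matching the
    mean and standard deviation of the two observed score distributions."""
    mu_o, sd_o = mean(online_scores), stdev(online_scores)
    mu_p, sd_p = mean(paper_scores), stdev(paper_scores)
    return mu_p + (sd_p / sd_o) * (score - mu_o)

# Hypothetical score samples: online-mode scores run slightly higher on
# average, so the equated value is pulled down toward the paper scale.
online = [19, 21, 22, 23, 25, 26, 28]
paper = [18, 20, 21, 22, 24, 25, 27]
print(round(linear_equate(23, online, paper), 1))  # ~22.6 on the paper scale
```

In practice, operational equating typically relies on more elaborate designs (for example, equipercentile methods or anchor-item designs) than this simple two-moment adjustment.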
Li, Dongmei; Yi, Qing; Harris, Deborah – ACT, Inc., 2017
In preparation for online administration of the ACT® test, ACT conducted studies to examine the comparability of scores between online and paper administrations, including a timing study in fall 2013, a mode comparability study in spring 2014, and a second mode comparability study in spring 2015. This report presents major findings from these…
Descriptors: College Entrance Examinations, Computer Assisted Testing, Comparative Analysis, Test Format
Kurby, Christopher A.; Magliano, Joseph P.; Dandotkar, Srikanth; Woehrle, James; Gilliam, Sara; McNamara, Danielle S. – Journal of Educational Computing Research, 2012
This study assessed whether and how self-explanation reading training, provided by iSTART (Interactive Strategy Training for Active Reading and Thinking), improves the effectiveness of comprehension processes. iSTART teaches students how to self-explain and which strategies will most effectively aid comprehension from moment-to-moment. We used…
Descriptors: Computer Assisted Testing, Federal Aid, Control Groups, Experimental Groups

