Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 2 |
Since 2016 (last 10 years) | 4 |
Since 2006 (last 20 years) | 4 |
Author
Huntley, Renee M. | 3 |
Li, Dongmei | 2 |
Steedle, Jeffrey | 2 |
Cho, YoungWoo | 1 |
Harris, Deborah | 1 |
Hildenbrand, Lena | 1 |
Pashley, Peter | 1 |
Plake, Barbara S. | 1 |
Wang, Shichao | 1 |
Welch, Catherine J. | 1 |
Wiley, Jennifer | 1 |
Publication Type
Reports - Research | 6 |
Speeches/Meeting Papers | 3 |
Numerical/Quantitative Data | 2 |
Journal Articles | 1 |
Education Level
Higher Education | 4 |
Postsecondary Education | 3 |
Assessments and Surveys
ACT Assessment | 7 |
Gates MacGinitie Reading Tests | 1 |
Wang, Shichao; Li, Dongmei; Steedle, Jeffrey – ACT, Inc., 2021
Speeded tests set time limits so that few examinees can reach all items, and power tests allow most test-takers sufficient time to attempt all items. Educational achievement tests are sometimes described as "timed power tests" because the amount of time provided is intended to allow nearly all students to complete the test, yet this…
Descriptors: Timed Tests, Test Items, Achievement Tests, Testing
Hildenbrand, Lena; Wiley, Jennifer – Grantee Submission, 2021
Many studies have demonstrated that testing students on to-be-learned materials can be an effective learning activity. However, past studies have also shown that some practice test formats are more effective than others. Open-ended recall or short answer practice tests may be effective because the questions prompt deeper processing as students…
Descriptors: Test Format, Outcomes of Education, Cognitive Processes, Learning Activities
Steedle, Jeffrey; Pashley, Peter; Cho, YoungWoo – ACT, Inc., 2020
Three mode comparability studies were conducted on the following Saturday national ACT test dates: October 26, 2019, December 14, 2019, and February 8, 2020. The primary goal of these studies was to evaluate whether ACT scores exhibited mode effects between paper and online testing that would necessitate statistical adjustments to the online…
Descriptors: Test Format, Computer Assisted Testing, College Entrance Examinations, Scores
Li, Dongmei; Yi, Qing; Harris, Deborah – ACT, Inc., 2017
In preparation for online administration of the ACT® test, ACT conducted studies to examine the comparability of scores between online and paper administrations, including a timing study in fall 2013, a mode comparability study in spring 2014, and a second mode comparability study in spring 2015. This report presents major findings from these…
Descriptors: College Entrance Examinations, Computer Assisted Testing, Comparative Analysis, Test Format

Plake, Barbara S.; Huntley, Renee M. – Educational and Psychological Measurement, 1984
Two studies examined the effect of making the correct answer of a multiple-choice test item grammatically consistent with the item stem. American College Testing Assessment experimental items were constructed to investigate grammatical compliance for plural-singular and vowel-consonant agreement. Results suggest…
Descriptors: Grammar, Higher Education, Item Analysis, Multiple Choice Tests
Huntley, Renee M.; And Others – 1990
This study investigated the effect of diagram formats on performance on geometry items to determine whether certain examinees are affected by different item formats and whether such differences arise from the different intellectual demands made by these formats. Thirty-two experimental, multiple-choice geometry items were administered in…
Descriptors: College Bound Students, College Entrance Examinations, Comparative Testing, Diagrams
Huntley, Renee M.; Welch, Catherine J. – 1993
Writers of mathematics test items, especially those who write for standardized tests, are often advised to arrange the answer options in logical order, usually ascending or descending numerical order. In this study, 32 mathematics items were selected for inclusion in four experimental pretest units, each consisting of 16 items. Two versions…
Descriptors: Ability, College Entrance Examinations, Comparative Testing, Distractors (Tests)