Showing 1 to 15 of 17 results
Jeff Allen; Jay Thomas; Stacy Dreyer; Scott Johanningmeier; Dana Murano; Ty Cruce; Xin Li; Edgar Sanchez – ACT Education Corp., 2025
This report describes the process of developing and validating the enhanced ACT, including the changes made to test content and the processes by which these design decisions were implemented. The authors describe how they shared the overall scope of the enhancements, including the initial blueprints, with external expert panels,…
Descriptors: College Entrance Examinations, Testing, Change, Test Construction
Dongmei Li; Shalini Kapoor; Ann Arthur; Chi-Yu Huang; YoungWoo Cho; Chen Qiu; Hongling Wang – ACT Education Corp., 2025
Starting in April 2025, ACT will introduce enhanced forms of the ACT® test for national online testing, with a full rollout to all paper and online test takers in national, state and district, and international test administrations by Spring 2026. ACT introduced major updates by changing the test lengths and testing times, providing more time per…
Descriptors: College Entrance Examinations, Testing, Change, Scoring
Allen, Jeff – ACT, Inc., 2022
The COVID-19 pandemic caused widespread disruptions to the educational system in Arkansas and across the United States. At the onset of the pandemic in March 2020, schools in Arkansas were forced to replace on-site instruction with virtual instruction. During the 2020-2021 academic year, there were three student instructional options:…
Descriptors: COVID-19, Pandemics, Academic Achievement, Electronic Learning
Wang, Shichao; Li, Dongmei; Steedle, Jeffrey – ACT, Inc., 2021
Speeded tests set time limits so that few examinees can reach all items, and power tests allow most test-takers sufficient time to attempt all items. Educational achievement tests are sometimes described as "timed power tests" because the amount of time provided is intended to allow nearly all students to complete the test, yet this…
Descriptors: Timed Tests, Test Items, Achievement Tests, Testing
Wang, Lu; Steedle, Jeffrey – ACT, Inc., 2020
In recent ACT mode comparability studies, students testing on laptop or desktop computers earned slightly higher scores on average than students who tested on paper, especially on the ACT® reading and English tests (Li et al., 2017). Equating procedures adjust for such "mode effects" to make ACT scores comparable regardless of testing…
Descriptors: Test Format, Reading Tests, Language Tests, English
Hildenbrand, Lena; Wiley, Jennifer – Grantee Submission, 2021
Many studies have demonstrated that testing students on to-be-learned materials can be an effective learning activity. However, past studies have also shown that some practice test formats are more effective than others. Open-ended recall or short answer practice tests may be effective because the questions prompt deeper processing as students…
Descriptors: Test Format, Outcomes of Education, Cognitive Processes, Learning Activities
Peer reviewed
Smolinsky, Lawrence; Marx, Brian D.; Olafsson, Gestur; Ma, Yanxia A. – Journal of Educational Computing Research, 2020
Computer-based testing is an increasingly common use of technology that offers advantages to teachers and students. We studied Calculus II classes for science, technology, engineering, and mathematics majors using different testing modes. Three sections comprising 324 students employed paper-and-pencil testing, computer-based testing, or both. Computer tests gave…
Descriptors: Test Format, Computer Assisted Testing, Paper (Material), Calculus
Steedle, Jeffrey; Pashley, Peter; Cho, YoungWoo – ACT, Inc., 2020
Three mode comparability studies were conducted on the following Saturday national ACT test dates: October 26, 2019, December 14, 2019, and February 8, 2020. The primary goal of these studies was to evaluate whether ACT scores exhibited mode effects between paper and online testing that would necessitate statistical adjustments to the online…
Descriptors: Test Format, Computer Assisted Testing, College Entrance Examinations, Scores
Peer reviewed
Koretz, Daniel; Yu, Carol; Mbekeani, Preeya P.; Langi, Meredith; Dhaliwal, Tasmin; Braslow, David – AERA Open, 2016
The current focus on assessing "college and career readiness" raises an empirical question: How do high school tests compare with college admissions tests in predicting performance in college? We explored this using data from the City University of New York and public colleges in Kentucky. These two systems differ in the choice of…
Descriptors: Predictor Variables, College Freshmen, Grade Point Average, College Entrance Examinations
Li, Dongmei; Yi, Qing; Harris, Deborah – ACT, Inc., 2017
In preparation for online administration of the ACT® test, ACT conducted studies to examine the comparability of scores between online and paper administrations, including a timing study in fall 2013, a mode comparability study in spring 2014, and a second mode comparability study in spring 2015. This report presents major findings from these…
Descriptors: College Entrance Examinations, Computer Assisted Testing, Comparative Analysis, Test Format
Peer reviewed
Plake, Barbara S.; Huntley, Renee M. – Educational and Psychological Measurement, 1984
Two studies examined the effect of making the correct answer of a multiple-choice test item grammatically consistent with the item. American College Testing Assessment experimental items were constructed to investigate grammatical compliance for plural-singular and vowel-consonant agreement. Results suggest…
Descriptors: Grammar, Higher Education, Item Analysis, Multiple Choice Tests
Plake, Barbara S.; And Others – 1981
The effects of item arrangement (easy-hard, uniform, and random), test anxiety, and sex were investigated on a 48-item multiple-choice mathematics test assembled from American College Testing Program items and taken by motivated upper-level undergraduates and beginning graduate students. Four measures of anxiety were used: the Achievement Test…
Descriptors: Academic Achievement, Achievement Tests, Difficulty Level, Higher Education
Huntley, Renee M.; Miller, Sherri – 1994
Whether the shaping of test items can itself result in qualitative differences in examinees' comprehension of reading passages was studied using the Pearson-Johnson item classification system. The specific practice studied incorporated, within an item stem, line references that point the examinee to a specific location within a reading passage.…
Descriptors: Ability, Classification, Difficulty Level, High School Students
Huntley, Renee M.; And Others – 1990
This study investigated the effect of diagram formats on performance on geometry items in order to determine whether certain examinees are affected by different item formats and whether such differences arise from the different intellectual demands made by these formats. Thirty-two experimental, multiple-choice geometry items were administered in…
Descriptors: College Bound Students, College Entrance Examinations, Comparative Testing, Diagrams
Peer reviewed
Harris, Deborah J. – Applied Psychological Measurement, 1991
The effects of passage and item scrambling on equipercentile and item-response-theory equating were investigated using two scrambled versions of the American College Testing Program Assessment administered to approximately 25,000 examinees. Results indicate that using a base-form conversion table with a scrambled form has effects at the individual examinee level. (SLD)
Descriptors: College Entrance Examinations, Comparative Testing, Context Effect, Equated Scores