Publication Date
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 1 |
| Since 2017 (last 10 years) | 2 |
| Since 2007 (last 20 years) | 3 |
Author
| Ben Seipel | 1 |
| Cho, YoungWoo | 1 |
| Hou, Xiaodong | 1 |
| Lawrence, Ida M. | 1 |
| Lissitz, Robert W. | 1 |
| Mark L. Davison | 1 |
| Myerberg, N. James | 1 |
| Pashley, Peter | 1 |
| Patrick C. Kennedy | 1 |
| Sarah E. Carlson | 1 |
| Scheuneman, Janice | 1 |
Publication Type
| Reports - Research | 4 |
| Numerical/Quantitative Data | 2 |
| Reports - Evaluative | 2 |
| Journal Articles | 1 |
| Speeches/Meeting Papers | 1 |
Education Level
| Higher Education | 2 |
| Postsecondary Education | 2 |
| Elementary Secondary Education | 1 |
| High Schools | 1 |
| Secondary Education | 1 |
Audience
| Researchers | 1 |
Location
| Maryland | 1 |
Assessments and Surveys
| ACT Assessment | 1 |
| Graduate Record Examinations | 1 |
| SAT (College Admission Test) | 1 |
Ben Seipel; Sarah E. Carlson; Virginia Clinton-Lisell; Mark L. Davison; Patrick C. Kennedy – Grantee Submission, 2022
Originally designed for students in Grades 3 through 5, MOCCA (formerly the Multiple-choice Online Causal Comprehension Assessment) identifies students who struggle with comprehension and helps uncover why they struggle. There are many reasons why students might not comprehend what they read. They may struggle with decoding, or reading words…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Diagnostic Tests, Reading Tests
Steedle, Jeffrey; Pashley, Peter; Cho, YoungWoo – ACT, Inc., 2020
Three mode comparability studies were conducted on the following Saturday national ACT test dates: October 26, 2019, December 14, 2019, and February 8, 2020. The primary goal of these studies was to evaluate whether ACT scores exhibited mode effects between paper and online testing that would necessitate statistical adjustments to the online…
Descriptors: Test Format, Computer Assisted Testing, College Entrance Examinations, Scores
Lissitz, Robert W.; Hou, Xiaodong; Slater, Sharon Cadman – Journal of Applied Testing Technology, 2012
This article investigates several questions regarding the impact of different item formats on measurement characteristics. Constructed response (CR) items and multiple-choice (MC) items obviously differ in their formats and in the resources needed to score them. As such, they have been the subject of considerable discussion regarding the impact of…
Descriptors: Computer Assisted Testing, Scoring, Evaluation Problems, Psychometrics
Lawrence, Ida M.; And Others – 1995
This research summarizes differential item functioning (DIF) results for student-produced response (SPR) items, a non-multiple-choice mathematical item type in the Scholastic Aptitude Test I (SAT I). DIF data from 4 field trial pretest administrations (620 SPR items) and 10 final forms (100 SPR items with samples ranging from about 58,000 to over…
Descriptors: Black Students, Comparative Analysis, Item Bias, Mathematics Tests
Myerberg, N. James – 1996
As is consistent with national trends, the Montgomery County (Maryland) Public School System is exploring the use of instruments other than multiple-choice tests for high-stakes testing. This paper presents information on racial, ethnic, and gender differences in performance on the various types of tests being administered in the district. Sharing…
Descriptors: Achievement Tests, Constructed Response, Educational Assessment, Elementary Education
Scheuneman, Janice – 1985
A number of hypotheses were tested concerning elements of Graduate Record Examinations (GRE) items that might affect the performance of blacks and whites differently. These elements were characteristics common to several items that otherwise measured different concepts. Seven general hypotheses were tested in the form of sixteen specific…
Descriptors: Black Students, College Entrance Examinations, Graduate Study, Higher Education
