| Publication Date | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 1 |
| Since 2007 (last 20 years) | 4 |
| Descriptor | Count |
| --- | --- |
| Multiple Choice Tests | 4 |
| Program Effectiveness | 4 |
| Test Format | 4 |
| Computer Assisted Testing | 2 |
| Gender Differences | 2 |
| Measurement | 2 |
| Racial Differences | 2 |
| Reading Tests | 2 |
| Scores | 2 |
| Statistical Analysis | 2 |
| Test Items | 2 |
| Author | Count |
| --- | --- |
| Cho, YoungWoo | 1 |
| Hou, Xiaodong | 1 |
| In'nami, Yo | 1 |
| Joseph, Dane Christian | 1 |
| Koizumi, Rie | 1 |
| Lissitz, Robert W. | 1 |
| Pashley, Peter | 1 |
| Slater, Sharon Cadman | 1 |
| Steedle, Jeffrey | 1 |
| Publication Type | Count |
| --- | --- |
| Journal Articles | 2 |
| Reports - Research | 2 |
| Dissertations/Theses -… | 1 |
| Numerical/Quantitative Data | 1 |
| Reports - Evaluative | 1 |
| Education Level | Count |
| --- | --- |
| Elementary Secondary Education | 1 |
| High Schools | 1 |
| Higher Education | 1 |
| Postsecondary Education | 1 |
| Secondary Education | 1 |
| Location | Count |
| --- | --- |
| Maryland | 1 |
| Assessments and Surveys | Count |
| --- | --- |
| ACT Assessment | 1 |
Steedle, Jeffrey; Pashley, Peter; Cho, YoungWoo – ACT, Inc., 2020
Three mode comparability studies were conducted on the following Saturday national ACT test dates: October 26, 2019, December 14, 2019, and February 8, 2020. The primary goal of these studies was to evaluate whether ACT scores exhibited mode effects between paper and online testing that would necessitate statistical adjustments to the online…
Descriptors: Test Format, Computer Assisted Testing, College Entrance Examinations, Scores
Joseph, Dane Christian – ProQuest LLC, 2010
Multiple-choice item-writing guideline research is in its infancy. Haladyna (2004) calls for a science of item-writing guideline research, and this study responds to that call. Its purpose was to examine the impact of student ability and of the method for varying the location of correct answers in classroom multiple-choice…
Descriptors: Evidence, Test Format, Guessing (Tests), Program Effectiveness
In'nami, Yo; Koizumi, Rie – Language Testing, 2009
A meta-analysis was conducted on the effects of multiple-choice and open-ended formats on L1 reading, L2 reading, and L2 listening test performance. Fifty-six data sources located in an extensive search of the literature were the basis for the estimates of the mean effect sizes of test format effects. The results using the mixed effects model of…
Descriptors: Test Format, Listening Comprehension Tests, Multiple Choice Tests, Program Effectiveness
Lissitz, Robert W.; Hou, Xiaodong; Slater, Sharon Cadman – Journal of Applied Testing Technology, 2012
This article investigates several questions regarding the impact of different item formats on measurement characteristics. Constructed response (CR) items and multiple choice (MC) items obviously differ in their formats and in the resources needed to score them. As such, they have been the subject of considerable discussion regarding the impact of…
Descriptors: Computer Assisted Testing, Scoring, Evaluation Problems, Psychometrics

