Publication Date

| Period | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 1 |
| Since 2017 (last 10 years) | 4 |
| Since 2007 (last 20 years) | 7 |
Descriptor

| Descriptor | Count |
| --- | --- |
| Gender Differences | 7 |
| Test Format | 7 |
| Test Reliability | 7 |
| Computer Assisted Testing | 5 |
| College Students | 4 |
| Test Items | 4 |
| Difficulty Level | 3 |
| Multiple Choice Tests | 3 |
| Racial Differences | 3 |
| Scores | 3 |
| Test Validity | 3 |
Source

| Source | Count |
| --- | --- |
| ProQuest LLC | 2 |
| Grantee Submission | 1 |
| Journal of Applied Testing Technology | 1 |
| Journal of Educational Computing Research | 1 |
| Novitas-ROYAL (Research on Youth and Language) | 1 |
| Numeracy | 1 |
Author

| Author | Count |
| --- | --- |
| Davison, Mark L. | 1 |
| Guo, Jiajun | 1 |
| Hou, Xiaodong | 1 |
| Lee, HyeSun | 1 |
| Lissitz, Robert W. | 1 |
| Ma, Yanxia A. | 1 |
| Marx, Brian D. | 1 |
| Mircea-Pines, Walter J. | 1 |
| Roohr, Katrina C. | 1 |
| Seipel, Ben | 1 |
| Xu, Jun | 1 |
Publication Type

| Publication Type | Count |
| --- | --- |
| Reports - Research | 5 |
| Journal Articles | 4 |
| Dissertations/Theses -… | 2 |
| Numerical/Quantitative Data | 1 |
Education Level

| Education Level | Count |
| --- | --- |
| Higher Education | 5 |
| Postsecondary Education | 4 |
| Elementary Secondary Education | 1 |
| High Schools | 1 |
| Secondary Education | 1 |
Audience
Location

| Location | Count |
| --- | --- |
| Connecticut | 1 |
| Louisiana | 1 |
| Maryland | 1 |
| Turkey | 1 |
| Virginia | 1 |
Laws, Policies, & Programs

| Program | Count |
| --- | --- |
| Pell Grant Program | 1 |
Assessments and Surveys

| Assessment | Count |
| --- | --- |
| ACT Assessment | 1 |
What Works Clearinghouse Rating
Seipel, Ben; Carlson, Sarah E.; Clinton-Lisell, Virginia; Davison, Mark L.; Kennedy, Patrick C. – Grantee Submission, 2022
Originally designed for students in Grades 3 through 5, MOCCA (formerly the Multiple-choice Online Causal Comprehension Assessment) identifies students who struggle with comprehension and helps uncover why they struggle. There are many reasons why students might not comprehend what they read. They may struggle with decoding, or reading words…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Diagnostic Tests, Reading Tests
Smolinsky, Lawrence; Marx, Brian D.; Olafsson, Gestur; Ma, Yanxia A. – Journal of Educational Computing Research, 2020
Computer-based testing is an expanding use of technology that offers advantages to teachers and students. We studied Calculus II classes for science, technology, engineering, and mathematics majors using different testing modes. Three sections with 324 students employed paper-and-pencil testing, computer-based testing, or both. Computer tests gave…
Descriptors: Test Format, Computer Assisted Testing, Paper (Material), Calculus
Polat, Murat – Novitas-ROYAL (Research on Youth and Language), 2020
Classroom practices, materials, and teaching methods in language classes have changed considerably in recent decades and continue to evolve; however, the techniques commonly used to test students' foreign language skills have changed little despite the recent awareness of Bloom's taxonomy. Testing units at schools rely mostly on multiple choice…
Descriptors: Multiple Choice Tests, Test Format, Test Items, Difficulty Level
Guo, Jiajun – ProQuest LLC, 2016
Divergent thinking (DT) tests are the most frequently used type of creativity assessment and have been administered in traditional paper-and-pencil format for more than half a century. With the prevalence of computer-based testing and increasing demands for large-scale, faster, and more flexible testing procedures, it is necessary to explore and…
Descriptors: Test Construction, Computer Assisted Testing, Creative Thinking, Creativity Tests
Roohr, Katrina C.; Lee, HyeSun; Xu, Jun; Liu, Ou Lydia; Wang, Zhen – Numeracy, 2017
Quantitative literacy has been identified as an important student learning outcome (SLO) by both the higher education and workforce communities. This paper aims to provide preliminary evidence of the psychometric quality of the pilot forms for "HEIghten" quantitative literacy, a next-generation SLO assessment for students in higher…
Descriptors: Psychometrics, Numeracy, Test Items, Item Analysis
Lissitz, Robert W.; Hou, Xiaodong; Slater, Sharon Cadman – Journal of Applied Testing Technology, 2012
This article investigates several questions regarding the impact of different item formats on measurement characteristics. Constructed response (CR) items and multiple choice (MC) items obviously differ in their formats and in the resources needed to score them. As such, they have been the subject of considerable discussion regarding the impact of…
Descriptors: Computer Assisted Testing, Scoring, Evaluation Problems, Psychometrics
Mircea-Pines, Walter J. – ProQuest LLC, 2009
This dissertation study examined the reliability and validity claims of a modified version of the Spanish Modern Language Association Foreign Language Proficiency Test for Teachers and Advanced Students administered at George Mason University (GMU). The study used the 1999 computerized GMU version that was administered to 277 test-takers via…
Descriptors: College Students, Advanced Students, Second Language Learning, Test Validity
