Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 1 |
Since 2016 (last 10 years) | 1 |
Since 2006 (last 20 years) | 2 |
Descriptor
Multiple Choice Tests | 10 |
Test Format | 10 |
Higher Education | 7 |
Test Items | 5 |
College Students | 4 |
Difficulty Level | 4 |
Test Reliability | 3 |
Test Validity | 3 |
Comparative Analysis | 2 |
Educational Assessment | 2 |
Foreign Countries | 2 |
Source
Journal of Experimental Education | 10 |
Author
Crehan, Kevin | 1 |
Crocker, Linda | 1 |
DiBattista, David | 1 |
Diedenhofen, Birk | 1 |
Foos, Paul W. | 1 |
Fortuna, Glenda | 1 |
Green, Kathy | 1 |
Haladyna, Thomas M. | 1 |
Hancock, Gregory R. | 1 |
Miller, M. David | 1 |
Musch, Jochen | 1 |
Publication Type
Journal Articles | 10 |
Reports - Research | 10 |
Tests/Questionnaires | 1 |
Education Level
Higher Education | 2 |
Postsecondary Education | 2 |
Assessments and Surveys
State Trait Anxiety Inventory | 1 |
Papenberg, Martin; Diedenhofen, Birk; Musch, Jochen – Journal of Experimental Education, 2021
Testwiseness may introduce construct-irrelevant variance to multiple-choice test scores. Presenting response options sequentially has been proposed as a potential solution to this problem. In an experimental validation, we determined the psychometric properties of a test based on the sequential presentation of response options. We created a strong…
Descriptors: Test Wiseness, Test Validity, Test Reliability, Multiple Choice Tests
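
The abstract does not spell out the sequential procedure, but a plausible reading is that each response option is shown alone and must be judged before the next appears, removing the option-comparison cues that testwise examinees exploit. A minimal sketch under that assumption (the `ask` callback and all item content are hypothetical):

```python
# Hypothetical sketch of sequential option presentation: options appear
# one at a time, and each must be judged before the next is revealed,
# so the examinee cannot compare options side by side.
def administer_sequential_item(stem, options, ask):
    """`ask(stem, option)` is a hypothetical callable returning True if
    the examinee accepts the shown option as correct."""
    judgments = []
    for option in options:
        judgments.append(ask(stem, option))  # one option visible at a time
    return judgments
```
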
DiBattista, David; Sinnige-Egger, Jo-Anne; Fortuna, Glenda – Journal of Experimental Education, 2014
The authors assessed the effects of using "none of the above" as an option in a 40-item, general-knowledge multiple-choice test administered to undergraduate students. Examinees who selected "none of the above" were given an incentive to write the correct answer to the question posed. Using "none of the above" as the…
Descriptors: Multiple Choice Tests, Testing, Undergraduate Students, Test Items

Crehan, Kevin; Haladyna, Thomas M. – Journal of Experimental Education, 1991
Two item-writing rules were tested: phrasing stems as questions versus partial sentences; and using the "none-of-the-above" option instead of a specific content option. Results with 228 college students do not support the use of either stem type and provide limited evidence to caution against the "none-of-the-above" option.…
Descriptors: College Students, Higher Education, Multiple Choice Tests, Test Construction

Weiten, Wayne – Journal of Experimental Education, 1982
A comparison of double with single multiple-choice questions yielded significant differences in item difficulty, item discrimination, and internal reliability, but not in concurrent validity. (Author/PN)
Descriptors: Difficulty Level, Educational Testing, Higher Education, Multiple Choice Tests
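
Item difficulty, item discrimination, and internal reliability are standard classical-test-theory statistics; a minimal sketch of how each is typically computed (the 0/1 response matrix is made up for illustration):

```python
import numpy as np

# Toy 0/1 response matrix: rows are examinees, columns are items (made up).
X = np.array([[1, 1, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 1],
              [1, 1, 1, 0],
              [0, 0, 0, 1]], dtype=float)

# Item difficulty: proportion of examinees answering the item correctly.
difficulty = X.mean(axis=0)

# Item discrimination: correlation of each item with the total score on
# the remaining items (corrected item-total, i.e., point-biserial).
total = X.sum(axis=1)
discrimination = np.array(
    [np.corrcoef(X[:, j], total - X[:, j])[0, 1] for j in range(X.shape[1])]
)

# Internal reliability: Cronbach's alpha (KR-20 for dichotomous items).
k = X.shape[1]
alpha = k / (k - 1) * (1 - X.var(axis=0, ddof=1).sum() / total.var(ddof=1))
print(difficulty, discrimination, alpha)
```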

Green, Kathy – Journal of Experimental Education, 1979
Reliabilities and concurrent validities of teacher-made multiple-choice and true-false tests were compared. No significant differences were found even when multiple-choice reliability was adjusted to equate testing time. (Author/MH)
Descriptors: Comparative Testing, Higher Education, Multiple Choice Tests, Test Format
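
Equating reliability for testing time is conventionally done with the Spearman-Brown prophecy formula, which projects the reliability of a test lengthened or shortened by a factor k; a minimal sketch with illustrative numbers (not Green's data):

```python
def spearman_brown(reliability, k):
    """Predicted reliability of a test whose length changes by factor k."""
    return k * reliability / (1 + (k - 1) * reliability)

# Illustrative numbers, not Green's: if each multiple-choice item takes as
# long as 1.5 true-false items, equal testing time effectively shortens
# the multiple-choice test to 2/3 of its length.
print(spearman_brown(0.75, 2 / 3))  # ~0.667
```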

Pajares, Frank; Miller, M. David – Journal of Experimental Education, 1997
The mathematics self-efficacy and problem-solving performance of 327 middle school students were assessed with multiple-choice and open-ended methods. No differences in self-efficacy resulted from the different forms of assessment, although those who took the multiple-choice test had higher scores and better calibration of ability. (SLD)
Descriptors: Ability, Educational Assessment, Mathematics, Middle School Students
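
"Calibration" here means how closely self-efficacy judgments track actual performance. One common operationalization, sketched below with hypothetical data (the study's exact index may differ), compares item-level confidence with correctness:

```python
import numpy as np

# Hypothetical data: per-item confidence ratings (0-1) and correctness (0/1).
confidence = np.array([0.9, 0.8, 0.6, 0.7, 0.4])
correct = np.array([1, 1, 0, 1, 0])

# Calibration bias: mean confidence minus mean accuracy.
# Positive values indicate overconfidence, negative underconfidence.
bias = confidence.mean() - correct.mean()

# Calibration accuracy: mean absolute gap between confidence and outcome.
accuracy = np.abs(confidence - correct).mean()
print(bias, accuracy)
```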

Plake, Barbara S.; And Others – Journal of Experimental Education, 1981
Number-right and elimination scores were analyzed on a college-level mathematics exam assembled from pretest data. Anxiety measures were administered along with the experimental forms to undergraduates. Results suggest that neither test scores nor attitudes are influenced by item order, knowledge thereof, or anxiety level. (Author/GK)
Descriptors: College Mathematics, Difficulty Level, Higher Education, Multiple Choice Tests
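
Number-right scoring counts correct answers; elimination scoring, in the Coombs tradition that this kind of study typically follows, credits each distractor the examinee rules out and penalizes ruling out the keyed answer. A minimal sketch with made-up responses:

```python
def number_right(choices, key):
    """Count of items where the chosen option matches the keyed answer."""
    return sum(choice == answer for choice, answer in zip(choices, key))

def elimination_score(eliminated, key, n_options=4):
    """Coombs-style elimination scoring: +1 for each correctly eliminated
    distractor, -(n_options - 1) for eliminating the keyed answer."""
    score = 0
    for ruled_out, answer in zip(eliminated, key):
        for option in ruled_out:
            score += -(n_options - 1) if option == answer else 1
    return score

key = ["B", "D"]                      # made-up two-item answer key
print(number_right(["B", "A"], key))  # 1
print(elimination_score([{"A", "C"}, {"A", "B", "D"}], key))  # 2 + (1 + 1 - 3) = 1
```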

Foos, Paul W. – Journal of Experimental Education, 1992
Effects of expected form and expected difficulty of a test were examined for 84 college students expecting an easy or difficult multiple-choice or essay examination but taking a combined test. Results support the hypothesis that individuals work harder, rather than reduce their effort, when difficult work is expected. (SLD)
Descriptors: College Students, Difficulty Level, Essay Tests, Expectation

Cognitive Complexity and the Comparability of Multiple-Choice and Constructed-Response Test Formats.
Hancock, Gregory R. – Journal of Experimental Education, 1994
To investigate the ability of multiple-choice tests to assess higher order thinking skills, examinations were constructed as half multiple choice and half constructed response. Results with 90 undergraduate and graduate students indicate that the 2 formats measure similar constructs at different levels of complexity. (SLD)
Descriptors: Cognitive Processes, Comparative Analysis, Constructed Response, Educational Assessment
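
A standard way to ask whether two formats measure similar constructs is to correct their observed correlation for the unreliability of each half (disattenuation); the sketch below uses illustrative numbers, and the study's actual analysis may differ:

```python
def disattenuated_correlation(r_xy, r_xx, r_yy):
    """True-score correlation implied by observed correlation r_xy and
    the reliabilities r_xx and r_yy of the two measures."""
    return r_xy / (r_xx * r_yy) ** 0.5

# Illustrative: an observed cross-format correlation of .55 with format
# reliabilities of .70 and .75 implies a true-score correlation of ~.76.
print(disattenuated_correlation(0.55, 0.70, 0.75))  # ~0.759
```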

Crocker, Linda; Schmitt, Alicia – Journal of Experimental Education, 1987
The effectiveness of a strategy for improving examinees' performance on multiple-choice items was assessed using an aptitude-treatment interaction model. Results showed that generating an answer before selecting a response led to higher performance for low-anxious examinees, but not for highly anxious examinees. (Author/JAZ)
Descriptors: Achievement Tests, Aptitude Treatment Interaction, Behavior Rating Scales, College Students
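
An aptitude-treatment interaction is a statistical interaction: the treatment effect differs across levels of an aptitude variable (here, test anxiety). A minimal sketch of the interaction contrast from 2x2 cell means (numbers are made up):

```python
# Made-up cell means (test performance) for a 2x2 ATI design.
means = {
    ("low_anxiety", "strategy"): 32.0, ("low_anxiety", "control"): 27.0,
    ("high_anxiety", "strategy"): 24.5, ("high_anxiety", "control"): 25.0,
}

# Simple treatment effect within each anxiety group.
effect_low = means[("low_anxiety", "strategy")] - means[("low_anxiety", "control")]
effect_high = means[("high_anxiety", "strategy")] - means[("high_anxiety", "control")]

# Interaction contrast: a nonzero value means the strategy's benefit
# depends on anxiety level -- the ATI pattern the study reports.
print(effect_low - effect_high)  # 5.0 - (-0.5) = 5.5
```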