Arce-Ferrer, Alvaro J.; Bulut, Okan – Journal of Experimental Education, 2019
This study investigated the performance of four widely used data-collection designs in detecting test-mode effects (i.e., computer-based versus paper-based testing). The experimental conditions included four data-collection designs, two test-administration modes, and the availability of an anchor assessment. The test-level and item-level results…
Descriptors: Data Collection, Test Construction, Test Format, Computer Assisted Testing

DiBattista, David; Sinnige-Egger, Jo-Anne; Fortuna, Glenda – Journal of Experimental Education, 2014
The authors assessed the effects of using "none of the above" as an option in a 40-item, general-knowledge multiple-choice test administered to undergraduate students. Examinees who selected "none of the above" were given an incentive to write the correct answer to the question posed. Using "none of the above" as the…
Descriptors: Multiple Choice Tests, Testing, Undergraduate Students, Test Items

Plake, Barbara S. – Journal of Experimental Education, 1980
Three item orderings and two levels of knowledge of ordering were used to study differences in test results, students' perceptions of the test's fairness and difficulty, and students' estimations of their test performance. No significant order effect was found. (Author/GK)
Descriptors: Difficulty Level, Higher Education, Scores, Test Format

Weiten, Wayne – Journal of Experimental Education, 1982
A comparison of double versus single multiple-choice questions yielded significant differences in item difficulty, item discrimination, and internal reliability, but not in concurrent validity. (Author/PN)
Descriptors: Difficulty Level, Educational Testing, Higher Education, Multiple Choice Tests

Plake, Barbara S.; And Others – Journal of Experimental Education, 1981
Number-right and elimination scores were analyzed on a college-level mathematics exam assembled from pretest data. Anxiety measures were administered along with the experimental forms to undergraduates. Results suggest that neither test scores nor attitudes are influenced by item order, knowledge thereof, or anxiety level. (Author/GK)
Descriptors: College Mathematics, Difficulty Level, Higher Education, Multiple Choice Tests

Foos, Paul W. – Journal of Experimental Education, 1992
Effects of expected form and expected difficulty of a test were examined for 84 college students expecting an easy or difficult multiple-choice or essay examination but taking a combined test. Results support the hypothesis that individuals work harder, rather than reduce their effort, when difficult work is expected. (SLD)
Descriptors: College Students, Difficulty Level, Essay Tests, Expectation

Klimko, Ivan P. – Journal of Experimental Education, 1984
The influence of item arrangement on students' total test performance was investigated. Two hierarchical multiple regression analyses were used to analyze the data. The main finding was that item arrangements based on item difficulty did not influence achievement examination performance. (Author/DWH)
Descriptors: Achievement Tests, Cognitive Style, College Students, Difficulty Level