Arce-Ferrer, Alvaro J.; Bulut, Okan – Journal of Experimental Education, 2019
This study investigated the performance of four widely used data-collection designs in detecting test-mode effects (i.e., computer-based versus paper-based testing). The experimental conditions included four data-collection designs, two test-administration modes, and the availability of an anchor assessment. The test-level and item-level results…
Descriptors: Data Collection, Test Construction, Test Format, Computer Assisted Testing

DiBattista, David; Sinnige-Egger, Jo-Anne; Fortuna, Glenda – Journal of Experimental Education, 2014
The authors assessed the effects of using "none of the above" as an option in a 40-item, general-knowledge multiple-choice test administered to undergraduate students. Examinees who selected "none of the above" were given an incentive to write the correct answer to the question posed. Using "none of the above" as the…
Descriptors: Multiple Choice Tests, Testing, Undergraduate Students, Test Items

Plake, Barbara S. – Journal of Experimental Education, 1980
Three item orderings and two levels of knowledge of ordering were used to study differences in test results, students' perceptions of the test's fairness and difficulty, and students' estimations of their test performance. No significant order effect was found. (Author/GK)
Descriptors: Difficulty Level, Higher Education, Scores, Test Format

Crehan, Kevin; Haladyna, Thomas M. – Journal of Experimental Education, 1991
Two item-writing rules were tested: phrasing stems as questions versus partial sentences; and using the "none-of-the-above" option instead of a specific content option. Results with 228 college students do not support the use of either stem type and provide limited evidence to caution against the "none-of-the-above" option.…
Descriptors: College Students, Higher Education, Multiple Choice Tests, Test Construction

Katz, Barry M.; McSweeney, Maryellen – Journal of Experimental Education, 1984
This paper developed and illustrated a technique to analyze categorical data when subjects can appear in any number of categories for multigroup designs. Post hoc procedures to be used in conjunction with the presented statistical test are also developed. The technique is a large sample technique whose small sample properties are as yet unknown.…
Descriptors: Data Analysis, Hypothesis Testing, Mathematical Models, Research Methodology

Weiten, Wayne – Journal of Experimental Education, 1982
A comparison of double as opposed to single multiple-choice questions yielded significant differences in regard to item difficulty, item discrimination, and internal reliability, but not concurrent validity. (Author/PN)
Descriptors: Difficulty Level, Educational Testing, Higher Education, Multiple Choice Tests
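Several of the studies above report classical item statistics such as item difficulty and item discrimination. As background, here is a minimal sketch of how these are conventionally computed for dichotomously scored items (the data matrix is invented for illustration and does not come from any study listed here):

```python
import numpy as np

# Hypothetical 0/1 score matrix: rows = examinees, columns = items.
scores = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
])

# Item difficulty: proportion of examinees answering each item correctly.
difficulty = scores.mean(axis=0)

# Item discrimination: point-biserial correlation between each item and
# the total score on the remaining items (corrected item-total correlation).
totals = scores.sum(axis=1)
discrimination = np.array([
    np.corrcoef(scores[:, j], totals - scores[:, j])[0, 1]
    for j in range(scores.shape[1])
])

print(difficulty)       # proportions correct, e.g. [0.8 0.6 0.2 0.8]
print(discrimination)   # corrected item-total correlations
```

The corrected (rest-score) form of the discrimination index is used here to avoid inflating the correlation by including the item in its own total.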

Green, Kathy – Journal of Experimental Education, 1979
Reliabilities and concurrent validities of teacher-made multiple-choice and true-false tests were compared. No significant differences were found even when multiple-choice reliability was adjusted to equate testing time. (Author/MH)
Descriptors: Comparative Testing, Higher Education, Multiple Choice Tests, Test Format

Schraw, Gregory – Journal of Experimental Education, 1997
The basis of students' confidence in their answers to test items was studied with 95 undergraduates. Results support the domain-general hypothesis that predicts that confidence judgments will be related to performance on a particular test and also to confidence judgments and performance on unrelated tests. (SLD)
Descriptors: Higher Education, Metacognition, Performance Factors, Scores

Plake, Barbara S.; And Others – Journal of Experimental Education, 1981
Number-right and elimination scores were analyzed on a college-level mathematics exam assembled from pretest data. Anxiety measures were administered along with the experimental forms to undergraduates. Results suggest that neither test scores nor attitudes are influenced by item order, knowledge thereof, or anxiety level. (Author/GK)
Descriptors: College Mathematics, Difficulty Level, Higher Education, Multiple Choice Tests

Klimko, Ivan P. – Journal of Experimental Education, 1984
The influence of item arrangement on students' total test performance was investigated. Two hierarchical multiple regression analyses were used to analyze the data. The main finding within the context of this study was that item arrangements based on item difficulties did not influence achievement examination performance. (Author/DWH)
Descriptors: Achievement Tests, Cognitive Style, College Students, Difficulty Level

Peeck, J.; Tillema, H. H. – Journal of Experimental Education, 1978
Subjects were immediately tested on a reading passage, and received feedback after 30 minutes or one day, or no feedback. After a week, subjects identified their original responses to three types of test items. One-day delay of feedback gave better results than the 30-minute delay. (GDC)
Descriptors: Cognitive Processes, Feedback, Foreign Countries, Grade 5