Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 1 |
Since 2016 (last 10 years) | 4 |
Since 2006 (last 20 years) | 5 |
Descriptor
Test Format | 22 |
Higher Education | 12 |
Test Items | 11 |
Multiple Choice Tests | 10 |
College Students | 7 |
Difficulty Level | 7 |
Scores | 6 |
Test Construction | 5 |
Undergraduate Students | 5 |
Foreign Countries | 4 |
Test Validity | 4 |
Source
Journal of Experimental Education | 22 |
Author
Plake, Barbara S. | 2 |
Andre, Thomas | 1 |
Ansley, Timothy N. | 1 |
Arce-Ferrer, Alvaro J. | 1 |
Bandalos, Deborah L. | 1 |
Bulut, Okan | 1 |
Crehan, Kevin | 1 |
Crocker, Linda | 1 |
DiBattista, David | 1 |
Diedenhofen, Birk | 1 |
Eaves, Ronald C. | 1 |
Publication Type
Journal Articles | 22 |
Reports - Research | 22 |
Tests/Questionnaires | 2 |
Education Level
Higher Education | 4 |
Postsecondary Education | 4 |
High Schools | 1 |
Secondary Education | 1 |
Location
Canada | 1 |
Germany | 1 |
Mexico | 1 |
Netherlands | 1 |
Assessments and Surveys
Hidden Figures Test | 1 |
Self Description Questionnaire | 1 |
State Trait Anxiety Inventory | 1 |
Spratto, Elisabeth M.; Bandalos, Deborah L. – Journal of Experimental Education, 2020
Research suggests that certain characteristics of survey items may affect participants' responses. In this study we investigated the impact of several of these characteristics: vague wording, question-versus-statement phrasing, and full-versus-partial labeling of response options. We manipulated survey items according to these characteristics and randomly…
Descriptors: Attitude Measures, Test Format, Test Construction, Factor Analysis
Papenberg, Martin; Diedenhofen, Birk; Musch, Jochen – Journal of Experimental Education, 2021
Testwiseness may introduce construct-irrelevant variance to multiple-choice test scores. Presenting response options sequentially has been proposed as a potential solution to this problem. In an experimental validation, we determined the psychometric properties of a test based on the sequential presentation of response options. We created a strong…
Descriptors: Test Wiseness, Test Validity, Test Reliability, Multiple Choice Tests
Vispoel, Walter Peter; Morris, Carrie Ann; Sun, Linan – Journal of Experimental Education, 2019
In two independent studies of questionnaire administration, respondents completed multidimensional self-concept inventories within four randomized research conditions that mirrored the most common administration formats used in practice: paper booklets with and without answer sheets and computer questionnaires with single versus multiple items per…
Descriptors: Self Concept Measures, Computer Assisted Testing, Questionnaires, Psychometrics
Arce-Ferrer, Alvaro J.; Bulut, Okan – Journal of Experimental Education, 2019
This study investigated the performance of four widely used data-collection designs in detecting test-mode effects (i.e., computer-based versus paper-based testing). The experimental conditions included four data-collection designs, two test-administration modes, and the availability of an anchor assessment. The test-level and item-level results…
Descriptors: Data Collection, Test Construction, Test Format, Computer Assisted Testing
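The anchor-based designs referred to above adjust mode comparisons for pre-existing group differences. As a hedged illustration only (a simplification of my own, not one of the four designs evaluated in the article), the sketch below projects the groups' anchor-score gap onto the total-score scale and subtracts it from the observed paper-versus-computer gap, in the spirit of Tucker-style observed-score adjustments.

# Illustrative sketch of a mode-effect check in a common-anchor design.
# This is an assumed simplification, not the article's analysis.
import numpy as np

def mode_effect_via_anchor(paper_total, paper_anchor, cbt_total, cbt_anchor):
    """Residual paper-vs-computer gap after adjusting for anchor differences."""
    # Group difference in ability, as seen on the shared anchor items
    anchor_gap = np.mean(cbt_anchor) - np.mean(paper_anchor)
    # Project that gap onto the total-score metric using the paper group's
    # regression of total score on anchor score (a Tucker-style adjustment)
    slope = np.cov(paper_total, paper_anchor)[0, 1] / np.var(paper_anchor, ddof=1)
    expected_gap = slope * anchor_gap
    observed_gap = np.mean(cbt_total) - np.mean(paper_total)
    return observed_gap - expected_gap  # leftover gap attributable to test mode
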
DiBattista, David; Sinnige-Egger, Jo-Anne; Fortuna, Glenda – Journal of Experimental Education, 2014
The authors assessed the effects of using "none of the above" as an option in a 40-item, general-knowledge multiple-choice test administered to undergraduate students. Examinees who selected "none of the above" were given an incentive to write the correct answer to the question posed. Using "none of the above" as the…
Descriptors: Multiple Choice Tests, Testing, Undergraduate Students, Test Items

Plake, Barbara S. – Journal of Experimental Education, 1980
Three item orderings and two levels of knowledge of ordering were used to study differences in test results, students' perceptions of the test's fairness and difficulty, and students' estimates of their test performance. No significant order effect was found. (Author/GK)
Descriptors: Difficulty Level, Higher Education, Scores, Test Format

Eaves, Ronald C.; Smith, Earl – Journal of Experimental Education, 1986
The effects of examination format and previous experience with microcomputers on the test scores of 96 undergraduate students were investigated. Results indicated no significant differences in the scores obtained on the two types of test administration (microcomputer and traditional paper and pencil). Computer experience was not an important…
Descriptors: College Students, Computer Assisted Testing, Educational Media, Higher Education

Crehan, Kevin; Haladyna, Thomas M. – Journal of Experimental Education, 1991
Two item-writing rules were tested: phrasing stems as questions versus partial sentences; and using the "none-of-the-above" option instead of a specific content option. Results with 228 college students do not support the use of either stem type and provide limited evidence to caution against the "none-of-the-above" option.…
Descriptors: College Students, Higher Education, Multiple Choice Tests, Test Construction

Katz, Barry M.; McSweeney, Maryellen – Journal of Experimental Education, 1984
This paper developed and illustrated a technique for analyzing categorical data in multigroup designs when subjects can appear in any number of categories. Post hoc procedures to be used in conjunction with the presented statistical test are also developed. The technique is a large-sample procedure whose small-sample properties are as yet unknown.…
Descriptors: Data Analysis, Hypothesis Testing, Mathematical Models, Research Methodology

Andre, Thomas – Journal of Experimental Education, 1990
Two experiments with 254 and 76 undergraduates, respectively, investigated the hypothesis that there would be an effect of questions concerning application of concepts only when there was an interval between study and test. Results confirm the hypothesis regarding application questions inserted in text read by undergraduates. (TJH)
Descriptors: Higher Education, Pretests Posttests, Questioning Techniques, Reading Tests

Weiten, Wayne – Journal of Experimental Education, 1982
A comparison of double versus single multiple-choice questions yielded significant differences in item difficulty, item discrimination, and internal reliability, but not in concurrent validity. (Author/PN)
Descriptors: Difficulty Level, Educational Testing, Higher Education, Multiple Choice Tests
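Item difficulty, item discrimination, and internal reliability have standard classical-test-theory definitions; the sketch below (not the article's code, and the exact estimators used are not reported in the abstract) computes proportion-correct difficulty, corrected point-biserial discrimination, and KR-20 reliability for a 0/1 scored response matrix.

# Standard classical-test-theory item statistics for dichotomous (0/1) items.
# Illustrative sketch only; the article's exact estimators are not stated.
import numpy as np

def item_statistics(responses):
    """responses: examinees x items array of 0/1 scores."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]
    total = responses.sum(axis=1)

    difficulty = responses.mean(axis=0)            # proportion correct per item

    # Corrected point-biserial: item score vs. total score excluding that item
    discrimination = np.array([
        np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
        for j in range(k)
    ])

    # KR-20 internal-consistency reliability
    p = difficulty
    kr20 = (k / (k - 1)) * (1 - np.sum(p * (1 - p)) / total.var(ddof=1))
    return difficulty, discrimination, kr20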

Green, Kathy – Journal of Experimental Education, 1979
Reliabilities and concurrent validities of teacher-made multiple-choice and true-false tests were compared. No significant differences were found even when multiple-choice reliability was adjusted to equate testing time. (Author/MH)
Descriptors: Comparative Testing, Higher Education, Multiple Choice Tests, Test Format
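Adjusting reliability to equate testing time is most commonly done with the Spearman-Brown prophecy formula; the sketch below assumes that is the adjustment meant here (the abstract does not say), projecting reliability for a test lengthened or shortened by a given factor.

# Spearman-Brown prophecy formula (assumed; the abstract does not name the
# adjustment used). length_factor < 1 shortens the test, > 1 lengthens it.
def spearman_brown(reliability, length_factor):
    return (length_factor * reliability) / (1 + (length_factor - 1) * reliability)

# Hypothetical example: a multiple-choice test with reliability .80, shortened
# to 60% of its length so it takes roughly the same time as a true-false test.
adjusted = spearman_brown(0.80, 0.6)   # about 0.71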

Schraw, Gregory – Journal of Experimental Education, 1997
The basis of students' confidence in their answers to test items was studied with 95 undergraduates. Results support the domain-general hypothesis that predicts that confidence judgments will be related to performance on a particular test and also to confidence judgments and performance on unrelated tests. (SLD)
Descriptors: Higher Education, Metacognition, Performance Factors, Scores
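Confidence-performance relations of the kind described above are commonly quantified with simple indices; the sketch below (my illustration, not the article's analysis) computes a within-test confidence-accuracy correlation and an over/underconfidence bias, plus the cross-test correlation the domain-general hypothesis predicts.

# Illustrative confidence-calibration indices (not the article's analysis).
import numpy as np

def within_test_calibration(confidence, correct):
    """confidence: per-item judgments in [0, 1]; correct: per-item 0/1 scores."""
    confidence = np.asarray(confidence, dtype=float)
    correct = np.asarray(correct, dtype=float)
    r = np.corrcoef(confidence, correct)[0, 1]   # does confidence track accuracy?
    bias = confidence.mean() - correct.mean()    # > 0 indicates overconfidence
    return r, bias

def cross_test_relation(mean_confidence_test_a, mean_confidence_test_b):
    """Per-examinee mean confidence on two unrelated tests; a positive r is
    consistent with a domain-general basis for confidence judgments."""
    return np.corrcoef(mean_confidence_test_a, mean_confidence_test_b)[0, 1]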

Pajares, Frank; Miller, M. David – Journal of Experimental Education, 1997
The mathematics self-efficacy and problem-solving performance of 327 middle school students were assessed with multiple-choice and open-ended methods. No differences in self-efficacy resulted from the different forms of assessment, although those who took the multiple-choice test had higher scores and better calibration of ability. (SLD)
Descriptors: Ability, Educational Assessment, Mathematics, Middle School Students

Threadgill-Sowder, Judith; And Others – Journal of Experimental Education, 1985
The purpose of this study was to explore the relationships of certain cognitive variables to problem-solving performance. Tests of cognitive restructuring, spatial ability, and reading comprehension, along with mathematical story-problem tests presented in regular-verbiage, low-verbiage, and drawn formats, were given to students in grades three through seven.…
Descriptors: Cognitive Processes, Cognitive Tests, Elementary Education, Field Dependence Independence