Showing all 13 results
Peer reviewed
Bramley, Tom; Crisp, Victoria – Assessment in Education: Principles, Policy & Practice, 2019
For many years, question choice has been used in some UK public examinations, with students free to choose which questions they answer from a selection (within certain parameters). There has been little published research on choice of exam questions in recent years in the UK. In this article we distinguish different scenarios in which choice…
Descriptors: Test Items, Test Construction, Difficulty Level, Foreign Countries
Peer reviewed
PDF on ERIC
Knell, Janie L.; Wilhoite, Andrea P.; Fugate, Joshua Z.; González-Espada, Wilson J. – Electronic Journal of Science Education, 2015
Current science education reform efforts emphasize teaching K-12 science using hands-on, inquiry activities. For maximum learning and probability of implementation among inservice teachers, these strategies must be modeled in college science courses for preservice teachers. About a decade ago, Morehead State University revised their science…
Descriptors: Item Response Theory, Multiple Choice Tests, Test Construction, Psychometrics
Lutkus, Anthony D.; Laskaris, George – 1981
Analyses of student responses to Introductory Psychology test questions were discussed. The publisher supplied a two-thousand-item test bank on computer tape. Instructors selected questions for fifteen-item tests. The test questions were labeled by the publisher as factual or conceptual. The semester course used a mastery learning format in which…
Descriptors: Difficulty Level, Higher Education, Item Analysis, Item Banks
Green, Kathy E. – 1983
The purpose of this study was to determine whether item difficulty is significantly affected by language difficulty and response set convergence. Language difficulty was varied by increasing sentence (stem) length, increasing syntactic complexity, and substituting uncommon words for more familiar terms in the item stem. Item wording ranged from…
Descriptors: Difficulty Level, Foreign Countries, Higher Education, Item Analysis
Peer reviewed
Kolstad, Rosemarie K.; And Others – Journal of Research and Development in Education, 1983
A study compared college students' performance on complex multiple-choice tests with scores on multiple true-false clusters. Researchers concluded that the multiple-choice tests did not accurately measure students' knowledge and that cueing and guessing led to grade inflation. (PP)
Descriptors: Achievement Tests, Difficulty Level, Guessing (Tests), Higher Education
Livingston, Samuel A. – 1986
This paper deals with the fairness of a test consisting of two parts: (1) a "common" section, taken by all students; and (2) a "variable" section, in which some students may answer a different set of questions from other students. For example, a test taken by several thousand students each year contains a common multiple-choice portion and…
Descriptors: Difficulty Level, Error of Measurement, Essay Tests, Mathematical Models
Oosterhof, Albert C.; Coats, Pamela K. – 1981
Instructors who develop classroom examinations that require students to provide a numerical response to a mathematical problem are often very concerned about the appropriateness of the multiple-choice format. The present study augments previous research relevant to this concern by comparing the difficulty and reliability of multiple-choice and…
Descriptors: Comparative Analysis, Difficulty Level, Grading, Higher Education
Burton, Nancy W.; And Others – 1976
Assessment exercises (items) in three different formats--multiple-choice with an "I don't know" (IDK) option, multiple-choice without the IDK, and open-ended--were placed at the beginning, middle, and end of 45-minute assessment packages (instruments). A balanced incomplete block analysis of variance was computed to determine the biasing…
Descriptors: Age Differences, Difficulty Level, Educational Assessment, Guessing (Tests)
Peer reviewed
Smith, Malbert, III; And Others – Journal of Educational Measurement, 1979
Results of multiple-choice tests in educational psychology were examined to discover the effects on students' scores of changing their original answer choices after reconsideration. Eighty-six percent of the students changed one or more answers, and six out of seven students who made changes improved their scores by doing so. (Author/CTM)
Descriptors: Academic Ability, Difficulty Level, Error Patterns, Guessing (Tests)
Peer reviewed
Israel, Glenn D.; Taylor, C. L. – Evaluation and Program Planning, 1990
Mail questionnaire items that are susceptible to order effects were examined using data from 168 questionnaires in a Florida Cooperative Extension Service evaluation. Order effects were found for multiple-response and attributive questions but not for single-response items. Order also interacted with question complexity, social desirability, and…
Descriptors: Adult Farmer Education, Difficulty Level, Educational Assessment, Error of Measurement
van Roosmalen, Willem M. M. – 1983
The construction of objective tests for native language reading comprehension is described. The tests were designed for the early secondary school years in several kinds of schools, vocational and non-vocational. The description focuses on the use of the Rasch model in test development, to develop a large pool of homogeneous items and establish…
Descriptors: Ability Grouping, Difficulty Level, Foreign Countries, Item Banks
Oaster, T. R. F.; And Others – 1986
This study hypothesized that items in the one-question-per-passage format would be less easily answered than conventional reading comprehension items when administered without their associated contexts. A total of 256 seventh and eighth grade students were administered both Forms 3A and 3B of the Sequential Tests of Educational Progress (STEP II).…
Descriptors: Context Effect, Difficulty Level, Grade 7, Grade 8
van Weeren, J., Ed. – 1983
Presented in this symposium reader are nine papers, four of which deal with the theory and impact of the Rasch model on language testing and five of which discuss final examinations in secondary schools in both general and specific terms. The papers are: "Introduction to Rasch Measurement: Some Implications for Language Testing" (J. J.…
Descriptors: Adolescents, Comparative Analysis, Comparative Education, Difficulty Level