Ato Kwamina Arhin – Acta Educationis Generalis, 2024
Introduction: This article examines the distractors used in mathematics multiple-choice items. The quality of distractors may matter more than their number or the stem in a multiple-choice question, yet little attention is given to this aspect of item writing, especially for mathematics multiple-choice questions. This article…
Descriptors: Testing, Multiple Choice Tests, Test Items, Mathematics Tests
Yi-Chun Chen; Hsin-Kai Wu; Ching-Ting Hsin – Research in Science & Technological Education, 2024
Background and Purpose: As a growing number of instructional units have been developed to promote young children's scientific and engineering practices (SEPs), understanding how to evaluate and assess children's SEPs is imperative. However, paper-and-pencil assessments would not be suitable for young children because of their limited reading and…
Descriptors: Science Education, Engineering Education, Elementary School Students, Middle School Students
Cerchiara, Jack A.; Kim, Kerry J.; Meir, Eli; Wenderoth, Mary Pat; Doherty, Jennifer H. – Advances in Physiology Education, 2019
The basis for understanding neurophysiology is understanding ion movement across cell membranes. Students in introductory courses recognize ion concentration gradients as a driving force for ion movement but struggle to simultaneously account for electrical charge gradients. We developed a 17-multiple-choice item assessment of students'…
Descriptors: Introductory Courses, Neurology, Physiology, Cytology
Ali, Syed Haris; Carr, Patrick A.; Ruit, Kenneth G. – Journal of the Scholarship of Teaching and Learning, 2016
Plausible distractors are important for accurate measurement of knowledge via multiple-choice questions (MCQs). This study demonstrates the impact of higher distractor functioning on the validity and reliability of scores obtained on MCQs. Free-response (FR) and MCQ versions of a neurohistology practice exam were given to four cohorts of Year 1 medical…
Descriptors: Scores, Multiple Choice Tests, Test Reliability, Test Validity
Kahraman, Nilüfer – Eurasian Journal of Educational Research, 2014
Problem: Practitioners working with multiple-choice tests have long utilized Item Response Theory (IRT) models to evaluate the performance of test items for quality assurance. The use of similar applications for performance tests, however, is often encumbered due to the challenges encountered in working with complicated data sets in which local…
Descriptors: Item Response Theory, Licensing Examinations (Professions), Performance Based Assessment, Computer Simulation
GED Testing Service, 2016
This guide is designed to help adult educators and administrators better understand the content of the GED® test. This guide is tailored to each test subject and highlights the test's item types, assessment targets, and guidelines for how items will be scored. This 2016 edition has been updated to include the most recent information about the…
Descriptors: Guidelines, Teaching Guides, High School Equivalency Programs, Test Items
Ollennu, Sam Nii Nmai; Etsey, Y. K. A. – Universal Journal of Educational Research, 2015
The study investigated the impact of item position in multiple-choice test on student performance at the Basic Education Certificate Examination (BECE) level in Ghana. The sample consisted of 810 Junior Secondary School (JSS) Form 3 students selected from 12 different schools. A quasi-experimental design was used. The instrument for the project…
Descriptors: Multiple Choice Tests, Test Items, Performance Based Assessment, Secondary School Students
Laprise, Shari L. – College Teaching, 2012
Successful exam composition can be a difficult task. Exams should not only assess student comprehension, but be learning tools in and of themselves. In a biotechnology course delivered to nonmajors at a business college, objective multiple-choice test questions often require students to choose the exception or "not true" choice. Anecdotal student…
Descriptors: Feedback (Response), Test Items, Multiple Choice Tests, Biotechnology
Sparfeldt, Jorn R.; Kimmel, Rumena; Lowenkamp, Lena; Steingraber, Antje; Rost, Detlef H. – Educational Assessment, 2012
Multiple-choice (MC) reading comprehension test items comprise three components: text passage, questions about the text, and MC answers. The construct validity of this format has been repeatedly criticized. In three between-subjects experiments, fourth graders (N₁ = 230, N₂ = 340, N₃ = 194) worked on three…
Descriptors: Test Items, Reading Comprehension, Construct Validity, Grade 4
Badgett, John L.; Christmann, Edwin P. – Corwin, 2009
While today's curriculum is largely driven by standards, many teachers find the lack of specificity in the standards to be confounding and even intimidating. Now this practical book provides middle and high school teachers with explicit guidance on designing specific objectives and developing appropriate formative and summative assessments to…
Descriptors: Test Items, Student Evaluation, Knowledge Level, National Standards
Friedman, Stephen J. – Journal of Educational Measurement, 1999
This volume describes the characteristics and functions of test items, offers editorial guidelines for writing them, presents methods for determining their quality, and compiles important issues concerning test items. (SLD)
Descriptors: Constructed Response, Criteria, Evaluation Methods, Multiple Choice Tests
Zwick, Rebecca; And Others – 1993
Although the belief has been expressed that performance assessments are intrinsically more fair than multiple-choice measures, some forms of performance assessment may in fact be more likely than conventional tests to tap construct-irrelevant factors. As performance assessment grows in popularity, it will be increasingly important to monitor the…
Descriptors: Educational Assessment, Item Bias, Multiple Choice Tests, Performance Based Assessment
Pearson, P. David; Garavaglia, Diane R. – 2003
This paper first provides a summary and overview of what is already known and what is needed to learn about item types for future assessments by the National Assessment of Educational Progress (NAEP). In essence, the question addressed is whether constructed response items provide more information about what students are capable of doing than that…
Descriptors: Constructed Response, Elementary Secondary Education, Multiple Choice Tests, National Surveys
Shymansky, James A.; Chidsey, Jennifer L.; Henriques, Laura; Enger, Sandra; Yore, Larry D.; Wolfe, Edward W.; Jorgensen, Margaret – School Science and Mathematics, 1997
Describes the design of four science-performance tasks for grade 9 students and the relationship between their performance on those tasks and multiple-choice items on the Iowa Tests of Educational Development. The students and schools used to develop the tasks were not included in the verification sample. Contains 22 references. (Author/ASK)
Descriptors: Academic Achievement, Grade 9, High Schools, Multiple Choice Tests
Bennett, Randy Elliot; And Others – 1990
A framework for categorizing constructed-response items was developed in which items were ordered on a continuum from multiple-choice to presentation/performance according to the degree of constraint placed on the examinee's response. Two investigations were carried out to evaluate the validity of this framework. In the first investigation, 27…
Descriptors: Classification, Constructed Response, Models, Multiple Choice Tests