Publication Date
| Date Range | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 2 |
| Since 2017 (last 10 years) | 7 |
| Since 2007 (last 20 years) | 13 |
Author
| Author | Count |
| --- | --- |
| Bátor, Judit | 1 |
| Domyancich, John M. | 1 |
| Dossey, John A. | 1 |
| Everson, Howard | 1 |
| Gorbunova, Tatiana N. | 1 |
| Guštin, Andrej | 1 |
| Haladyna, Thomas | 1 |
| Haladyna, Thomas M. | 1 |
| Hargan, Noleen | 1 |
| Leuba, Richard J. | 1 |
| Lim, Kien H. | 1 |
Publication Type
| Publication Type | Count |
| --- | --- |
| Reports - Descriptive | 23 |
| Journal Articles | 16 |
| Reports - Research | 2 |
| Collected Works - Proceedings | 1 |
| Numerical/Quantitative Data | 1 |
| Tests/Questionnaires | 1 |
Education Level
| Education Level | Count |
| --- | --- |
| Higher Education | 8 |
| Postsecondary Education | 6 |
| Secondary Education | 2 |
| Elementary Education | 1 |
| Elementary Secondary Education | 1 |
| Grade 6 | 1 |
| High Schools | 1 |
| Junior High Schools | 1 |
| Middle Schools | 1 |
Audience
| Audience | Count |
| --- | --- |
| Teachers | 3 |
| Practitioners | 2 |
| Researchers | 1 |
| Students | 1 |
Location
| Location | Count |
| --- | --- |
| Hungary | 1 |
| Russia | 1 |
| United Kingdom (Great Britain) | 1 |
Laws, Policies, & Programs
| Law, Policy, or Program | Count |
| --- | --- |
| No Child Left Behind Act 2001 | 1 |
Assessments and Surveys
| Assessment or Survey | Count |
| --- | --- |
| National Assessment of… | 1 |
| Program for International… | 1 |
Stevens, Scott P.; Palocsay, Susan W.; Novoa, Luis J. – INFORMS Transactions on Education, 2023
Test writing is a fundamental component of teaching. With increasing pressure to teach larger groups of students, conduct formal assessment of learning outcomes, and offer online and hybrid classes, there is a need for alternatives to constructed response problem-solving test questions. We believe that appropriate use of multiple-choice (MC)…
Descriptors: Multiple Choice Tests, Introductory Courses, Test Construction, Content Validity
Haladyna, Thomas – International Journal of Assessment Tools in Education, 2022
The use of multiple-choice items for classroom testing is firmly established for many good reasons. The content of any unit or course of study can be well sampled, test scores can be reliable (trusted), and time spent administering and scoring can be minimized. This article provides a current review of best practices in the design and use of a…
Descriptors: Multiple Choice Tests, Testing, Student Evaluation, Recall (Psychology)
Bátor, Judit; Szeberényi, József – Biochemistry and Molecular Biology Education, 2021
Problem solving, multiple-choice question-based educational tools have been used for decades in molecular cell biology courses at the University of Pécs Medical School, Pécs, Hungary. A set of these tests was published in Biochemistry and Molecular Biology Education between 2002 and 2015. Such tests using an experimental approach help students to…
Descriptors: Problem Solving, COVID-19, Pandemics, Multiple Choice Tests
Pelanek, Radek – IEEE Transactions on Learning Technologies, 2020
Learning systems can utilize many practice exercises, ranging from simple multiple-choice questions to complex problem-solving activities. In this article, we propose a classification framework for such exercises. The framework classifies exercises in three main aspects: (1) the primary type of interaction; (2) the presentation mode; and (3) the…
Descriptors: Integrated Learning Systems, Classification, Multiple Choice Tests, Problem Solving
Rovšek, Barbara; Guštin, Andrej – Physics Education, 2018
An astronomy "experiment" composed of three parts is described in the article. Given the necessary data, a simple model of the inner planets of the solar system is built in the first part, with the planets' circular orbits drawn to an appropriate scale. In the second part, the revolution of the figurines used as model representations of the planets along…
Descriptors: Motion, Scientific Concepts, Scientific Principles, Science Activities
Haladyna, Thomas M. – IDEA Center, Inc., 2018
Writing multiple-choice test items to measure student learning in higher education is a challenge. Based on extensive scholarly research and experience, the author describes various item formats, offers guidelines for creating these items, and provides many examples of both good and bad test items. He also suggests some shortcuts for developing…
Descriptors: Test Construction, Multiple Choice Tests, Test Items, Higher Education
Gorbunova, Tatiana N. – European Journal of Contemporary Education, 2017
The subject of the research is to build methodologies for evaluating student knowledge by testing. The author points to the importance of feedback about the level of mastery in the learning process. Testing is considered a tool. The object of the study is to create test system models for defence practice problems. Special attention is paid…
Descriptors: Testing, Evaluation Methods, Feedback (Response), Simulation
Shaw, David D.; Pease, Leonard F., III. – Chemical Engineering Education, 2014
Grading can be accelerated to make time for more effective instruction. This article presents specific time management strategies selected to decrease the administrative time required of faculty and teaching assistants, including a multiple-answer, multiple-choice interface for exams, a three-tier grading system for open-ended problem solving, and a…
Descriptors: Science Instruction, Grading, Time Management, Teachers
Domyancich, John M. – Journal of Chemical Education, 2014
Multiple-choice questions are an important part of large-scale summative assessments, such as the advanced placement (AP) chemistry exam. However, past AP chemistry exam items often lacked the ability to test conceptual understanding and higher-order cognitive skills. The redesigned AP chemistry exam shows a distinctive shift in item types toward…
Descriptors: Multiple Choice Tests, Science Instruction, Chemistry, Summative Evaluation
Lim, Kien H. – Mathematics Teaching in the Middle School, 2014
Student errors are springboards for analyzing, reasoning, and justifying. The mathematics education community recognizes the value of student errors, noting that "mistakes are seen not as dead ends but rather as potential avenues for learning." To induce specific errors and help students learn, choose tasks that might produce mistakes.…
Descriptors: Secondary School Mathematics, Middle School Students, Error Patterns, Error Correction
Rothman, Robert – Alliance for Excellent Education, 2011
New assessments that measure deeper learning--whether students understand challenging content and are able to apply that knowledge to think critically, solve problems, communicate their understanding, and work with their peers--are essential if students are to develop the competencies they need to succeed in an increasingly complex world. Such…
Descriptors: Scores, Evaluation, Performance Based Assessment, Multiple Choice Tests
Szeberenyi, Jozsef – Biochemistry and Molecular Biology Education, 2010
This paper presents a problem-solving test that deals with the regulation of the "trp" operon of "Escherichia coli." Two mutants of this operon are described: in mutant A, the operator region of the operon carries a point mutation so that it is unable to carry out its function; mutant B expresses a "trp" repressor protein unable to bind…
Descriptors: Problem Solving, Genetics, Microbiology, Science Tests
Tucker, Bill – Educational Leadership, 2009
New technology-enabled assessments offer the potential to understand more than just whether a student answered a test question right or wrong. Using multiple forms of media that enable both visual and graphical representations, these assessments present complex, multistep problems for students to solve and collect detailed information about an…
Descriptors: Research and Development, Problem Solving, Student Characteristics, Information Technology
Norris, Stephen P. – Educational Researcher, 1989
Discusses the generalizability of critical thinking and the disposition to think critically. Argues that verbal reports of examinees' thinking on multiple-choice tests can explain the reasoning behind their answers and, thus, can be used to assess the ability to make credibility judgments. (FMW)
Descriptors: Credibility, Critical Thinking, Elementary Secondary Education, Evaluation
Leuba, Richard J. – Engineering Education, 1986
Explains how multiple choice test items can be devised to measure higher-order learning, including engineering problem solving. Discusses the value and information provided in item analysis procedures with machine-scored tests. Suggests elements to consider in test design. (ML)
Descriptors: College Science, Creative Thinking, Engineering Education, Evaluation Methods
