| Publication Date | Records |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 13 |
| Since 2022 (last 5 years) | 56 |
| Since 2017 (last 10 years) | 129 |
| Since 2007 (last 20 years) | 180 |
| Descriptor | Records |
| --- | --- |
| Computer Assisted Testing | 233 |
| Multiple Choice Tests | 233 |
| Foreign Countries | 83 |
| Test Items | 72 |
| Test Format | 49 |
| College Students | 48 |
| Test Construction | 47 |
| Scores | 43 |
| Higher Education | 38 |
| Comparative Analysis | 37 |
| Student Evaluation | 37 |
| Author | Records |
| --- | --- |
| Anderson, Paul S. | 6 |
| Ben Seipel | 3 |
| Bridgeman, Brent | 3 |
| Clariana, Roy B. | 3 |
| Mark L. Davison | 3 |
| Patrick C. Kennedy | 3 |
| Sarah E. Carlson | 3 |
| Vidal-Abarca, Eduardo | 3 |
| Virginia Clinton-Lisell | 3 |
| Wise, Steven L. | 3 |
| Aldabe, Itziar | 2 |
| Education Level | Records |
| --- | --- |
| Higher Education | 104 |
| Postsecondary Education | 88 |
| Secondary Education | 30 |
| Elementary Education | 17 |
| High Schools | 10 |
| Middle Schools | 10 |
| Elementary Secondary Education | 9 |
| Junior High Schools | 9 |
| Grade 4 | 5 |
| Grade 8 | 4 |
| Grade 11 | 3 |
| Audience | Records |
| --- | --- |
| Researchers | 2 |
| Location | Records |
| --- | --- |
| United Kingdom | 11 |
| Turkey | 9 |
| Canada | 7 |
| Australia | 6 |
| Germany | 5 |
| Taiwan | 4 |
| Europe | 3 |
| Indonesia | 3 |
| South Africa | 3 |
| Spain | 3 |
| Texas | 3 |
Victoria Crisp; Sylvia Vitello; Abdullah Ali Khan; Heather Mahy; Sarah Hughes – Research Matters, 2025
This research set out to enhance our understanding of the exam techniques and types of written annotations or markings that learners may wish to use to support their thinking when taking digital multiple-choice exams. Additionally, we aimed to further explore issues around the factors that contribute to learners writing less rough work and…
Descriptors: Computer Assisted Testing, Test Format, Multiple Choice Tests, Notetaking
Andreea Dutulescu; Stefan Ruseti; Denis Iorga; Mihai Dascalu; Danielle S. McNamara – Grantee Submission, 2025
Automated multiple-choice question (MCQ) generation is valuable for scalable assessment and enhanced learning experiences. However, existing MCQ generation methods face challenges in ensuring plausible distractors and maintaining answer consistency. This paper introduces a method for MCQ generation that integrates reasoning-based explanations…
Descriptors: Automation, Computer Assisted Testing, Multiple Choice Tests, Natural Language Processing
Archana Praveen Kumar; Ashalatha Nayak; Manjula Shenoy K.; Chaitanya; Kaustav Ghosh – International Journal of Artificial Intelligence in Education, 2024
Multiple Choice Questions (MCQs) are a popular assessment method because they enable automated evaluation, flexible administration and use with huge groups. Despite these benefits, the manual construction of MCQs is challenging, time-consuming and error-prone. This is because each MCQ is comprised of a question called the "stem", a…
Descriptors: Multiple Choice Tests, Test Construction, Test Items, Semantics
Ersan, Ozge; Berry, Yufeng – Educational Measurement: Issues and Practice, 2023
The increasing use of computerization in the testing industry and the need for items potentially measuring higher-order skills have led educational measurement communities to develop technology-enhanced (TE) items and conduct validity studies on the use of TE items. Parallel to this goal, the purpose of this study was to collect validity evidence…
Descriptors: Computer Assisted Testing, Multiple Choice Tests, Elementary Secondary Education, Accountability
Yusuf Oc; Hela Hassen – Marketing Education Review, 2025
Driven by technological innovations, continuous digital expansion has fundamentally transformed the landscape of modern higher education, leading to discussions about evaluation techniques. The emergence of generative artificial intelligence raises questions about reliability and academic honesty regarding multiple-choice assessments in online…
Descriptors: Higher Education, Multiple Choice Tests, Computer Assisted Testing, Electronic Learning
Nico Willert; Jonathan Thiemann – Technology, Knowledge and Learning, 2024
Manual composition of tasks and exams is challenging and time-consuming. Especially when exams are taken remotely without personal monitoring by examiners, most exams can easily lose their integrity through the reuse of previously completed exercises or student communication. This research introduces an approach that incorporates the principles…
Descriptors: Tests, Examiners, Foreign Countries, Multiple Choice Tests
Anela Hrnjicic; Adis Alihodžic – International Electronic Journal of Mathematics Education, 2024
Understanding the concepts related to real functions is essential in learning mathematics. To determine how students understand these concepts, it is necessary to have an appropriate measurement tool. In this paper, we have created a web application using 32 items from the conceptual understanding of real functions (CURF) item bank. We conducted a…
Descriptors: Mathematical Concepts, College Freshmen, Foreign Countries, Computer Assisted Testing
Andrei Ludu; Maria Ludu; Teha Cooks – Journal of Computers in Mathematics and Science Teaching, 2025
This paper presents research activity on computer-based mathematics learning to study the effectiveness of open-source teaching computer platforms (Canvas) in computer-assisted instruction. We designed a set of multiple-choice online quizzes as a dynamical flow-chart of possible paths to follow while solving a difficult math problem on…
Descriptors: Teaching Methods, Computer Assisted Instruction, Mathematics Education, Engineering Education
Lynne N. Kennette; Dawn McGuckin – Teaching & Learning Inquiry, 2025
Multiple choice tests are unlikely to disappear from formal education, partly due to the ease of large-scale administration and grading and their similarity to licensing exams in various fields (e.g., nursing). Despite post-secondary instructors' best intentions in giving students adequate time to complete multiple choice assessments, it can be…
Descriptors: Foreign Countries, Two Year College Students, Multiple Choice Tests, Computer Assisted Testing
Ben Seipel; Patrick C. Kennedy; Sarah E. Carlson; Virginia Clinton-Lisell; Mark L. Davison – Journal of Learning Disabilities, 2023
As access to higher education increases, it is important to monitor students with special needs to facilitate the provision of appropriate resources and support. Although metrics such as the "reading readiness" ACT (formerly American College Testing) provide insight into how many students may need such resources, they do not specify…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Reading Tests, Reading Comprehension
Lae Lae Shwe; Sureena Matayong; Suntorn Witosurapot – Education and Information Technologies, 2024
Multiple Choice Questions (MCQs) are an important evaluation technique for both examinations and learning activities. However, the manual creation of questions is time-consuming and challenging for teachers. Hence, there is a notable demand for an Automatic Question Generation (AQG) system. Several systems have been created for this aim, but the…
Descriptors: Difficulty Level, Computer Assisted Testing, Adaptive Testing, Multiple Choice Tests
Plasencia, Javier – Biochemistry and Molecular Biology Education, 2023
Multiple studies have shown that testing contributes to learning at all educational levels. In this observational classroom study, we report the use of a learning tool developed for a Genetics and Molecular Biology course at the college level. An interactive set of practice exams that included 136 multiple choice questions (MCQ) or matching…
Descriptors: Molecular Biology, Genetics, Science Tests, College Science
Richard Say; Denis Visentin; Annette Saunders; Iain Atherton; Andrea Carr; Carolyn King – Journal of Computer Assisted Learning, 2024
Background: Formative online multiple-choice tests are ubiquitous in higher education and potentially powerful learning tools. However, commonly used feedback approaches in online multiple-choice tests can discourage meaningful engagement and enable strategies, such as trial-and-error, that circumvent intended learning outcomes. These strategies…
Descriptors: Feedback (Response), Self Management, Formative Evaluation, Multiple Choice Tests
Falcão, Filipe; Costa, Patrício; Pêgo, José M. – Advances in Health Sciences Education, 2022
Background: Current demand for multiple-choice questions (MCQs) in medical assessment is greater than the supply. Consequently, an urgency for new item development methods arises. Automatic Item Generation (AIG) promises to overcome this burden, generating calibrated items based on the work of computer algorithms. Despite the promising scenario,…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Test Items, Medical Education
Wood, Eileen; Klausz, Noah; MacNeil, Stephen – Innovative Higher Education, 2022
Learning gains associated with multiple-choice testing formats that provide immediate feedback (e.g., IFAT®) are often greater than those for typical single-choice delayed feedback formats (e.g., Scantron®). Immediate feedback formats also typically permit part marks, unlike delayed feedback formats. The present study contrasted IFAT® with a new…
Descriptors: Academic Achievement, Computer Assisted Testing, Feedback (Response), Organic Chemistry
