Showing all 7 results
Peer reviewed
van den Broek, Gesa S. E.; Gerritsen, Suzanne L.; Oomen, Iris T. J.; Velthoven, Eva; van Boxtel, Femke H. J.; Kester, Liesbeth; van Gog, Tamara – Journal of Educational Psychology, 2023
Multiple-choice questions (MCQs) are popular in vocabulary software because they can be scored automatically and are compatible with many input devices (e.g., touchscreens). Answering MCQs is beneficial for learning, especially when learners retrieve knowledge from memory to evaluate plausible answer alternatives. However, such retrieval may not…
Descriptors: Multiple Choice Tests, Vocabulary Development, Test Format, Cues
Peer reviewed
Roger Young; Emily Courtney; Alexander Kah; Mariah Wilkerson; Yi-Hsin Chen – Teaching of Psychology, 2025
Background: Multiple-choice item (MCI) assessments are burdensome for instructors to develop. Artificial intelligence (AI, e.g., ChatGPT) can streamline the process without sacrificing quality. The quality of AI-generated MCIs is comparable to that of MCIs written by human experts. However, whether the quality of AI-generated MCIs is equally good across various domain-…
Descriptors: Item Response Theory, Multiple Choice Tests, Psychology, Textbooks
Peer reviewed
Labranche, Leah; Wilson, Timothy D.; Terrell, Mark; Kulesza, Randy J. – Anatomical Sciences Education, 2022
Three-dimensional (3D) digital anatomical models show potential to demonstrate complex anatomical relationships; however, the literature is inconsistent as to whether they are effective in improving anatomy performance, particularly for students with low spatial visualization ability (Vz). This study investigated the educational effectiveness…
Descriptors: Spatial Ability, Computer Simulation, Anatomy, Visualization
Peer reviewed
PDF on ERIC
Maass, Jaclyn K.; Pavlik, Philip I., Jr. – International Educational Data Mining Society, 2016
This research combines work on memory, retrieval practice, and depth of processing. It aims to identify how the format and depth of a retrieval practice item can be manipulated to increase the effort required to successfully recall or formulate an answer, with the hypothesis that if the effort required to answer an item is…
Descriptors: Memory, Test Format, Cues, Cognitive Processes
Peer reviewed
Yung, Hsin I.; Paas, Fred – Educational Technology & Society, 2015
This study investigated the effects of a pedagogical agent that cued relevant information in a story-based instructional animation on the cardiovascular system. Based on cognitive load theory, it was expected that the experimental condition with the pedagogical agent would help students distinguish between relevant and irrelevant…
Descriptors: Animation, Cues, Instructional Innovation, Information Literacy
Peer reviewed
Albanese, Mark A. – Educational Measurement: Issues and Practice, 1993
A comprehensive review is given of evidence bearing on the recommendation to avoid the use of complex multiple-choice (CMC) items. Avoiding Type K items (four primary responses and five secondary choices) seems warranted, but evidence against CMC items in general is less clear. (SLD)
Descriptors: Cues, Difficulty Level, Multiple Choice Tests, Responses
Huntley, Renee M.; Plake, Barbara S. – 1980
Guidelines for test item writing have traditionally recommended making the correct answer of a multiple-choice item grammatically consistent with its stem. To investigate the effects of adhering to this practice, certain item formats were designed to determine whether the practice of providing relevant grammatical clues, in itself, created cue…
Descriptors: College Entrance Examinations, Cues, Difficulty Level, Grammar