Showing 1 to 15 of 21 results
Peer reviewed
PDF on ERIC Download full text
McGuire, Michael J. – International Journal for the Scholarship of Teaching and Learning, 2023
College students in a lower-division psychology course made metacognitive judgments by predicting and postdicting performance for true-false, multiple-choice, and fill-in-the-blank question sets on each of three exams. This study investigated which question format would result in the most accurate metacognitive judgments. Extending Koriat's (1997)…
Descriptors: Metacognition, Multiple Choice Tests, Accuracy, Test Format
Peer reviewed
Direct link
van den Broek, Gesa S. E.; Gerritsen, Suzanne L.; Oomen, Iris T. J.; Velthoven, Eva; van Boxtel, Femke H. J.; Kester, Liesbeth; van Gog, Tamara – Journal of Educational Psychology, 2023
Multiple-choice questions (MCQs) are popular in vocabulary software because they can be scored automatically and are compatible with many input devices (e.g., touchscreens). Answering MCQs is beneficial for learning, especially when learners retrieve knowledge from memory to evaluate plausible answer alternatives. However, such retrieval may not…
Descriptors: Multiple Choice Tests, Vocabulary Development, Test Format, Cues
Peer reviewed
Direct link
Roger Young; Emily Courtney; Alexander Kah; Mariah Wilkerson; Yi-Hsin Chen – Teaching of Psychology, 2025
Background: Multiple-choice item (MCI) assessments are burdensome for instructors to develop. Artificial intelligence (AI, e.g., ChatGPT) can streamline the process without sacrificing quality. The quality of AI-generated MCIs and human experts is comparable. However, whether the quality of AI-generated MCIs is equally good across various domain-…
Descriptors: Item Response Theory, Multiple Choice Tests, Psychology, Textbooks
Peer reviewed
PDF on ERIC Download full text
Lee, Abby Deng-Huei – English Language Teaching, 2018
To evaluate the sensitivity of multiple-choice cloze (MCC) tests that use different types of items--syntactic, semantic, and connective--to assess reading ability, 170 English as a foreign language (EFL) students in a vocational college in Taiwan were recruited. The students were divided into two groups (level A and level B) based on their scores…
Descriptors: Foreign Countries, Multiple Choice Tests, Cloze Procedure, Test Items
Craig Pournara; Lynn Bowie – South African Journal of Childhood Education, 2023
Background: Poor mathematics performance in South Africa is well known. The COVID-19 pandemic was expected to exacerbate the situation. Aim: To investigate Grade 7 learners' mathematical knowledge at the end of primary school and to compare mathematical performance of Grade 7 and 8 learners in the context of the pandemic. Setting: Data were…
Descriptors: Mathematics Education, Knowledge Level, Grade 7, COVID-19
Hildenbrand, Lena; Wiley, Jennifer – Grantee Submission, 2021
Many studies have demonstrated that testing students on to-be-learned materials can be an effective learning activity. However, past studies have also shown that some practice test formats are more effective than others. Open-ended recall or short answer practice tests may be effective because the questions prompt deeper processing as students…
Descriptors: Test Format, Outcomes of Education, Cognitive Processes, Learning Activities
Peer reviewed
PDF on ERIC Download full text
Maass, Jaclyn K.; Pavlik, Philip I., Jr. – International Educational Data Mining Society, 2016
This research combines work in memory, retrieval practice, and depth of processing research. This work aims to identify how the format and depth of a retrieval practice item can be manipulated to increase the effort required to successfully recall or formulate an answer, with the hypothesis that if the effort required to answer an item is…
Descriptors: Memory, Test Format, Cues, Cognitive Processes
Peer reviewed
PDF on ERIC Download full text
Ibbett, Nicole L.; Wheldon, Brett J. – e-Journal of Business Education and Scholarship of Teaching, 2016
In 2014 Central Queensland University (CQU) in Australia banned the use of multiple choice questions (MCQs) as an assessment tool. One of the reasons given for this decision was that MCQs provide an opportunity for students to "pass" by merely guessing their answers. The mathematical likelihood of a student passing by guessing alone can…
Descriptors: Foreign Countries, Multiple Choice Tests, Item Banks, Guessing (Tests)
Peer reviewed
Direct link
Koretsky, Milo D.; Brooks, Bill J.; White, Rachel M.; Bowen, Alec S. – Journal of Engineering Education, 2016
Background: We investigated student responses to multiple-choice concept questions during active learning activities where students write justifications for their answer choices. Purpose: We selected two questions that asked students to apply the same concept in the same way but that have different surface features. We characterized students'…
Descriptors: Active Learning, Responses, Multiple Choice Tests, Test Items
Peer reviewed
Direct link
Sun, Bo; Zhu, Yunzong; Xiao, Yongkang; Xiao, Rong; Wei, Yungang – IEEE Transactions on Learning Technologies, 2019
In recent years, computerized adaptive testing (CAT) has gained popularity as an important means to evaluate students' ability. Assigning tags to test questions is crucial in CAT. Manual tagging is widely used for constructing question banks; however, this approach is time-consuming and might lead to consistency issues. Automatic question tagging,…
Descriptors: Computer Assisted Testing, Student Evaluation, Test Items, Multiple Choice Tests
Peer reviewed
Direct link
Stylianou-Georgiou, Agni; Papanastasiou, Elena C. – Educational Research and Evaluation, 2017
The purpose of our study was to examine the issue of answer changing in relation to students' abilities to monitor their behaviour accurately while responding to multiple-choice tests. The data for this study were obtained from the final examination administered to students in an educational psychology course. The results of the study indicate…
Descriptors: Role, Metacognition, Testing, Multiple Choice Tests
Peer reviewed
PDF on ERIC Download full text
Zhang, Hanmu – Journal of Education and Learning, 2019
Since understanding reading assignments is important to succeeding in school, improving the way that text is arranged in books would be an efficient way to help students better understand the material and perform well on tests. In this study, we asked students to read two original and two rearranged historical passages, in which rephrased…
Descriptors: Test Items, Textbook Preparation, Retention (Psychology), Recall (Psychology)
Peer reviewed
Direct link
Davis, Doris Bitler – Teaching of Psychology, 2017
Providing two or more versions of multiple-choice exams has long been a popular strategy for reducing the opportunity for students to engage in academic dishonesty. While the results of studies comparing exam scores under different question-order conditions have been inconclusive, the potential importance of contextual cues to aid student recall…
Descriptors: Test Construction, Multiple Choice Tests, Sequential Approach, Cues
Peer reviewed
Albanese, Mark A. – Educational Measurement: Issues and Practice, 1993
A comprehensive review is given of evidence bearing on the recommendation to avoid complex multiple choice (CMC) items. Avoiding Type K items (four primary responses and five secondary choices) seems warranted, but the evidence against CMC items in general is less clear. (SLD)
Descriptors: Cues, Difficulty Level, Multiple Choice Tests, Responses
Peer reviewed
Strang, Harold R. – Journal of Educational Measurement, 1977
The effects of option familiarity, length, and technicality on guessing on multiple choice items were investigated in two experiments. Generally, the college undergraduates tended to favor familiar, non-technical, and longer options when guessing on multiple choice tests. (JKS)
Descriptors: Cues, Females, Guessing (Tests), Higher Education