Showing all 9 results
Peer reviewed
Andreea Dutulescu; Stefan Ruseti; Denis Iorga; Mihai Dascalu; Danielle S. McNamara – Grantee Submission, 2025
Automated multiple-choice question (MCQ) generation is valuable for scalable assessment and enhanced learning experiences. However, existing MCQ generation methods face challenges in ensuring plausible distractors and maintaining answer consistency. This paper introduces a method for MCQ generation that integrates reasoning-based explanations…
Descriptors: Automation, Computer Assisted Testing, Multiple Choice Tests, Natural Language Processing
Peer reviewed
Katrin Klingbeil; Fabian Rösken; Bärbel Barzel; Florian Schacht; Kaye Stacey; Vicki Steinle; Daniel Thurm – ZDM: Mathematics Education, 2024
Assessing students' (mis)conceptions is a challenging task for teachers as well as for researchers. While individual assessment, for example through interviews, can provide deep insights into students' thinking, this is very time-consuming and therefore not feasible for whole classes or even larger settings. For those settings, automatically…
Descriptors: Multiple Choice Tests, Formative Evaluation, Mathematics Tests, Misconceptions
Peer reviewed
Tan, Kim Chwee Daniel; Taber, Keith S.; Liew, Yong Qiang; Teo, Kay Liang Alan – Chemistry Education Research and Practice, 2019
The internet is prevalent in society today, and user-friendly web-based productivity tools are readily available for developing diagnostic instruments. This study sought to determine the affordances of a web-based diagnostic instrument on ionisation energy (wIEDI) based on the pen-and-paper version, the Ionisation Energy Diagnostic Instrument…
Descriptors: Energy, Secondary School Science, Chemistry, Diagnostic Tests
Peer reviewed
Wise, Steven L. – Educational Measurement: Issues and Practice, 2017
The rise of computer-based testing has brought with it the capability to measure more aspects of a test event than simply the answers selected or constructed by the test taker. One behavior that has drawn much research interest is the time test takers spend responding to individual multiple-choice items. In particular, very short response…
Descriptors: Guessing (Tests), Multiple Choice Tests, Test Items, Reaction Time
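
The response-time behavior Wise describes is commonly operationalized as rapid-guessing detection: responses faster than some per-item threshold are flagged as non-effortful. A minimal sketch of that idea is below; the 3-second threshold and the record fields are illustrative assumptions, not values taken from the paper.

```python
# Hypothetical sketch: flag likely rapid-guessing responses with a fixed time threshold.
# The threshold and field names are assumptions for illustration only.

RAPID_GUESS_THRESHOLD_SECONDS = 3.0  # assumed per-item cutoff


def flag_rapid_guesses(responses):
    """Return the response records answered faster than the threshold.

    Each record is assumed to be a dict like:
    {"item_id": "Q12", "response_time": 1.8, "selected_option": "C"}
    """
    return [r for r in responses if r["response_time"] < RAPID_GUESS_THRESHOLD_SECONDS]


if __name__ == "__main__":
    sample = [
        {"item_id": "Q1", "response_time": 1.2, "selected_option": "A"},
        {"item_id": "Q2", "response_time": 14.5, "selected_option": "B"},
    ]
    print(flag_rapid_guesses(sample))  # only Q1 is flagged as a likely rapid guess
```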
Peer reviewed
Kim, Kerry J.; Meir, Eli; Pope, Denise S.; Wendel, Daniel – Journal of Educational Data Mining, 2017
Computerized classification of student answers offers the possibility of instant feedback and improved learning. Open response (OR) questions provide greater insight into student thinking and understanding than more constrained multiple choice (MC) questions, but development of automated classifiers is more difficult, often requiring training a…
Descriptors: Classification, Computer Assisted Testing, Multiple Choice Tests, Test Format
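
Training an automated classifier for open responses, as this abstract describes, is often approached as supervised text classification over labeled student answers. A minimal sketch follows; scikit-learn is an assumed choice and the answers and labels are invented, so this is not the classifier used by Kim et al.

```python
# Minimal sketch of a supervised classifier for short open responses.
# Example answers and labels are invented for illustration.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set: student answers paired with correctness labels.
answers = [
    "natural selection favors traits that improve survival",
    "the organisms just wanted to change",
    "individuals with helpful traits reproduce more",
    "evolution happens because animals try harder",
]
labels = ["correct", "misconception", "correct", "misconception"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(answers, labels)

# Instant feedback: classify a new open response.
print(model.predict(["traits that aid survival become more common over generations"]))
```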
Lin, Min-Jin; Guo, Chorng-Jee; Hsu, Chia-Er – Online Submission, 2011
This study designed and developed a CP-MCT (content-rich, photo-based multiple choice online test) to assess whether college students can apply the basic light concept to interpret daily light phenomena. One hundred college students volunteered to take the CP-MCT, and the results were statistically analyzed by applying t-test or ANOVA (Analysis of…
Descriptors: College Students, Testing, Multiple Choice Tests, Evaluation Methods
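
The statistical comparisons mentioned (t-test and ANOVA across groups of CP-MCT scores) can be reproduced in outline with scipy; the score lists and grouping below are made-up placeholders, not data from the study.

```python
# Illustrative sketch of the t-test / one-way ANOVA comparisons mentioned above.
# All score values are invented placeholders.

from scipy import stats

group_a = [72, 65, 80, 75, 68]   # scores of one assumed student group
group_b = [60, 58, 70, 64, 61]   # scores of another assumed group
group_c = [77, 82, 74, 79, 85]   # a third assumed group for the ANOVA example

# Independent-samples t-test between two groups
t_stat, t_p = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {t_p:.3f}")

# One-way ANOVA across three groups
f_stat, f_p = stats.f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.2f}, p = {f_p:.3f}")
```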
Peer reviewed
Susan Rodrigues; Neil Taylor; Margaret Cameron; Lorraine Syme-Smith; Colette Fortuna – Science Education International, 2010
This paper reports on data collected via an audience response system, where a convenience sample of 300 adults aged 17-50 pressed a button to register their answers for twenty multiple choice questions. The responses were then discussed with the respondents at the time. The original dataset includes physics, biology and chemistry questions. The…
Descriptors: Audience Response, International Studies, Familiarity, Chemistry
Peer reviewed
Lau, Paul Ngee Kiong; Lau, Sie Hoe; Hong, Kian Sam; Usop, Hasbee – Educational Technology & Society, 2011
The number right (NR) method, in which students pick one option as the answer, is the conventional method for scoring multiple-choice tests that is heavily criticized for encouraging students to guess and failing to credit partial knowledge. In addition, computer technology is increasingly used in classroom assessment. This paper investigates the…
Descriptors: Guessing (Tests), Multiple Choice Tests, Computers, Scoring
Sie Hoe, Lau; Ngee Kiong, Lau; Kian Sam, Hong; Bin Usop, Hasbee – Online Submission, 2009
Assessment is central to any educational process. Number Right (NR) scoring method is a conventional scoring method for multiple choice items, where students need to pick one option as the correct answer. One point is awarded for the correct response and zero for any other responses. However, it has been heavily criticized for guessing and failure…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Adaptive Testing, Scoring
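
The Number Right rule described in the two abstracts above (one point when the selected option matches the key, zero otherwise) amounts to a simple count. A minimal sketch, with an invented answer key and student response set:

```python
# Minimal sketch of Number Right (NR) scoring: one point per item where the selected
# option matches the key, zero otherwise. Key and responses are invented examples.

def number_right_score(responses, answer_key):
    """Count items answered with the keyed option.

    Both arguments map item ids to option letters, e.g. {"Q1": "B"}.
    """
    return sum(1 for item, key in answer_key.items() if responses.get(item) == key)


if __name__ == "__main__":
    key = {"Q1": "B", "Q2": "D", "Q3": "A"}
    student = {"Q1": "B", "Q2": "C", "Q3": "A"}
    print(number_right_score(student, key))  # 2 out of 3 items correct
```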