ERIC Number: EJ1293619
Record Type: Journal
Publication Date: 2021
Pages: 25
Abstractor: As Provided
ISBN: N/A
ISSN: 2156-7069
EISSN: N/A
Available Date: N/A
Selecting Student-Authored Questions for Summative Assessments
Huang, Alice; Hancock, Dale; Clemson, Matthew; Yeo, Giselle; Harney, Dylan; Denny, Paul; Denyer, Gareth
Research in Learning Technology, v29 2021
Production of high-quality multiple-choice questions (MCQs) for both formative and summative assessments is a time-consuming task requiring great skill, creativity and insight. The transition to online examinations, with the concomitant exposure of previously tried-and-tested MCQs, exacerbates the challenges of question production and highlights the need for innovative solutions. Several groups have shown that it is practical to leverage the student cohort to produce a very large number of syllabus-aligned MCQs for study banks. Although student-generated questions are well suited for formative feedback and practice activities, they are generally not thought to be suitable for high-stakes assessments. In this study, we aimed to demonstrate that training can be provided to students in a scalable fashion to generate questions of similar quality to those produced by experts and that identification of suitable questions can be achieved with minimal academic review and editing. Second-year biochemistry and molecular biology students were assigned a series of activities designed to coach them in the art of writing and critiquing MCQs. This training resulted in the production of over 1000 MCQs that were then gauged for potential either by expert academic judgement or via a data-driven approach in which the questions were trialled objectively in a low-stakes test. Questions selected by either method were then deployed in a high-stakes in-semester assessment alongside questions from two academically authored sources: textbook-derived MCQs and past paper questions. A total of 120 MCQs from these four sources were deployed in assessments attempted by over 600 students. Each question was subjected to rigorous performance analysis, including the calculation of standard metrics from classical test theory and more sophisticated item response theory (IRT) measures. The results showed that MCQs authored by students, and selected at low cost, performed as well as questions authored by academics, illustrating the potential of this strategy for the efficient creation of large numbers of high-quality MCQs for summative assessment.
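Note on the analyses described in the abstract: the "standard metrics from classical test theory" conventionally comprise item difficulty (the proportion of candidates answering correctly) and item discrimination (often the point-biserial correlation between an item score and the rest-of-test score). The following is a minimal illustrative sketch of that calculation in Python; the function name and sample data are hypothetical and are not taken from the study.

import numpy as np

def item_statistics(responses):
    # Classical test theory item metrics for a dichotomously (0/1) scored
    # response matrix of shape (n_students, n_items).
    # Returns per-item difficulty (proportion correct) and discrimination
    # (point-biserial correlation of each item with the rest-of-test score).
    responses = np.asarray(responses, dtype=float)
    n_items = responses.shape[1]
    difficulty = responses.mean(axis=0)  # higher proportion = easier item
    total = responses.sum(axis=1)
    discrimination = np.empty(n_items)
    for j in range(n_items):
        rest = total - responses[:, j]  # corrected item-total: exclude the item itself
        # The point-biserial coefficient equals the Pearson correlation
        # when one variable is dichotomous.
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
    return difficulty, discrimination

# Hypothetical example: 6 students, 3 items.
scores = [[1, 1, 0],
          [1, 0, 0],
          [1, 1, 1],
          [0, 0, 0],
          [1, 1, 1],
          [0, 1, 0]]
p, r_pb = item_statistics(scores)
print("difficulty:", p)        # first item: 4/6 of students answered correctly
print("discrimination:", r_pb)

The IRT measures mentioned are typically estimated under a logistic model such as the two-parameter form, P_j(theta) = 1 / (1 + exp(-a_j(theta - b_j))), where theta is student ability, a_j the item's discrimination and b_j its difficulty; the abstract does not specify which IRT model the authors fitted.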
Descriptors: Summative Evaluation, Student Developed Materials, Test Construction, Multiple Choice Tests, Item Banks, College Students, Chemistry, High Stakes Tests, Foreign Countries, Academic Achievement, Item Response Theory
Association for Learning Technology. Gipsy Lane, Headington, Oxford OX3 0BP, UK. e-mail: enquiries@alt.ac.uk; Web site: https://journal.alt.ac.uk
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: Australia
Grant or Contract Numbers: N/A
Author Affiliations: N/A