ERIC Number: ED667286
Record Type: Non-Journal
Publication Date: 2021
Pages: 133
Abstractor: As Provided
ISBN: 979-8-5160-7144-7
ISSN: N/A
EISSN: N/A
Available Date: N/A
Grading in Chemistry: Variations in Instructors' Evaluation of Student Written Responses
Michelle Herridge
ProQuest LLC, Ph.D. Dissertation, The University of Arizona
Evaluation of student written work on summative assessments is a critical task for instructors at all educational levels. Nevertheless, few research studies provide insight into how different instructors approach this task. Chemistry faculty instructors (FIs) and graduate student instructors (GSIs) regularly evaluate and grade student written responses on formal formative and summative assessments. In this study, we characterized how different instructors teaching general chemistry at the University of Arizona differed in their approaches to evaluating and grading students' written answers on midterm exams. This work is critical to supporting the professional development of instructors and ensuring a fair evaluation of all students. In the first part of this project, we identified and characterized dimensions of variation in general chemistry instructors' approaches to evaluating and grading a conceptual question. Using qualitative research methods, we conducted individual interviews in which participating chemistry FIs and GSIs were asked to evaluate and grade the same set of student responses to a typical exam prompt and to justify their decisions. Our results showed that the observed variability in assigned grades emerged from a complex interaction of explicit and implicit decisions made, and preferences manifested, along various dimensions. Based on this characterization, we developed an analytical framework for characterizing variation in chemistry instructors' approaches to evaluating and grading free-response questions. In the second stage of the project, we applied this analytical framework to characterize variation in the evaluation and grading of a diverse set of open-response questions and to analyze the impact of such variation on assigned grades.
Our results revealed wide variation in the approaches instructors followed along different dimensions, although most instructors tended to be consistent in the approaches they individually followed when analyzing different student responses to diverse questions. Instructors' experience seemed to affect assigned grades, but the relevant dimensions of variation in the evaluation differed for less and more experienced instructors. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone: 1-800-521-0600. Web page: http://www.proquest.com/en-US/products/dissertations/individuals.shtml.]
Descriptors: Science Instruction, Chemistry, College Faculty, Teaching Assistants, Writing Assignments, Student Evaluation, Writing Evaluation, Grading, Interrater Reliability, Professional Development, Test Reliability, Essay Tests, Evaluation Criteria, Evaluation Problems
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://www.proquest.com/en-US/products/dissertations/individuals.shtml
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: Arizona
Grant or Contract Numbers: N/A
Author Affiliations: N/A