ERIC Number: EJ1462710
Record Type: Journal
Publication Date: 2023
Pages: 11
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: EISSN-2161-4210
Available Date: N/A
Is It Actually Reliable? Examining Statistical Methods for Inter-Rater Reliability of a Rubric in Graduate Education
Brent J. Goertzen; Kaley Klaus
Research & Practice in Assessment, v18 n2 p31-41 2023
When evaluating student learning, educators often employ scoring rubrics, whose quality can be determined by evaluating validity and reliability. This article discusses the norming process used in a graduate organizational leadership program for a capstone scoring rubric. Concepts of validity and reliability are discussed, as is the development of a scoring rubric. Various statistical measures of inter-rater reliability are presented, and the effectiveness of those measures is discussed. Our findings indicated that inter-rater reliability can be achieved in graduate scoring rubrics, though the strength of reliability varies substantially based on the selected statistical measure. Recommendations for determining validity and measuring inter-rater reliability among multiple raters and rater pairs in assessment practices, among other considerations in rubric development, are provided.
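As an illustration of two of the simpler inter-rater reliability statistics the abstract alludes to, the sketch below computes percent agreement and Cohen's kappa for a pair of raters. The rubric scores are hypothetical and are not taken from the study; the 1-4 scale and the ten-capstone sample are assumptions for demonstration only.

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Proportion of items on which two raters gave identical scores."""
    assert len(r1) == len(r2)
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: observed agreement corrected for chance agreement,
    where chance is estimated from each rater's marginal score distribution."""
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical scores (1-4 rubric scale) from two raters on ten capstones.
rater_a = [3, 4, 2, 3, 3, 4, 1, 2, 3, 4]
rater_b = [3, 4, 2, 2, 3, 4, 1, 3, 3, 4]
print(round(percent_agreement(rater_a, rater_b), 2))  # → 0.8
print(round(cohens_kappa(rater_a, rater_b), 2))       # → 0.71
```

The gap between the two numbers (0.8 raw agreement versus 0.71 chance-corrected) shows in miniature why the article finds that the strength of reliability "varies substantially based on the selected statistical measure": kappa discounts agreement that would occur by chance, so it is typically lower than simple percent agreement.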
Descriptors: Graduate Students, Graduate Study, Graduate School Faculty, Scoring Rubrics, Test Validity, Test Reliability, Interrater Reliability, Leadership, Robustness (Statistics), Test Construction
Virginia Assessment Group; Fax: 504-247-1232; e-mail: editor@rpajournal.com; Web site: http://www.rpajournal.com/
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A