ERIC Number: ED400322
Record Type: Non-Journal
Publication Date: 1996-Apr-10
Pages: 31
Abstractor: N/A
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
Teacher Assessment Competency: A Rasch Model Analysis.
Zhang, Zhicheng
A 67-item Assessment Practices Inventory (API) was administered to 311 inservice teachers. The application of principal components analysis to the data yielded a 6-factor solution that explained 64% of the variance. The Rasch rating scale model was applied to the API to estimate item calibrations. The factor-analyzed assessment categories were then ranked by difficulty based on mean logits, which ranged from -0.35 to 0.78. Communicating assessment results was the easiest assessment category. Interpreting standardized test results, conducting classroom statistics, and using assessment results in decision making constituted the most difficult assessment categories. Nonachievement-based grading was more difficult than recommended grading practices, and performance assessment was more difficult than paper-and-pencil tests. The identification of the hierarchy of classroom assessment categories provided useful information for measurement training and teacher education in assessment. The findings justified ongoing research on grading practices and supported the call in the assessment community for a shift of instructional emphasis from traditional objective tests to alternative assessments. (Contains 2 figures, 7 tables, and 53 references.) (Author/SLD)
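The abstract outlines a two-step analysis: a principal components analysis of the 67-item API, followed by ranking the factor-derived assessment categories by mean Rasch item calibration (in logits). The sketch below is not the author's code; it illustrates that pipeline under stated assumptions, using simulated responses and placeholder item calibrations (a Rasch rating scale model would be fit separately to obtain the real logits), with hypothetical category labels.

```python
# Minimal sketch of the analysis pipeline described in the abstract.
# All data here are simulated placeholders, not the study's data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical response matrix: 311 teachers x 67 API items on a rating scale.
responses = rng.integers(1, 6, size=(311, 67)).astype(float)

# Step 1: principal components analysis; the study retained a 6-factor solution.
pca = PCA(n_components=6)
pca.fit(responses)
print("Variance explained by 6 components:",
      round(float(pca.explained_variance_ratio_.sum()), 2))

# Step 2: rank assessment categories by mean item calibration (logits).
# `item_logits` stands in for Rasch rating scale calibrations; `categories`
# maps each item to one of the six factor-derived categories (both hypothetical).
item_logits = rng.normal(0.0, 0.5, size=67)
categories = rng.integers(0, 6, size=67)
category_names = [f"category_{k}" for k in range(6)]

mean_logits = {category_names[k]: float(item_logits[categories == k].mean())
               for k in range(6)}

# Higher mean logit indicates a more difficult assessment category.
for name, logit in sorted(mean_logits.items(), key=lambda kv: kv[1]):
    print(f"{name}: {logit:+.2f}")
```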
Descriptors: Alternative Assessment, Decision Making, Difficulty Level, Educational Assessment, Educational Practices, Factor Analysis, Factor Structure, Grading, Item Response Theory, Rating Scales, Standardized Tests, Teacher Competencies, Teacher Education, Test Construction, Test Interpretation, Test Results
Publication Type: Reports - Research; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A