Showing all 5 results
Parry, James R. – Online Submission, 2020
This paper presents research and provides a method to ensure that parallel assessments generated from a large test-item database maintain equitable difficulty and content coverage each time the assessment is presented. To maintain fairness and validity, it is important that all instances of an assessment that is intended to test the…
Descriptors: Culture Fair Tests, Difficulty Level, Test Items, Test Validity
Peer reviewed
Direct link
Towns, Marcy H. – Journal of Chemical Education, 2014
Chemistry faculty members are highly skilled in obtaining, analyzing, and interpreting physical measurements, but often they are less skilled in measuring student learning. This work provides guidance for chemistry faculty from the research literature on multiple-choice item development in chemistry. Areas covered include content, stem, and…
Descriptors: Multiple Choice Tests, Test Construction, Psychometrics, Test Items
National Assessment Governing Board, 2008
An assessment framework is like a blueprint, laying out the basic design of the assessment by describing the mathematics content that should be tested and the types of assessment questions that should be included. It also describes how the various design factors should be balanced across the assessment. This is an assessment framework, not a…
Descriptors: Test Items, Student Evaluation, National Competency Tests, Data Analysis
Kitao, Kenji; Kitao, S. Kathleen – 1996
After tests are administered, they are scored and the scores are given back to the students. If the real purpose of the test is to improve student learning, simply returning the scores is not sufficient. The first step in evaluating test results is to be sure that the test has tested the intended concepts and content. Calculating the mean and the…
Descriptors: Difficulty Level, English (Second Language), Evaluation Methods, Feedback
Wu, Margaret; Donovan, Jenny; Hutton, Penny; Lennon, Melissa – Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2008
In July 2001, the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA) agreed to the development of assessment instruments and key performance measures for reporting on student skills, knowledge and understandings in primary science. It directed the newly established Performance Measurement and Reporting Taskforce…
Descriptors: Foreign Countries, Scientific Literacy, Science Achievement, Comparative Analysis