Showing all 6 results
Peer reviewed
Guo, Wenjing; Wind, Stefanie A. – Journal of Educational Measurement, 2021
The use of mixed-format tests made up of multiple-choice (MC) items and constructed response (CR) items is popular in large-scale testing programs, including the National Assessment of Educational Progress (NAEP) and many district- and state-level assessments in the United States. Rater effects, or raters' scoring tendencies that result in…
Descriptors: Test Format, Multiple Choice Tests, Scoring, Test Items
Peer reviewed
Sangwin, Christopher J.; Jones, Ian – Educational Studies in Mathematics, 2017
In this paper we report the results of an experiment designed to test the hypothesis that when faced with a question involving the inverse direction of a reversible mathematical process, students solve a multiple-choice version by verifying the answers presented to them by the direct method, not by undertaking the actual inverse calculation.…
Descriptors: Mathematics Achievement, Mathematics Tests, Multiple Choice Tests, Computer Assisted Testing
Peer reviewed
Tetteh, Godson Ayertei; Sarpong, Frederick Asafo-Adjei – Journal of International Education in Business, 2015
Purpose: The purpose of this paper is to explore the influence of constructivism on assessment approach, where the type of question (true or false, multiple-choice, calculation or essay) is used productively. Although the student's approach to learning and the teacher's approach to teaching are concepts that have been widely researched, few…
Descriptors: Foreign Countries, Outcomes of Education, Student Evaluation, Test Format
Peer reviewed
Wang, Zhen; Yao, Lihua – ETS Research Report Series, 2013
The current study used simulated data to investigate the properties of a newly proposed method (Yao's rater model) for modeling rater severity and its distribution under different conditions. Our study examined the effects of rater severity, distributions of rater severity, the difference between item response theory (IRT) models with rater effect…
Descriptors: Test Format, Test Items, Responses, Computation
Peer reviewed
Quenette, Mary A.; Nicewander, W. Alan; Thomasson, Gary L. – Applied Psychological Measurement, 2006
Model-based equating was compared to empirical equating of an Armed Services Vocational Aptitude Battery (ASVAB) test form. The model-based equating was done using item pretest data to derive item response theory (IRT) item parameter estimates for those items that were retained in the final version of the test. The analysis of an ASVAB test form…
Descriptors: Item Response Theory, Multiple Choice Tests, Test Items, Computation
Peer reviewed
Schoen, Harold L.; And Others – Journal for Research in Mathematics Education, 1990
Describes the responses of fifth- to eighth-grade students to different types of test items requiring estimation. Reports that performance differed by item format, the types of numbers and operations in the items, and the grade level of students. (Author/YP)
Descriptors: Cognitive Processes, Computation, Elementary School Mathematics, Elementary Secondary Education