Showing all 8 results
Peer reviewed
Guo, Wenjing; Wind, Stefanie A. – Journal of Educational Measurement, 2021
The use of mixed-format tests made up of multiple-choice (MC) items and constructed response (CR) items is popular in large-scale testing programs, including the National Assessment of Educational Progress (NAEP) and many district- and state-level assessments in the United States. Rater effects, or raters' scoring tendencies that result in…
Descriptors: Test Format, Multiple Choice Tests, Scoring, Test Items
Peer reviewed
Xin Wei – Grantee Submission, 2024
This empirical study investigates the relationship between the use of Universal Design (UD) elements and math performance among eighth graders. We analyzed 2017 National Assessment of Educational Progress process data using Poisson Generalized Linear Mixed-Effects Models to examine how the frequency of UD element usage varies…
Descriptors: Access to Education, Student Diversity, Low Achievement, Students with Disabilities
Peer reviewed
Woodcock, Stuart; Howard, Steven J.; Ehrich, John – School Psychology, 2020
Standardized testing is ubiquitous in educational assessment, but questions have been raised about the extent to which these test scores accurately reflect students' genuine knowledge and skills. To more rigorously investigate this issue, the current study employed a within-subject experimental design to examine item format effects on primary…
Descriptors: Elementary School Students, Grade 3, Test Items, Test Format
Reardon, Sean F.; Kalogrides, Demetra; Fahle, Erin M.; Podolsky, Anne; Zárate, Rosalía C. – Educational Researcher, 2018
Prior research suggests that males outperform females, on average, on multiple-choice items compared to their relative performance on constructed-response items. This paper characterizes the extent to which gender achievement gaps on state accountability tests across the United States are associated with those tests' item formats. Using roughly 8…
Descriptors: Test Items, Test Format, Gender Differences, Achievement Gap
Peer reviewed
PDF on ERIC
Reardon, Sean; Fahle, Erin; Kalogrides, Demetra; Podolsky, Anne; Zarate, Rosalia – Society for Research on Educational Effectiveness, 2016
Prior research demonstrates the existence of gender achievement gaps and the variation in the magnitude of these gaps across states. This paper characterizes the extent to which the variation of gender achievement gaps on standardized tests across the United States can be explained by differing state accountability test formats. A comprehensive…
Descriptors: Test Format, Gender Differences, Achievement Gap, Standardized Tests
National Assessment Governing Board, 2017
The National Assessment of Educational Progress (NAEP) is the only continuing and nationally representative measure of trends in academic achievement of U.S. elementary and secondary school students in various subjects. For more than four decades, NAEP assessments have been conducted periodically in reading, mathematics, science, writing, U.S.…
Descriptors: Mathematics Achievement, Multiple Choice Tests, National Competency Tests, Educational Trends
Bolger, Niall – 1984
This study tests the hypothesis of a gender difference in academic achievement as a function of measurement method. The biasing influence of measurement method on achievement scores has long been recognized. Campbell and Fiske (1959) suggested that a considerable proportion of the variation in test scores may be due to features of the form of test (method)…
Descriptors: Academic Achievement, Essay Tests, Foreign Countries, Multiple Choice Tests
Bay, Luz – 1998
A study was conducted to investigate the difference in student performance on multiple-choice (MC) and constructed-response (CR) items relative to the achievement levels of the National Assessment of Educational Progress (NAEP). The study included an investigation of how estimates of student performance were affected by item response theory (IRT)…
Descriptors: Academic Achievement, Comparative Analysis, Constructed Response, Cutting Scores