ERIC Number: ED417199
Record Type: Non-Journal
Publication Date: 1997-Oct
Pages: 120
Abstractor: N/A
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
ACT's NAEP Redesign Project: Assessment Design Is the Key to Useful and Stable Assessment Results. Working Paper Series.
Bay, Luz; Chen, Lee; Hanson, Bradley A.; Happel, Jay; Kolen, Michael J.; Miller, Timothy; Pommerich, Mary; Sconing, James; Wang, Tianyou; Welch, Catherine
This report presents an investigation by the American College Testing Program (ACT) of an alternative design for the National Assessment of Educational Progress (NAEP). The proposed design greatly simplifies the data collection and analysis procedures needed to produce assessment results and has the potential to produce results that are more timely and easier to interpret. The plan calls for developing individual NAEP forms, each of which represents, as closely as possible, the assessment questions from the domain of knowledge being measured by an NAEP construct. Sets of these forms could be administered, in random order, to students in the schools, replacing the balanced incomplete block (BIB) design currently used. At least for the 1996 science assessment, assessments constructed under the BIB design do not closely represent the content framework. Enhanced procedures are also suggested for developing precise content and statistical specifications for individual forms and for pretesting items. The basic scores that ACT suggests using to produce group assessment results are calculated by weighting item scores from multiple-choice and constructed-response items, where the weights are determined, a priori, by content specialists. These weights should relate more closely to the weighting intended by content specialists than current NAEP weights do. Scaling, equating, and score distribution estimation methods are described that rely on less stringent psychometric and statistical assumptions than current procedures do. Issues in sampling, including sample size requirements, sample design, and estimating standard errors, are also examined, as are procedures for reporting score distributions that reflect group performance on content domains. The feasibility of using such domain scores to measure trends and to facilitate setting NAEP standards is explored.
ACT recommends focusing on the design of assessments and the data collection methods rather than on complex analysis procedures. (Contains 18 tables, 9 figures, and 41 references.) (SLD)
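The basic-score calculation summarized above — item scores from multiple-choice and constructed-response items combined with weights fixed a priori by content specialists — can be sketched as follows. This is an illustrative sketch only, not the report's actual procedure; the item names, score ranges, and weight values are hypothetical.

```python
# Illustrative sketch (assumptions, not the report's procedure): a basic
# score formed as a weighted sum of item scores, where multiple-choice
# (MC) items are scored 0/1, constructed-response (CR) items are scored
# on a partial-credit scale, and the weights are fixed a priori by
# content specialists rather than estimated from the data.

def weighted_basic_score(item_scores, weights):
    """Return the weighted sum of item scores using a priori weights."""
    assert set(item_scores) == set(weights), "every item needs a weight"
    return sum(weights[item] * score for item, score in item_scores.items())

# Hypothetical responses: two MC items (0/1) and one CR item (0-4 scale).
scores = {"mc1": 1, "mc2": 0, "cr1": 3}
# Hypothetical a priori weights; the CR item is weighted more heavily.
weights = {"mc1": 1.0, "mc2": 1.0, "cr1": 2.0}
print(weighted_basic_score(scores, weights))  # 1.0*1 + 1.0*0 + 2.0*3 = 7.0
```

Because the weights are set in advance by content specialists, the contribution of each item type to the composite is known before any data are collected, which is the sense in which such weights can track intended content emphasis more directly than empirically derived weights.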
Descriptors: Data Analysis, Data Collection, Educational Assessment, Elementary Secondary Education, Equated Scores, Estimation (Mathematics), National Surveys, Research Design, Research Methodology, Sample Size, Sampling, Scaling, Standards, Trend Analysis
U.S. Department of Education, Office of Educational Research and Improvement, National Center for Education Statistics, 555 New Jersey Avenue, N.W., Room 400, Washington, DC 20208-5654.
Publication Type: Reports - Evaluative
Education Level: N/A
Audience: N/A
Language: English
Sponsor: National Center for Education Statistics (ED), Washington, DC.
Authoring Institution: American Coll. Testing Program, Iowa City, IA.
Identifiers - Assessments and Surveys: National Assessment of Educational Progress
Grant or Contract Numbers: N/A
Author Affiliations: N/A