Showing all 9 results
Peer reviewed
Confrey, Jere; Shah, Meetal; Persson, Jennifer; Ciliano, Dagmara – North American Chapter of the International Group for the Psychology of Mathematics Education, 2019
This paper reports on a design-based implementation study of the use of a diagnostic classroom assessment tool framed on learning trajectories (LTs) for middle grades mathematics, where teachers and students are provided immediate data on students' progress along LTs. The study answers the question: "How can one characterize the challenges…
Descriptors: Middle School Students, Mathematics Instruction, Barriers, Diagnostic Tests
National Assessment Governing Board, 2017
The National Assessment of Educational Progress (NAEP) is the only continuing and nationally representative measure of trends in academic achievement of U.S. elementary and secondary school students in various subjects. For more than four decades, NAEP assessments have been conducted periodically in reading, mathematics, science, writing, U.S.…
Descriptors: Mathematics Achievement, Multiple Choice Tests, National Competency Tests, Educational Trends
Peer reviewed
Li, Xueming; Sireci, Stephen G. – Educational and Psychological Measurement, 2013
Validity evidence based on test content is of essential importance in educational testing. One source for such evidence is an alignment study, which helps evaluate the congruence between tested objectives and those specified in the curriculum. However, the results of an alignment study do not always sufficiently capture the degree to which a test…
Descriptors: Content Validity, Multidimensional Scaling, Data Analysis, Educational Testing
Peer reviewed
Meyers, Jason L.; Miller, G. Edward; Way, Walter D. – Applied Measurement in Education, 2009
In operational testing programs using item response theory (IRT), item parameter invariance is threatened when an item appears in a different location on the live test than it did when it was field tested. This study utilizes data from a large state's assessments to model change in Rasch item difficulty (RID) as a function of item position change,…
Descriptors: Test Items, Test Content, Testing Programs, Simulation
National Assessment Governing Board, 2010
Since 1973, the National Assessment of Educational Progress (NAEP) has gathered information about student achievement in mathematics. Results of these periodic assessments, produced in print and web-based formats, provide valuable information to a wide variety of audiences. The NAEP Assessment in mathematics has two components that differ in…
Descriptors: Mathematics Achievement, Academic Achievement, Audiences, National Competency Tests
National Assessment Governing Board, 2008
An assessment framework is like a blueprint, laying out the basic design of the assessment by describing the mathematics content that should be tested and the types of assessment questions that should be included. It also describes how the various design factors should be balanced across the assessment. This is an assessment framework, not a…
Descriptors: Test Items, Student Evaluation, National Competency Tests, Data Analysis
Donovan, Jenny; Hutton, Penny; Lennon, Melissa; O'Connor, Gayl; Morrissey, Noni – Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2008
In July 2001, the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA) agreed to the development of assessment instruments and key performance measures for reporting on student skills, knowledge and understandings in primary science. It directed the newly established Performance Measurement and Reporting Taskforce…
Descriptors: Foreign Countries, Scientific Literacy, Media Literacy, Scientific Concepts
Donovan, Jenny; Lennon, Melissa; O'Connor, Gayl; Morrissey, Noni – Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2008
In 2003 the first nationally-comparable science assessment was designed, developed and carried out under the auspices of the national council of education ministers, the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA). In 2006 a second science assessment was conducted and, for the first time nationally, the…
Descriptors: Foreign Countries, Scientific Literacy, Science Achievement, Comparative Analysis
Wu, Margaret; Donovan, Jenny; Hutton, Penny; Lennon, Melissa – Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2008
In July 2001, the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA) agreed to the development of assessment instruments and key performance measures for reporting on student skills, knowledge and understandings in primary science. It directed the newly established Performance Measurement and Reporting Taskforce…
Descriptors: Foreign Countries, Scientific Literacy, Science Achievement, Comparative Analysis