Weeks, Jonathan; Baron, Patricia – Educational Testing Service, 2021
The current project, Exploring Math Education Relations by Analyzing Large Data Sets (EMERALDS) II, is an attempt to identify specific Common Core State Standards procedural, conceptual, and problem-solving competencies in earlier grades that best predict success in algebraic areas in later grades. The data for this study include two cohorts of…
Descriptors: Mathematics Education, Common Core State Standards, Problem Solving, Mathematics Tests
Fife, James H.; Graf, Edith Aurora; Ohls, Sarah – Educational Testing Service, 2011
Six tasks, selected from assessments administered in 2007 as part of the Cognitively Based Assessment of, for, and as Learning (CBAL) project, were revised in an effort to remove difficulties with the tasks that were unrelated to the construct being assessed. Because the revised tasks were piloted on a different population from the original…
Descriptors: Mathematics Tests, Responses, Test Construction, Construct Validity
Tan, Xuan; Xiang, Bihua; Dorans, Neil J.; Qu, Yanxuan – Educational Testing Service, 2010
The nature of the matching criterion (usually the total score) in the study of differential item functioning (DIF) has been shown to impact the accuracy of different DIF detection procedures. One of the topics related to the nature of the matching criterion is whether the studied item should be included. Although many studies exist that suggest…
Descriptors: Test Bias, Test Items, Item Response Theory
Rose, Norman; von Davier, Matthias; Xu, Xueli – Educational Testing Service, 2010
Large-scale educational surveys are low-stakes assessments of educational outcomes conducted using nationally representative samples. In these surveys, students do not receive individual scores, and the outcome of the assessment is inconsequential for respondents. The low-stakes nature of these surveys, as well as variations in average performance…
Descriptors: Item Response Theory, Educational Assessment, Data Analysis, Case Studies
Sinharay, Sandip; Haberman, Shelby J.; Jia, Helena – Educational Testing Service, 2011
Standard 3.9 of the "Standards for Educational and Psychological Testing" (American Educational Research Association, American Psychological Association, & National Council on Measurement in Education, 1999) demands evidence of model fit when an item response theory (IRT) model is used to make inferences from a data set. We applied two recently…
Descriptors: Item Response Theory, Goodness of Fit, Statistical Analysis, Language Tests