Lakin, Joni M. – Educational Assessment, 2014
The purpose of test directions is to familiarize examinees with a test so that they respond to items in the manner intended. However, changes in educational measurement, as well as in the U.S. student population, present new challenges to test directions and increase the impact that differential familiarity could have on the validity of test score…
Descriptors: Test Content, Test Construction, Best Practices, Familiarity

Zenisky, April L.; Hambleton, Ronald K.; Robin, Frederic – Educational Assessment, 2004
Differential item functioning (DIF) analyses are a routine part of the development of large-scale assessments. Less common are studies to understand the potential sources of DIF. The goals of this study were (a) to identify gender DIF in a large-scale science assessment and (b) to look for trends in the DIF and non-DIF items due to content,…
Descriptors: Program Effectiveness, Test Format, Science Tests, Test Items