Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 1
Since 2006 (last 20 years): 3
Descriptor
Difficulty Level: 5
Test Format: 5
Test Items: 3
Computer Assisted Testing: 2
High School Students: 2
Item Response Theory: 2
Ability: 1
Academic Ability: 1
Adaptive Testing: 1
Cognitive Processes: 1
College Students: 1
Source
Applied Measurement in Education: 5
Author
Ascalon, M. Evelina: 1
Cho, Hyun-Jeong: 1
Davis, Bruce W.: 1
Kingston, Neal: 1
Kobrin, Jennifer L.: 1
Lee, Jaehoon: 1
Lee, Won-Chan: 1
Lim, Euijin: 1
Meyers, Lawrence S.: 1
Olea, Julio: 1
Ponsoda, Vicente: 1
Publication Type
Journal Articles: 5
Reports - Research: 5
Education Level
Elementary Education: 1
Elementary Secondary Education: 1
Grade 3: 1
Grade 4: 1
Grade 5: 1
Grade 6: 1
Grade 7: 1
Grade 8: 1
High Schools: 1
Junior High Schools: 1
Middle Schools: 1
Location
Spain: 1

Lim, Euijin; Lee, Won-Chan – Applied Measurement in Education, 2020
The purpose of this study is to address the necessity of subscore equating and to evaluate the performance of various equating methods for subtests. Assuming the random groups design and number-correct scoring, this paper analyzed real data and simulated data with four study factors including test dimensionality, subtest length, form difference in…
Descriptors: Equated Scores, Test Length, Test Format, Difficulty Level
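
For background on the equating methods such a study compares, a standard formulation under the random groups design (general psychometric notation, not taken from the abstract above): linear equating places a form X raw score x on the form Y scale by matching means and standard deviations,

    l_Y(x) = \mu_Y + (\sigma_Y / \sigma_X)(x - \mu_X),

while equipercentile equating matches the two cumulative score distributions, e_Y(x) = F_Y^{-1}(F_X(x)). Subscore equating applies such functions subtest by subtest rather than to the total score.
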
Cho, Hyun-Jeong; Lee, Jaehoon; Kingston, Neal – Applied Measurement in Education, 2012
This study examined the validity of test accommodations for third- through eighth-graders using differential item functioning (DIF) and mixture IRT models. Two data sets were used for these analyses. With the first data set (N = 51,591), we examined whether item type (i.e., story, explanation, straightforward) or item features were associated with item…
Descriptors: Testing Accommodations, Test Bias, Item Response Theory, Validity
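
For background on the models named above (standard IRT notation, not drawn from the study itself): a two-parameter logistic model gives the probability that an examinee with ability \theta answers item j correctly as

    P_j(\theta) = 1 / (1 + \exp[-a_j(\theta - b_j)]),

where a_j is the item's discrimination and b_j its difficulty. An item shows DIF when a_j or b_j differs across groups (here, accommodated versus non-accommodated examinees) after conditioning on \theta; mixture IRT models allow such differences across latent classes rather than manifest groups.
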
Ascalon, M. Evelina; Meyers, Lawrence S.; Davis, Bruce W.; Smits, Niels – Applied Measurement in Education, 2007
This article examined two item-writing guidelines: the format of the item stem and homogeneity of the answer set. Answering the call of Haladyna, Downing, and Rodriguez (2002) for empirical tests of item writing guidelines and extending the work of Smith and Smith (1988) on differential use of item characteristics, a mock multiple-choice driver's…
Descriptors: Guidelines, Difficulty Level, Standard Setting, Driver Education

Kobrin, Jennifer L.; Young, John W. – Applied Measurement in Education, 2003
Studied the cognitive equivalence of computerized and paper-and-pencil reading comprehension tests using verbal protocol analysis. Results for 48 college students indicate that the only significant difference between the computerized and paper-and-pencil tests was in the frequency of identifying important information in the passage. (SLD)
Descriptors: Cognitive Processes, College Students, Computer Assisted Testing, Difficulty Level

Ponsoda, Vicente; Olea, Julio; Rodriguez, Maria Soledad; Revuelta, Javier – Applied Measurement in Education, 1999
Compared easy and difficult versions of self-adapted tests (SAT) and computerized adaptive tests. No significant differences were found among the tests in estimated ability or posttest state anxiety in studies with 187 Spanish high school students, although other significant differences were found. Discusses implications for interpreting test…
Descriptors: Ability, Adaptive Testing, Comparative Analysis, Computer Assisted Testing
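
For background on the adaptive procedures being compared (standard CAT background, not drawn from the study itself): a computerized adaptive test typically selects the next item to maximize Fisher information at the current ability estimate, which for the two-parameter logistic model is

    I_j(\theta) = a_j^2 P_j(\theta) [1 - P_j(\theta)],

whereas a self-adapted test lets the examinee choose the difficulty level of each successive item; both are typically scored with the same IRT ability estimate.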