Showing all 8 results
Peer reviewed
Yang, Chunliang; Li, Jiaojiao; Zhao, Wenbo; Luo, Liang; Shanks, David R. – Educational Psychology Review, 2023
Practice testing is a powerful tool to consolidate long-term retention of studied information, facilitate subsequent learning of new information, and foster knowledge transfer. However, practitioners frequently express the concern that tests are anxiety-inducing and that their employment in the classroom should be minimized. The current review…
Descriptors: Tests, Test Format, Testing, Test Wiseness
Peer reviewed
Guo, Hongwen; Rios, Joseph A.; Ling, Guangming; Wang, Zhen; Gu, Lin; Yang, Zhitong; Liu, Lydia O. – ETS Research Report Series, 2022
Different variants of the selected-response (SR) item type have been developed for various reasons (e.g., simulating realistic situations or examining critical-thinking and problem-solving skills). Generally, these SR variants are more complex than traditional multiple-choice (MC) items, which may be more challenging to test…
Descriptors: Test Format, Test Wiseness, Test Items, Item Response Theory
Anaya, Lina; Iriberri, Nagore; Rey-Biel, Pedro; Zamarro, Gema – Annenberg Institute for School Reform at Brown University, 2021
Standardized assessments are widely used to determine access to educational resources with important consequences for later economic outcomes in life. However, many design features of the tests themselves may lead to psychological reactions influencing performance. In particular, the level of difficulty of the earlier questions in a test may…
Descriptors: Test Construction, Test Wiseness, Test Items, Difficulty Level
Peer reviewed
Papanastasiou, Elena C. – Practical Assessment, Research & Evaluation, 2015
Because good measurement depends in part on accurate estimation of item characteristics, it is essential that test developers become aware of discrepancies that may exist in the item parameters before and after item review. The purpose of this study was to examine the answer-changing patterns of students while taking paper-and-pencil multiple…
Descriptors: Psychometrics, Difficulty Level, Test Items, Multiple Choice Tests
Peer reviewed
Kim, Sooyeon; Moses, Tim – ETS Research Report Series, 2014
The purpose of this study was to investigate the potential impact of misrouting under a 2-stage multistage test (MST) design, which includes 1 routing module and 3 second-stage modules. Simulations were used to create a situation in which a large group of examinees took each of the 3 possible MST paths (high, middle, and low). We compared differences in…
Descriptors: Comparative Analysis, Difficulty Level, Scores, Test Wiseness
Peer reviewed
Plassmann, Sibylle; Zeidler, Beate – Language Learning in Higher Education, 2014
Language testing involves making decisions: not only about the test taker's results, but also about the test construct and the measures taken to ensure quality. This article takes the German test "telc Deutsch C1 Hochschule" as an example to illustrate this decision-making process in an academic context. The test is used for university…
Descriptors: Language Tests, Test Wiseness, Test Construction, Decision Making
Plake, Barbara S.; And Others – 1981
The effects of item arrangement (easy-hard, uniform, and random), test anxiety, and sex were investigated using a 48-item multiple-choice mathematics test assembled from American College Testing Program items and taken by motivated upper-level undergraduates and beginning graduate students. Four measures of anxiety were used: the Achievement Test…
Descriptors: Academic Achievement, Achievement Tests, Difficulty Level, Higher Education
Plake, Barbara S.; And Others – 1980
Number-right and elimination scores were analyzed on a 48-item college-level mathematics test assembled from pretest data in three forms that varied the item ordering: easy-hard, uniform, or random. Half of the forms contained information explaining the item arrangement and suggesting strategies for taking the test. Several anxiety…
Descriptors: Difficulty Level, Higher Education, Multiple Choice Tests, Quantitative Tests