Showing all 8 results
Peer reviewed
PDF on ERIC
Abdullah Al Fraidan; Meznah Saud Abdulaziz Alsubaie – Educational Process: International Journal, 2025
Background: This study examines the effect of test anxiety on the academic performance of postgraduate female students, focusing on their perceptions and experiences in open-book exams (OBE) and closed-book exams (CBE). Method: A qualitative case study design was employed using the Thinking Aloud Protocol (TAP) to collect data from five Saudi…
Descriptors: Test Anxiety, Vocabulary, Females, Books
Peer reviewed
Direct link
Xu, Yuejin; Iran-Nejad, Asghar; Thoma, Stephen J. – Journal of Interactive Online Learning, 2007
The purpose of the study was to determine comparability of an online version to the original paper-pencil version of Defining Issues Test 2 (DIT2). This study employed methods from both Classical Test Theory (CTT) and Item Response Theory (IRT). Findings from CTT analyses supported the reliability and discriminant validity of both versions.…
Descriptors: Computer Assisted Testing, Test Format, Comparative Analysis, Test Theory
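For context on the CTT reliability analysis referenced in the entry above: Cronbach's alpha is the usual internal-consistency estimate in Classical Test Theory. Below is a minimal sketch in Python, using simulated 0/1 response matrices as stand-ins for the paper-pencil and online administrations; the data, sizes, and names are illustrative assumptions, not drawn from the DIT2 study.

import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha for an (examinees x items) 0/1 response matrix."""
    k = responses.shape[1]
    item_vars = responses.var(axis=0, ddof=1)
    total_var = responses.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
theta = rng.normal(size=(200, 1))          # latent ability per examinee
b = rng.normal(size=(1, 20))               # difficulty per item
prob = 1.0 / (1.0 + np.exp(-(theta - b)))  # Rasch-style response probabilities
paper = (rng.random((200, 20)) < prob).astype(int)
online = (rng.random((200, 20)) < prob).astype(int)
print(f"alpha (paper-pencil): {cronbach_alpha(paper):.3f}")
print(f"alpha (online):       {cronbach_alpha(online):.3f}")

Comparable alpha values across the two administrations are the kind of evidence the CTT portion of such a study reports.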
Tollefson, Nona; Chen, Ju Shan – 1986
This study compared item difficulty and item discrimination indices for parallel multiple-choice items in three content areas: measurement concepts, statistical terminology, and synonyms. The statistics and measurement items were administered in classes where graduate students taking the test were studying the content. Vocabulary items represented…
Descriptors: Difficulty Level, Graduate Students, Higher Education, Item Analysis
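The indices compared in this and the following Tollefson entries are standard CTT item statistics: item difficulty is the proportion of examinees answering the item correctly, and item discrimination is commonly reported as the corrected item-total (point-biserial) correlation. A minimal sketch on simulated data follows; it illustrates the definitions, not the study's own computation.

import numpy as np

def item_difficulty(responses):
    """Proportion correct per item (higher values = easier items)."""
    return responses.mean(axis=0)

def item_discrimination(responses):
    """Corrected item-total correlation: each item vs. the rest of the test."""
    n_items = responses.shape[1]
    totals = responses.sum(axis=1)
    return np.array([
        np.corrcoef(responses[:, j], totals - responses[:, j])[0, 1]
        for j in range(n_items)
    ])

rng = np.random.default_rng(1)
data = (rng.random((100, 10)) < 0.7).astype(int)  # illustrative 0/1 responses
print(item_difficulty(data).round(2))
print(item_discrimination(data).round(2))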
Tollefson, Nona; Tripp, Alice – 1986
The item difficulty and item discrimination of three multiple-choice item formats were compared in experimental and non-experimental settings. In the experimental study, 104 graduate students were randomly assigned to complete one of three forms of a multiple-choice test: (1) a complex alternative ("none of the above") as the correct answer; (2) a…
Descriptors: Achievement Tests, Difficulty Level, Discriminant Analysis, Graduate Students
Tollefson, Nona; Tripp, Alice – 1983
This study compared the item difficulty and item discrimination of three multiple-choice item formats. The formats studied were: a complex alternative ("none of the above") as the correct answer; a complex alternative as a foil; and the one-correct-answer format. One hundred four graduate students were randomly assigned to complete…
Descriptors: Analysis of Variance, Difficulty Level, Graduate Students, Higher Education
Peer reviewed
Clariana, Roy B. – International Journal of Instructional Media, 2004
This investigation considers the instructional effects of color as an over-arching context variable when learning from computer displays. The purpose of this investigation is to examine the posttest retrieval effects of color as a local, extra-item non-verbal lesson context variable for constructed-response versus multiple-choice posttest…
Descriptors: Instructional Effectiveness, Graduate Students, Color, Computer System Design
Chissom, Brad; Chukabarah, Prince C. O. – 1985
The comparative effects of various sequences of test items were examined for over 900 graduate students enrolled in an educational research course at The University of Alabama, Tuscaloosa. The experiment, which was conducted a total of four times using four separate tests, presented three different arrangements of 50 multiple-choice items: (1)…
Descriptors: Analysis of Variance, Comparative Testing, Difficulty Level, Graduate Students
Peer reviewed
Wise, Steven L.; And Others – Journal of Educational Measurement, 1992
Performance of 156 undergraduate and 48 graduate students on a self-adapted test (SFAT)--students choose the difficulty level of their test items--was compared with performance on a computer-adapted test (CAT). Those taking the SFAT obtained higher ability scores and reported lower posttest state anxiety than did CAT takers. (SLD)
Descriptors: Adaptive Testing, Comparative Testing, Computer Assisted Testing, Difficulty Level
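For readers unfamiliar with the contrast in the Wise et al. entry: in a self-adapted test the examinee chooses each item's difficulty level, whereas a computer-adaptive test selects items algorithmically. Below is a hedged sketch of the canonical CAT selection step under a one-parameter (Rasch) model, where information peaks at the item whose difficulty is closest to the current ability estimate. The update rule and numbers are illustrative assumptions, not the procedure the study describes.

import numpy as np

def rasch_prob(theta, b):
    """P(correct) for ability theta on an item of difficulty b (Rasch model)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def next_item(theta, difficulties, used):
    """Unused item with difficulty nearest theta, i.e., maximum information."""
    candidates = [j for j in range(len(difficulties)) if j not in used]
    return min(candidates, key=lambda j: abs(difficulties[j] - theta))

rng = np.random.default_rng(2)
bank = rng.normal(0.0, 1.0, size=30)  # hypothetical item difficulty bank
theta, true_theta, used = 0.0, 0.8, set()
for step in range(10):
    j = next_item(theta, bank, used)
    used.add(j)
    correct = rng.random() < rasch_prob(true_theta, bank[j])
    theta += (0.5 if correct else -0.5) / (step + 1)  # shrinking-step update
print(f"ability estimate after 10 items: {theta:.2f}")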