Showing all 8 results
Peer reviewed
Direct link
Boote, Stacy K.; Boote, David N.; Williamson, Steven – Cogent Education, 2021
Differences in test performance between paper-based and computer-based assessments, suggested by several decades of research, have been largely ameliorated through attention to test presentation equivalence, though no studies to date have focused on graph comprehension items. Test items requiring graph comprehension are increasingly common but may be…
Descriptors: Graduate Students, Masters Programs, Business Administration Education, Graphs
Peer reviewed
Full text PDF available on ERIC
Liu, Yuming; Robin, Frédéric; Yoo, Hanwook; Manna, Venessa – ETS Research Report Series, 2018
The "GRE"® Psychology test is an achievement test that measures core knowledge in 12 content domains representing the courses commonly offered at the undergraduate level. Currently, a total score and two subscores, experimental and social, are reported to test takers as well as to graduate institutions. However, the American Psychological…
Descriptors: College Entrance Examinations, Graduate Study, Psychological Testing, Scores
Peer reviewed
Full text PDF available on ERIC
Swiggett, Wanda D.; Kotloff, Laurie; Ezzo, Chelsea; Adler, Rachel; Oliveri, Maria Elena – ETS Research Report Series, 2014
The computer-based "Graduate Record Examinations"® ("GRE"®) revised General Test includes interactive item types and testing environment tools (e.g., test navigation, on-screen calculator, and help). How well do test takers understand these innovations? If test takers do not understand the new item types, these innovations may…
Descriptors: College Entrance Examinations, Graduate Study, Usability, Test Items
Peer reviewed
Direct link
Hanshaw, Larry G. – College Student Journal, 2012
This study sought to determine how students would describe their group-only cooperative testing experiences in terms of key elements of cooperative learning often cited in the literature. Written comments of 159 graduate students were analyzed, and 26 related categories of comments were derived from 495 statements of students enrolled in two…
Descriptors: Achievement Gains, Cooperative Learning, Teaching Methods, Graduate Students
Peer reviewed
Full text PDF available on ERIC
Brown, Kevin – CEA Forum, 2015
In this article, the author describes his project to take every standardized exam English majors take. During the summer and fall semesters of 2012, the author signed up for and took the GRE General Test, the Praxis Content Area Exam (English Language, Literature, and Composition: Content Knowledge), the Senior Major Field Tests in…
Descriptors: College Faculty, College English, Test Preparation, Standardized Tests
Peer reviewed
Direct link
Craig, Pippa; Gordon, Jill; Clarke, Rufus; Oldmeadow, Wendy – Assessment & Evaluation in Higher Education, 2009
This study aimed to provide evidence to guide decisions on the type and timing of assessments in a graduate medical programme, by identifying whether students from particular degree backgrounds face greater difficulty in satisfying the current assessment requirements. We examined the performance rank of students in three types of assessments and…
Descriptors: Student Evaluation, Medical Education, Student Characteristics, Correlation
Peer reviewed
Direct link
Ponder, Nicole; Beatty, Sharon E.; Foxx, William – Journal of Marketing Education, 2004
Current and emerging issues concerning the written comprehensive exam process are addressed. Both the purpose and structure of this exam are considered. Survey results are presented that describe the purposes of the exam from the perspective of doctoral coordinators. Also included is a description of how marketing departments are currently…
Descriptors: Doctoral Programs, Marketing, Exit Examinations, Business Administration Education
Peer reviewed
Full text PDF available on ERIC
Gu, Lixiong; Drake, Samuel; Wolfe, Edward W. – Journal of Technology, Learning, and Assessment, 2006
This study seeks to determine whether item features are related to observed differential item functioning (DIF) between computer- and paper-based test delivery media. Examinees responded to 60 quantitative items similar to those found on the GRE General Test in either a computer-based or paper-based medium. Thirty-eight percent of the items were…
Descriptors: Test Bias, Test Items, Educational Testing, Student Evaluation