Craig, Pippa; Gordon, Jill; Clarke, Rufus; Oldmeadow, Wendy – Assessment & Evaluation in Higher Education, 2009
This study aimed to provide evidence to guide decisions on the type and timing of assessments in a graduate medical programme by identifying whether students from particular degree backgrounds face greater difficulty in satisfying the current assessment requirements. We examined the performance rank of students in three types of assessments and…
Descriptors: Student Evaluation, Medical Education, Student Characteristics, Correlation

Ponder, Nicole; Beatty, Sharon E.; Foxx, William – Journal of Marketing Education, 2004
Current and emerging issues concerning the written comprehensive exam process are addressed. Both the purpose and structure of this exam are considered. Survey results are presented that describe the purposes of the exam from the perspective of doctoral coordinators. Also included is a description of how marketing departments are currently…
Descriptors: Doctoral Programs, Marketing, Exit Examinations, Business Administration Education

Gu, Lixiong; Drake, Samuel; Wolfe, Edward W. – Journal of Technology, Learning, and Assessment, 2006
This study seeks to determine whether item features are related to observed differential item functioning (DIF) between computer- and paper-based test delivery media. Examinees responded to 60 quantitative items similar to those found on the GRE general test in either a computer-based or paper-based medium. Thirty-eight percent of the items were…
Descriptors: Test Bias, Test Items, Educational Testing, Student Evaluation

