Showing all 5 results
Peer reviewed
Gu, Lin; Ling, Guangming; Liu, Ou Lydia; Yang, Zhitong; Li, Guirong; Kardanova, Elena; Loyalka, Prashant – Assessment & Evaluation in Higher Education, 2021
We examine the effects of computer-based versus paper-based assessment of critical thinking skills, adapted from English (in the U.S.) to Chinese. Using data collected under random assignment between the two modes in multiple Chinese colleges, we investigate mode effects from multiple perspectives: mean scores, measurement precision, item…
Descriptors: Critical Thinking, Tests, Test Format, Computer Assisted Testing
Peer reviewed
Bush, Martin – Assessment & Evaluation in Higher Education, 2015
The humble multiple-choice test is very widely used within education at all levels, but its susceptibility to guesswork makes it a suboptimal assessment tool. The reliability of a multiple-choice test is partly governed by the number of items it contains; however, longer tests are more time consuming to take, and for some subject areas, it can be…
Descriptors: Guessing (Tests), Multiple Choice Tests, Test Format, Test Reliability
Peer reviewed
Craig, Pippa; Gordon, Jill; Clarke, Rufus; Oldmeadow, Wendy – Assessment & Evaluation in Higher Education, 2009
This study aimed to provide evidence to guide decisions on the type and timing of assessments in a graduate medical programme, by identifying whether students from particular degree backgrounds face greater difficulty in satisfying the current assessment requirements. We examined the performance rank of students in three types of assessments and…
Descriptors: Student Evaluation, Medical Education, Student Characteristics, Correlation
Peer reviewed
Lloyd, D.; And Others – Assessment & Evaluation in Higher Education, 1996
In an engineering technology course at Coventry University (England), the utility of computer-assisted tests was compared with that of traditional paper-based tests. It was found that the computer-based technique was acceptable to students, produced valid results, and demonstrated potential for saving staff time. (Author/MSE)
Descriptors: Comparative Analysis, Computer Assisted Testing, Efficiency, Engineering Education
Peer reviewed
Logan, Peter; Hazel, Elizabeth – Assessment & Evaluation in Higher Education, 1999
An Australian study examined the role of language background and gender in the assessment of native and non-native English-speaking university students enrolled in physics courses. Attention was given to differential performance and was analyzed in relation to test-item type and level of student communication skills. Implications for academic…
Descriptors: College Students, Comparative Analysis, English (Second Language), Foreign Countries