Publication Date
In 2025: 0
Since 2024: 1
Since 2021 (last 5 years): 4
Since 2016 (last 10 years): 5
Since 2006 (last 20 years): 6
Descriptor
Computer Assisted Testing: 7
Test Format: 7
College Students: 4
Engineering Education: 3
Foreign Countries: 3
Student Evaluation: 3
Cheating: 2
Comparative Analysis: 2
Student Attitudes: 2
Tests: 2
Access to Computers: 1
Source
Assessment & Evaluation in Higher Education: 7
Publication Type
Journal Articles: 7
Reports - Research: 4
Reports - Evaluative: 2
Information Analyses: 1
Education Level
Higher Education: 5
Postsecondary Education: 5
Location
China: 1
Norway: 1
United Kingdom (England): 1
Dawson, Phillip; Nicola-Richmond, Kelli; Partridge, Helen – Assessment & Evaluation in Higher Education, 2024
Educators set restrictions in examinations to enable them to assess learning outcomes under particular conditions. The open-book versus closed-book binary is an example of the sorts of restrictions examiners have traditionally set. In the late 2000s this was expanded to a trinary to include open-web examinations. However, the current technology…
Descriptors: Electronic Learning, Computer Assisted Testing, Supervision, Cheating
Chan, Cecilia Ka Yuk – Assessment & Evaluation in Higher Education, 2023
With advances in technology, digital and information literacy has become crucial to how employers select candidates in this digital AI era. For most students, receiving and producing electronic text has become the norm, and thus examinations with writing components completed by hand may not accurately reflect their abilities. It…
Descriptors: Test Format, Handwriting, Stakeholders, Feedback (Response)
Janse van Rensburg, Cecile; Coetzee, Stephen A.; Schmulian, Astrid – Assessment & Evaluation in Higher Education, 2022
This study reports on the incorporation of mobile instant messaging (MIM) into assessments as a collaborative learning tool, enabling students to socially construct knowledge and develop their collaborative problem-solving competence while being assessed individually. In particular, this study explores: what is the extent and timing of students'…
Descriptors: Computer Mediated Communication, Student Evaluation, Peer Relationship, Cooperative Learning
Chirumamilla, Aparna; Sindre, Guttorm; Nguyen-Duc, Anh – Assessment & Evaluation in Higher Education, 2020
A concern raised about the transition from pen-and-paper examinations to electronic examinations is whether it will make cheating easier. This article investigates how teachers and students perceive differences in the ease of cheating across three types of written examination: paper exams, bring-your-own-device e-exams and e-exams…
Descriptors: Cheating, Computer Assisted Testing, Test Format, Testing
Gu, Lin; Ling, Guangming; Liu, Ou Lydia; Yang, Zhitong; Li, Guirong; Kardanova, Elena; Loyalka, Prashant – Assessment & Evaluation in Higher Education, 2021
We examine the effects of computer-based versus paper-based assessment of critical thinking skills, adapted from English (in the U.S.) to Chinese. Using data collected under random assignment between the two modes in multiple Chinese colleges, we investigate mode effects from multiple perspectives: mean scores, measurement precision, item…
Descriptors: Critical Thinking, Tests, Test Format, Computer Assisted Testing
Irwin, Brian; Hepplestone, Stuart – Assessment & Evaluation in Higher Education, 2012
There have been calls in the literature for changes to assessment practices in higher education, to increase flexibility and give learners more control over the assessment process. This article explores the possibilities of allowing students to choose the format in which to present their work, as a starting point for changing assessment, based on…
Descriptors: Student Evaluation, College Students, Selection, Computer Assisted Testing

Lloyd, D.; And Others – Assessment & Evaluation in Higher Education, 1996
In an engineering technology course at Coventry University (England), the utility of computer-assisted tests was compared with that of traditional paper-based tests. It was found that the computer-based technique was acceptable to students, produced valid results, and demonstrated potential for saving staff time. (Author/MSE)
Descriptors: Comparative Analysis, Computer Assisted Testing, Efficiency, Engineering Education