Showing all 7 results
Peer reviewed
Caspari-Sadeghi, Sima; Forster-Heinlein, Brigitte; Maegdefrau, Jutta; Bachl, Lena – International Journal for the Scholarship of Teaching and Learning, 2021
This action research study presents the findings of using a formative assessment strategy in an online mathematics course during the worldwide outbreak of COVID-19 at the University of Passau, Germany. The main goals of this study were: (1) to enhance students' self-regulated learning by shifting the direction of assessment from instructors to the…
Descriptors: Foreign Countries, Online Courses, COVID-19, Pandemics
Steedle, Jeffrey; Pashley, Peter; Cho, YoungWoo – ACT, Inc., 2020
Three mode comparability studies were conducted on the following Saturday national ACT test dates: October 26, 2019, December 14, 2019, and February 8, 2020. The primary goal of these studies was to evaluate whether ACT scores exhibited mode effects between paper and online testing that would necessitate statistical adjustments to the online…
Descriptors: Test Format, Computer Assisted Testing, College Entrance Examinations, Scores
Peer reviewed
Kiliçkaya, Ferit – Teaching English with Technology, 2017
This study aimed to determine EFL (English as a Foreign Language) teachers' perceptions and experience regarding their use of "GradeCam Go!" to grade multiple choice tests. The results of the study indicated that the participants overwhelmingly valued "GradeCam Go!" due to its features such as grading printed forms for…
Descriptors: English (Second Language), Second Language Instruction, Teacher Attitudes, Multiple Choice Tests
Peer reviewed
Lissitz, Robert W.; Hou, Xiaodong; Slater, Sharon Cadman – Journal of Applied Testing Technology, 2012
This article investigates several questions regarding the impact of different item formats on measurement characteristics. Constructed response (CR) items and multiple choice (MC) items obviously differ in their formats and in the resources needed to score them. As such, they have been the subject of considerable discussion regarding the impact of…
Descriptors: Computer Assisted Testing, Scoring, Evaluation Problems, Psychometrics
Peer reviewed
Ko, C. C.; Cheng, C. D. – Computers & Education, 2008
Electronic examination systems, including Internet-based systems, require complicated installation, configuration, and maintenance of both software and hardware. In this paper, we present the design and development of a flexible, easy-to-use, and secure examination system (e-Test), in which any commonly used computer can be used as a…
Descriptors: Computer Assisted Testing, Computers, Program Effectiveness, Examiners
Peer reviewed
Pommerich, Mary – Journal of Technology, Learning, and Assessment, 2004
As testing moves from paper-and-pencil administration toward computerized administration, how to present tests on a computer screen becomes an important concern. Of particular concern are tests that contain necessary information that cannot be displayed on screen all at once for an item. Ideally, the method of presentation should not interfere…
Descriptors: Test Content, Computer Assisted Testing, Multiple Choice Tests, Computer Interfaces
Peer reviewed
Park, Jooyong – Journal of Educational Psychology, 2005
A new computerized testing system, which facilitates the use of short-answer-type testing, has been developed. In this system, the question of a multiple-choice problem is presented first, and the options appear briefly on the request of the test taker. The crux of this manipulation is to force students to solve the problem as if they were solving…
Descriptors: Experimental Groups, Control Groups, Computer Assisted Testing, Multiple Choice Tests