Publication Date
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 2 |
| Since 2007 (last 20 years) | 3 |
Author
| Abdullah, Ain Nadzimah | 1 |
| Anuardi, Muhammad Nur Adilin Mohd | 1 |
| Heng, Chan Swee | 1 |
| Lee, Ng Wen | 1 |
| Patra, Rakesh | 1 |
| Rybanov, Alexander Aleksandrovich | 1 |
| Saha, Sujan Kumar | 1 |
| Shamsuddin, Wan Noor Farah Wan | 1 |
| Wei, Lim Chia | 1 |
Publication Type
| Journal Articles | 3 |
| Reports - Descriptive | 1 |
| Reports - Evaluative | 1 |
| Reports - Research | 1 |
| Tests/Questionnaires | 1 |
Education Level
| Higher Education | 1 |
| Postsecondary Education | 1 |
Location
| Malaysia | 1 |
A Hybrid Approach for Automatic Generation of Named Entity Distractors for Multiple Choice Questions
Patra, Rakesh; Saha, Sujan Kumar – Education and Information Technologies, 2019
Assessment plays an important role in learning, and Multiple Choice Questions (MCQs) are quite popular in large-scale evaluations. Technology-enabled learning necessitates smart assessment; therefore, automatic MCQ generation has become increasingly popular over the last two decades. Despite a large amount of research effort, system-generated MCQs are…
Descriptors: Multiple Choice Tests, High Stakes Tests, Semantics, Evaluation Methods
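The abstract above describes distractor generation only in general terms. As a rough illustration of the basic idea (not the authors' hybrid method), the following Python sketch selects named-entity distractors of the same entity type as the correct answer, ranked by a simple context-overlap similarity; the entity names, the `Entity` structure, and the token-overlap measure are all assumptions made for this example.

```python
# A minimal sketch (assumed, not the paper's method): pick MCQ distractors by
# choosing same-type named entities whose surrounding contexts most resemble
# the correct answer's context.
from dataclasses import dataclass

@dataclass
class Entity:
    text: str
    etype: str          # e.g. "PERSON", "LOCATION"
    context: set[str]   # content words observed near the entity in the source text

def similarity(a: Entity, b: Entity) -> float:
    """Jaccard overlap of the contexts in which the two entities appear."""
    if not a.context or not b.context:
        return 0.0
    return len(a.context & b.context) / len(a.context | b.context)

def pick_distractors(answer: Entity, pool: list[Entity], k: int = 3) -> list[str]:
    """Return up to k same-type entities ranked by contextual similarity to the answer."""
    candidates = [e for e in pool if e.etype == answer.etype and e.text != answer.text]
    candidates.sort(key=lambda e: similarity(answer, e), reverse=True)
    return [e.text for e in candidates[:k]]

if __name__ == "__main__":
    answer = Entity("Marie Curie", "PERSON", {"physics", "radioactivity", "nobel"})
    pool = [
        Entity("Niels Bohr", "PERSON", {"physics", "quantum", "nobel"}),
        Entity("Paris", "LOCATION", {"france", "capital"}),
        Entity("Lise Meitner", "PERSON", {"physics", "radioactivity"}),
        Entity("Charles Dickens", "PERSON", {"novel", "literature"}),
    ]
    print(pick_distractors(answer, pool))  # ['Lise Meitner', 'Niels Bohr', 'Charles Dickens']
```

A fuller system would replace the token-overlap score with semantic similarity (e.g., embeddings or knowledge-base features), which is closer to what a hybrid approach presumably combines.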
Lee, Ng Wen; Shamsuddin, Wan Noor Farah Wan; Wei, Lim Chia; Anuardi, Muhammad Nur Adilin Mohd; Heng, Chan Swee; Abdullah, Ain Nadzimah – International Journal of Evaluation and Research in Education, 2021
Criticisms of multiple choice questions (MCQs) include the possibility of students answering MCQs correctly by guessing, and MCQs are generally said to fall short in cultivating independent learning skills, such as students taking charge of their own learning goals. Countering these common concerns, this research used online MCQ exercises with multiple…
Descriptors: Multiple Choice Tests, Test Items, Self Management, Independent Study
Rybanov, Alexander Aleksandrovich – Turkish Online Journal of Distance Education, 2013
A set of criteria is offered for assessing the efficiency of the process of forming answers to multiple-choice test items. To increase the accuracy of computer-assisted testing results, it is suggested that the dynamics of forming the final answer be assessed using the following factors: a loss-of-time factor and a correct-choice factor. The model…
Descriptors: Evaluation Criteria, Efficiency, Multiple Choice Tests, Test Items
