Publication Date: In 2025 (26)
Publication Type: Journal Articles (26), Reports - Research (26), Tests/Questionnaires (2), Information Analyses (1)
Showing 1 to 15 of 26 results
Peer reviewed | PDF on ERIC
Tom Benton – Practical Assessment, Research & Evaluation, 2025
This paper proposes an extension of linear equating that may be useful in one of two fairly common assessment scenarios. One is where different students have taken different combinations of test forms. This might occur, for example, where students have some free choice over the exam papers they take within a particular qualification. In this…
Descriptors: Equated Scores, Test Format, Test Items, Computation
Peer reviewed | PDF on ERIC
Meltem Acar Güvendir; Seda Donat Bacioglu; Hasan Özgür; Sefa Uyanik; Fatmagül Gürbüz Akçay; Emre Güvendir – International Journal of Psychology and Educational Studies, 2025
Different types of test items influence students' test anxiety, and physiological measures such as heart rate provide a means of measuring this anxiety. This study aimed to explore the connection between test anxiety and examination item formats. It centered on 20 junior university students in Western Türkiye. The research monitored students'…
Descriptors: Foreign Countries, Test Anxiety, Measurement Techniques, Physiology
Peer reviewed | Direct link
Sohee Kim; Ki Lynn Cole – International Journal of Testing, 2025
This study conducted a comprehensive comparison of Item Response Theory (IRT) linking methods applied to a bifactor model, examining their performance on both multiple choice (MC) and mixed format tests within the common item nonequivalent group design framework. Four distinct multidimensional IRT linking approaches were explored, consisting of…
Descriptors: Item Response Theory, Comparative Analysis, Models, Item Analysis
Peer reviewed | Direct link
Jonathan Hoseana; Andy Leonardo Louismono; Oriza Stepanus – International Journal of Mathematical Education in Science and Technology, 2025
We describe and evaluate a method to mitigate unwanted student collaborations in assessments, which we recently implemented in a second-year undergraduate mathematics module. The method requires a list of specific pairs of students to be prevented from collaborating, which we constructed based on the results of previous assessments. We converted…
Descriptors: Graphs, Color, College Mathematics, Undergraduate Students
Peer reviewed | Direct link
Jiawei Xiong; George Engelhard; Allan S. Cohen – Measurement: Interdisciplinary Research and Perspectives, 2025
Mixed-format data commonly result from the use of both multiple-choice (MC) and constructed-response (CR) questions on assessments. Dealing with these mixed response types involves understanding what the assessment is measuring and using suitable measurement models to estimate latent abilities. Past research in educational…
Descriptors: Responses, Test Items, Test Format, Grade 8
Peer reviewed | Direct link
Selcuk Acar; Peter Organisciak; Denis Dumas – Journal of Creative Behavior, 2025
In this three-study investigation, we applied various approaches to score drawings created in response to both Form A and Form B of the Torrance Tests of Creative Thinking-Figural (broadly TTCT-F) as well as the Multi-Trial Creative Ideation task (MTCI). We focused on TTCT-F in Study 1, and utilizing a random forest classifier, we achieved 79% and…
Descriptors: Scoring, Computer Assisted Testing, Models, Correlation
Peer reviewed | PDF on ERIC
Guadalupe Elizabeth Morales-Martinez; Ricardo Jesus Villarreal-Lozano; Maria Isolde Hedlefs-Aguilar – International Journal of Emotional Education, 2025
This research study explored the systematic thinking modes underlying test anxiety in 706 engineering students through an experiment centred on the cognitive algebra paradigm. The participants read 36 experimental scenarios, one by one, each narrating an imaginary academic assessment situation, and then judged the level of anxiety they…
Descriptors: Engineering Education, Cognitive Style, College Students, Student Attitudes
Victoria Crisp; Sylvia Vitello; Abdullah Ali Khan; Heather Mahy; Sarah Hughes – Research Matters, 2025
This research set out to enhance our understanding of the exam techniques and types of written annotations or markings that learners may wish to use to support their thinking when taking digital multiple-choice exams. Additionally, we aimed to further explore issues around the factors that contribute to learners writing less rough work and…
Descriptors: Computer Assisted Testing, Test Format, Multiple Choice Tests, Notetaking
Peer reviewed | Direct link
Emma Pritchard-Rowe; Carmen de Lemos; Katie Howard; Jenny Gibson – Autism: The International Journal of Research and Practice, 2025
Play is often included in autism diagnostic assessments. These tend to focus on 'deficits' and non-autistic interpretation of observable behaviours. In contrast, a neurodiversity-affirmative assessment approach involves centring autistic perspectives and focusing on strengths, differences and needs. Accordingly, this study was designed to focus on…
Descriptors: Foreign Countries, Adults, Autism Spectrum Disorders, Play
Peer reviewed | Direct link
Hung Tan Ha; Duyen Thi Bich Nguyen; Tim Stoeckel – Language Assessment Quarterly, 2025
This article compares two methods for detecting local item dependence (LID): residual correlation examination and Rasch testlet modeling (RTM), in a commonly used 3:6 matching format and an extended matching test (EMT) format. The two formats are hypothesized to facilitate different levels of item dependency due to differences in the number of…
Descriptors: Comparative Analysis, Language Tests, Test Items, Item Analysis
Peer reviewed | Direct link
Yusuf Oc; Hela Hassen – Marketing Education Review, 2025
Driven by technological innovations, continuous digital expansion has fundamentally transformed the landscape of modern higher education, leading to discussions about evaluation techniques. The emergence of generative artificial intelligence raises questions about the reliability and academic honesty of multiple-choice assessments in online…
Descriptors: Higher Education, Multiple Choice Tests, Computer Assisted Testing, Electronic Learning
Peer reviewed | Direct link
Susan Ramlo; Carrie Salmon; Yuan Xue – Journal of College Science Teaching, 2025
Research shows that there are multiple benefits to giving college students oral rather than written exams. However, a literature search found no studies that examine, describe, and differentiate how students view their oral exams. The purpose of this study was to use Q methodology [Q] to describe the divergent student views about taking…
Descriptors: Undergraduate Students, Science Instruction, Chemistry, Organic Chemistry
Peer reviewed | PDF on ERIC
Robert N. Prince – Numeracy, 2025
One of the effects of the COVID-19 pandemic was the rapid shift to replacing traditional, paper-based tests with their computer-based counterparts. In many cases, these new modes of delivering tests will remain in place for the foreseeable future. In South Africa, the National Benchmark Quantitative Literacy (QL) test was impelled to make this…
Descriptors: Benchmarking, Numeracy, Multiple Literacies, Paper and Pencil Tests
Joanna Williamson – Research Matters, 2025
Teachers, examiners and assessment experts know from experience that some candidates annotate exam questions. "Annotation" includes anything the candidate writes or draws outside of the designated response space, such as underlining, jotting, circling, sketching and calculating. Annotations are of interest because they may evidence…
Descriptors: Mathematics, Tests, Documentation, Secondary Education
Peer reviewed | Direct link
Meaghan McKenna; Hope Gerde; Nicolette Grasley-Boy – Reading and Writing: An Interdisciplinary Journal, 2025
This article describes the development and administration of the "Kindergarten-Second Grade (K-2) Writing Data-Based Decision Making (DBDM) Survey." The "K-2 Writing DBDM Survey" was developed to learn more about current DBDM practices specific to early writing. A total of 376 educational professionals (175 general education…
Descriptors: Writing Evaluation, Writing Instruction, Preschool Teachers, Kindergarten