Publication Date
In 2025 | 26
Since 2024 | 102
Since 2021 (last 5 years) | 359
Author
Dogan, Nuri | 3
Lee, Senyung | 3
O'Grady, Stefan | 3
Acar, Selcuk | 3
Shin, Sun-Young | 3
Al Fraidan, Abdullah | 2
Al-Jarf, Reima | 2
Baral, Sami | 2
Backes, Ben | 2
Botelho, Anthony | 2
Clauser, Brian E. | 2
Publication Type
Reports - Research | 359
Journal Articles | 336
Tests/Questionnaires | 27
Speeches/Meeting Papers | 8
Information Analyses | 6
Collected Works - Serial | 1
Numerical/Quantitative Data | 1
Opinion Papers | 1
Location
Turkey | 27
Germany | 13
United Kingdom | 12
China | 8
Indonesia | 8
Iran | 8
Japan | 8
Saudi Arabia | 7
United Kingdom (England) | 7
Australia | 6
Canada | 5
Laws, Policies, & Programs
Head Start | 1
Tom Benton – Practical Assessment, Research & Evaluation, 2025
This paper proposes an extension of linear equating that may be useful in one of two fairly common assessment scenarios. One is where different students have taken different combinations of test forms. This might occur, for example, where students have some free choice over the exam papers they take within a particular qualification. In this…
Descriptors: Equated Scores, Test Format, Test Items, Computation
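Benton's paper builds on standard linear equating, which maps scores on one test form onto the scale of another by matching means and standard deviations. A minimal sketch of that baseline method follows; the function and variable names are illustrative and not taken from the paper, whose extension handles the more complex choose-your-own-forms scenario.

```python
# Minimal sketch of classical linear (mean-sigma) equating, the baseline
# that the paper extends. Illustrative names; not the paper's own code.
import statistics

def linear_equate(scores_x, scores_y):
    """Return a function mapping Form X scores onto the Form Y scale.

    Matches Form X to Form Y in mean and standard deviation:
        y(x) = mu_y + (sigma_y / sigma_x) * (x - mu_x)
    """
    mu_x, sigma_x = statistics.mean(scores_x), statistics.pstdev(scores_x)
    mu_y, sigma_y = statistics.mean(scores_y), statistics.pstdev(scores_y)
    return lambda x: mu_y + (sigma_y / sigma_x) * (x - mu_x)

# Toy data: Form X is harder (lower mean), so X scores adjust upward.
form_x = [10, 12, 14, 16, 18]
form_y = [14, 16, 18, 20, 22]
equate = linear_equate(form_x, form_y)
print(equate(14))  # the midpoint of X maps to the midpoint of Y: 18.0
```

With equal spreads the mapping reduces to a constant shift; when the spreads differ, the slope term rescales the score distribution as well.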
Meltem Acar Güvendir; Seda Donat Bacioglu; Hasan Özgür; Sefa Uyanik; Fatmagül Gürbüz Akçay; Emre Güvendir – International Journal of Psychology and Educational Studies, 2025
Different types of test items influence students' test anxiety, and physiological measures such as heart rate provide a means of measuring this anxiety. This study aimed to explore the connection between test anxiety and examination item formats. It centered on 20 junior university students in Western Türkiye. The research monitored students'…
Descriptors: Foreign Countries, Test Anxiety, Measurement Techniques, Physiology
Sohee Kim; Ki Lynn Cole – International Journal of Testing, 2025
This study conducted a comprehensive comparison of Item Response Theory (IRT) linking methods applied to a bifactor model, examining their performance on both multiple choice (MC) and mixed format tests within the common item nonequivalent group design framework. Four distinct multidimensional IRT linking approaches were explored, consisting of…
Descriptors: Item Response Theory, Comparative Analysis, Models, Item Analysis
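The four multidimensional linking approaches the study compares are more elaborate, but the family they belong to can be illustrated with the simple unidimensional mean-sigma method, which estimates a linear transformation aligning the common items' difficulty parameters across scales. A hedged sketch, with illustrative names:

```python
# Sketch of mean-sigma IRT linking, a simple member of the linking-method
# family compared in the study. Names are illustrative assumptions.
import statistics

def mean_sigma_constants(b_new, b_old):
    """Estimate linking constants A, B so that A*b_new + B ~ b_old.

    b_new, b_old: difficulty estimates of the common items on the
    new and old scales, respectively.
    """
    A = statistics.pstdev(b_old) / statistics.pstdev(b_new)
    B = statistics.mean(b_old) - A * statistics.mean(b_new)
    return A, B

# Toy common-item difficulties: the old scale is shifted up by 0.5.
b_new = [-1.0, 0.0, 1.0]
b_old = [-0.5, 0.5, 1.5]
A, B = mean_sigma_constants(b_new, b_old)
print(A, B)  # A ≈ 1.0, B ≈ 0.5
```

Once A and B are estimated, ability and item parameters from the new scale are transformed onto the old scale (theta → A·theta + B), making scores from the nonequivalent groups comparable.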
Stefanie A. Wind; Yuan Ge – Measurement: Interdisciplinary Research and Perspectives, 2024
Mixed-format assessments made up of multiple-choice (MC) items and constructed response (CR) items that are scored using rater judgments include unique psychometric considerations. When these item types are combined to estimate examinee achievement, information about the psychometric quality of each component can depend on that of the other. For…
Descriptors: Interrater Reliability, Test Bias, Multiple Choice Tests, Responses
Janet Mee; Ravi Pandian; Justin Wolczynski; Amy Morales; Miguel Paniagua; Polina Harik; Peter Baldwin; Brian E. Clauser – Advances in Health Sciences Education, 2024
Recent advances in automated scoring technology have made it practical to replace multiple-choice questions (MCQs) with short-answer questions (SAQs) in large-scale, high-stakes assessments. However, most previous research comparing these formats has used small examinee samples testing under low-stakes conditions. Additionally, previous studies…
Descriptors: Multiple Choice Tests, High Stakes Tests, Test Format, Test Items
Jonathan Hoseana; Andy Leonardo Louismono; Oriza Stepanus – International Journal of Mathematical Education in Science and Technology, 2025
We describe and evaluate a method to mitigate unwanted student collaborations in assessments, which we recently implemented in a second-year undergraduate mathematics module. The method requires a list of specific pairs of students to be prevented from collaborating, which we constructed based on the results of previous assessments. We converted…
Descriptors: Graphs, Color, College Mathematics, Undergraduate Students
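The descriptors (Graphs, Color) suggest the method treats students as vertices and forbidden pairs as edges, then colours the graph so that no forbidden pair receives the same assessment version. A minimal greedy-colouring sketch of that idea, with illustrative names; the paper's actual construction may differ:

```python
# Hedged sketch: greedy graph colouring to keep flagged student pairs
# on different exam versions. Illustrative, not the paper's own method.
from collections import defaultdict

def assign_versions(students, forbidden_pairs):
    """Greedy graph colouring: returns {student: version_index}."""
    adj = defaultdict(set)
    for a, b in forbidden_pairs:
        adj[a].add(b)
        adj[b].add(a)
    version = {}
    for s in students:
        taken = {version[n] for n in adj[s] if n in version}
        v = 0
        while v in taken:  # smallest version not used by a neighbour
            v += 1
        version[s] = v
    return version

students = ["Ana", "Ben", "Cleo", "Dev"]
pairs = [("Ana", "Ben"), ("Ben", "Cleo")]
versions = assign_versions(students, pairs)
# No forbidden pair ends up with the same version.
assert all(versions[a] != versions[b] for a, b in pairs)
```

Greedy colouring uses at most one more colour than the largest vertex degree, so the number of distinct exam versions needed stays small when few pairs are flagged.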
Monica Casella; Pasquale Dolce; Michela Ponticorvo; Nicola Milano; Davide Marocco – Educational and Psychological Measurement, 2024
Short-form development is an important topic in psychometric research, which requires researchers to face methodological choices at different steps. The statistical techniques traditionally used for shortening tests, which belong to the so-called exploratory model, make assumptions not always verified in psychological data. This article proposes a…
Descriptors: Artificial Intelligence, Test Construction, Test Format, Psychometrics
Li Zhao; Junjie Peng; Shiqi Ke; Kang Lee – Educational Psychology Review, 2024
Unproctored and teacher-proctored exams have been widely used to prevent cheating at many universities worldwide. However, no empirical studies have directly compared their effectiveness in promoting academic integrity in actual exams. To address this significant gap, in four preregistered field studies, we examined the effectiveness of…
Descriptors: Supervision, Tests, Testing, Integrity
Brian E. Clauser; Victoria Yaneva; Peter Baldwin; Le An Ha; Janet Mee – Applied Measurement in Education, 2024
Multiple-choice questions have become ubiquitous in educational measurement because the format allows for efficient and accurate scoring. Nonetheless, there remains continued interest in constructed-response formats. This interest has driven efforts to develop computer-based scoring procedures that can accurately and efficiently score these items.…
Descriptors: Computer Uses in Education, Artificial Intelligence, Scoring, Responses
Jiawei Xiong; George Engelhard; Allan S. Cohen – Measurement: Interdisciplinary Research and Perspectives, 2025
It is common to find mixed-format data resulting from the use of both multiple-choice (MC) and constructed-response (CR) questions on assessments. Dealing with these mixed response types involves understanding what the assessment is measuring, and the use of suitable measurement models to estimate latent abilities. Past research in educational…
Descriptors: Responses, Test Items, Test Format, Grade 8
Selcuk Acar; Peter Organisciak; Denis Dumas – Journal of Creative Behavior, 2025
In this three-study investigation, we applied various approaches to score drawings created in response to both Form A and Form B of the Torrance Tests of Creative Thinking-Figural (broadly TTCT-F) as well as the Multi-Trial Creative Ideation task (MTCI). We focused on TTCT-F in Study 1, and utilizing a random forest classifier, we achieved 79% and…
Descriptors: Scoring, Computer Assisted Testing, Models, Correlation
Guadalupe Elizabeth Morales-Martinez; Ricardo Jesus Villarreal-Lozano; Maria Isolde Hedlefs-Aguilar – International Journal of Emotional Education, 2025
This research study explored the systematic thinking modes underlying test anxiety in 706 engineering students through an experiment centred on the cognitive algebra paradigm. The participants had to read 36 experimental scenarios that narrated an imaginary academic assessment situation one by one and then judge the level of anxiety they…
Descriptors: Engineering Education, Cognitive Style, College Students, Student Attitudes
Jessie Leigh Nielsen; Rikke Vang Christensen; Mads Poulsen – Journal of Research in Reading, 2024
Background: Studies of syntactic comprehension and reading comprehension use a wide range of syntactic comprehension tests that vary considerably in format. The goal of this study was to examine to which extent different formats of syntactic comprehension tests measure the same construct. Methods: Sixty-nine Grade 4 students completed multiple…
Descriptors: Syntax, Reading Comprehension, Comparative Analysis, Reading Tests
Ting Sun; Stella Yun Kim – Educational and Psychological Measurement, 2024
Equating is a statistical procedure used to adjust for the difference in form difficulty such that scores on those forms can be used and interpreted comparably. In practice, however, equating methods are often implemented without considering the extent to which two forms differ in difficulty. The study aims to examine the effect of the magnitude…
Descriptors: Difficulty Level, Data Interpretation, Equated Scores, High School Students
Skylar J. Laursen; Dorina Sluka; Chris M. Fiacconi – Metacognition and Learning, 2024
Previous literature suggests learners can adjust their encoding strategies to match the demands of the expected test format. However, it is unclear whether other forms of metacognitive control, namely, study time allocation and restudy selection, are also sensitive to expected test format. Across four experiments we examined whether learners…
Descriptors: Test Format, Test Wiseness, Metacognition, Study Habits