Showing 1 to 15 of 172 results
Peer reviewed
Direct link
E. B. Merki; S. I. Hofer; A. Vaterlaus; A. Lichtenberger – Physical Review Physics Education Research, 2025
When describing motion in physics, the selection of a frame of reference is crucial. The graph of a moving object can look quite different depending on the frame of reference. In recent years, various tests have been developed to assess the interpretation of kinematic graphs, but none of these tests have specifically addressed differences in reference…
Descriptors: Graphs, Motion, Physics, Secondary School Students
Peer reviewed
Direct link
Martin Steinbach; Carolin Eitemüller; Marc Rodemer; Maik Walpuski – International Journal of Science Education, 2025
The intricate relationship between representational competence and content knowledge in organic chemistry has been widely debated, and the ways in which representations contribute to task difficulty, particularly in assessment, remain unclear. This paper presents a multiple-choice test instrument for assessing individuals' knowledge of fundamental…
Descriptors: Organic Chemistry, Difficulty Level, Multiple Choice Tests, Fundamental Concepts
Peer reviewed
Direct link
Grace C. Tetschner; Sachin Nedungadi – Chemistry Education Research and Practice, 2025
Many undergraduate chemistry students hold alternate conceptions related to resonance--an important and fundamental topic of organic chemistry. To help address these alternate conceptions, an organic chemistry instructor could administer the resonance concept inventory (RCI), which is a multiple-choice assessment that was designed to identify…
Descriptors: Scientific Concepts, Concept Formation, Item Response Theory, Scores
Thompson, Kathryn N. – ProQuest LLC, 2023
It is imperative to collect validity evidence prior to interpreting and using test scores. During the process of collecting validity evidence, test developers should consider whether test scores are contaminated by sources of extraneous information. This is referred to as construct irrelevant variance, or the "degree to which test scores are…
Descriptors: Test Wiseness, Test Items, Item Response Theory, Scores
Peer reviewed
PDF on ERIC Download full text
Mehmet Kanik – International Journal of Assessment Tools in Education, 2024
Interest in ChatGPT has surged, prompting people to explore its use for different tasks. However, before it is allowed to replace humans, its capabilities should be investigated. As ChatGPT has potential for use in testing and assessment, this study aims to investigate the questions generated by ChatGPT by comparing them to those written by a course…
Descriptors: Artificial Intelligence, Testing, Multiple Choice Tests, Test Construction
Peer reviewed
PDF on ERIC Download full text
Arandha May Rachmawati; Agus Widyantoro – English Language Teaching Educational Journal, 2025
This study aims to evaluate the quality of English reading comprehension test instruments used in informal learning, particularly as English literacy tests. Using a quantitative approach, the analysis was carried out with the Rasch model in the Quest program on 30 multiple-choice questions administered to 30 grade IX students from informal educational…
Descriptors: Item Response Theory, Reading Tests, Reading Comprehension, English (Second Language)
Peer reviewed
PDF on ERIC Download full text
Acikgul, Kubra; Sad, Suleyman Nihat; Altay, Bilal – International Journal of Assessment Tools in Education, 2023
This study aimed to develop a useful test to measure university students' spatial abilities validly and reliably. Following a sequential explanatory mixed methods research design, first, qualitative methods were used to develop the trial items for the test; next, the psychometric properties of the test were analyzed through quantitative methods…
Descriptors: Spatial Ability, Scores, Multiple Choice Tests, Test Validity
Peer reviewed
Direct link
Julie Dickson; Darren J. Shaw; Andrew Gardiner; Susan Rhind – Anatomical Sciences Education, 2024
Limited research has been conducted on the spatial ability of veterinary students and how this is evaluated within anatomy assessments. This study describes the creation and evaluation of a split design multiple-choice question (MCQ) assessment (totaling 30 questions divided into 15 non-spatial MCQs and 15 spatial MCQs). Two cohorts were tested,…
Descriptors: Anatomy, Spatial Ability, Multiple Choice Tests, Factor Analysis
Peer reviewed
Direct link
Brennan, Robert L.; Kim, Stella Y.; Lee, Won-Chan – Educational and Psychological Measurement, 2022
This article extends multivariate generalizability theory (MGT) to tests with different random-effects designs for each level of a fixed facet. There are numerous situations in which the design of a test and the resulting data structure are not definable by a single design. One example is mixed-format tests that are composed of multiple-choice and…
Descriptors: Multivariate Analysis, Generalizability Theory, Multiple Choice Tests, Test Construction
Peer reviewed
PDF on ERIC Download full text
Anatri Desstya; Ika Candra Sayekti; Muhammad Abduh; Sukartono – Journal of Turkish Science Education, 2025
This study aimed to develop a standardised instrument for diagnosing science misconceptions in primary school children. Following a developmental research approach using the 4-D model (Define, Design, Develop, Disseminate), 100 four-tier multiple choice items were constructed. Content validity was established through expert evaluation by six…
Descriptors: Test Construction, Science Tests, Science Instruction, Diagnostic Tests
Peer reviewed
PDF on ERIC Download full text
Al-zboon, Habis Saad; Alrekebat, Amjad Farhan – International Journal of Higher Education, 2021
This study aims to identify the effect of multiple-choice test items' difficulty level on the reliability coefficient and the standard error of measurement based on item response theory (IRT). To achieve the objectives of the study, WinGen3 software was used to generate the IRT parameters (difficulty, discrimination, guessing) for four…
Descriptors: Multiple Choice Tests, Test Items, Difficulty Level, Error of Measurement
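As context for the entry above, a minimal sketch of the relationship it examines: under the three-parameter logistic (3PL) IRT model, the item parameters (difficulty b, discrimination a, guessing c) determine the test information function, and the conditional standard error of measurement is the inverse square root of that information. The parameter values below are illustrative assumptions, not the parameters the study generated with WinGen3.

```python
# Illustrative sketch: how 3PL item parameters determine the conditional
# standard error of measurement (SEM) of a multiple-choice test.
import numpy as np

rng = np.random.default_rng(0)
n_items = 30
a = rng.lognormal(mean=0.0, sigma=0.3, size=n_items)   # discrimination
b = rng.normal(loc=0.0, scale=1.0, size=n_items)        # difficulty
c = rng.uniform(0.15, 0.25, size=n_items)               # guessing

def p_correct(theta, a, b, c):
    """3PL probability of a correct response at ability theta."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

def test_information(theta, a, b, c):
    """Sum of the 3PL item information functions at ability theta."""
    p = p_correct(theta, a, b, c)
    info = a**2 * ((1.0 - p) / p) * ((p - c) / (1.0 - c))**2
    return info.sum()

for theta in (-2.0, -1.0, 0.0, 1.0, 2.0):
    info = test_information(theta, a, b, c)
    sem = 1.0 / np.sqrt(info)               # conditional SEM at theta
    print(f"theta={theta:+.1f}  information={info:6.2f}  SEM={sem:.3f}")
```

Running the sketch shows the usual pattern the study builds on: information is highest (and SEM lowest) where item difficulties are concentrated, and shifting the difficulty distribution moves that region along the ability scale.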
Peer reviewed
PDF on ERIC Download full text
Büsra Kilinç; Mehmet Diyaddin Yasar – Science Insights Education Frontiers, 2024
This study aimed to develop an achievement test based on the subject acquisitions of the sound and properties unit in the sixth-grade science course. In the test development phase, a literature review was first conducted. Then, 30 multiple-choice questions aligned with the subject acquisitions in the 2018…
Descriptors: Science Tests, Test Construction, Grade 6, Science Instruction
Peer reviewed
Direct link
Eder Hernandez; Esmeralda Campos; Pablo Barniol; Genaro Zavala – Physical Review Physics Education Research, 2025
This study presents the development and validation of a novel multiple-choice test designed to assess university students' conceptual understanding of electric field, force, and flux. The test of understanding of electric field, force, and flux was constructed based on the results of previous studies using a phenomenographic approach to classify…
Descriptors: Physics, Scientific Concepts, Science Tests, Multiple Choice Tests
Peer reviewed
PDF on ERIC Download full text
Güntay Tasçi – Science Insights Education Frontiers, 2024
The present study aimed to develop and validate a protein concept inventory (PCI) consisting of 25 multiple-choice (MC) questions to assess students' understanding of protein, a fundamental concept across different biology disciplines. The development process of the PCI involved a literature review to identify protein-related content,…
Descriptors: Science Instruction, Science Tests, Multiple Choice Tests, Biology
Peer reviewed
Direct link
Wang, Yu; Chiu, Chia-Yi; Köhn, Hans Friedrich – Journal of Educational and Behavioral Statistics, 2023
The multiple-choice (MC) item format has been widely used in educational assessments across diverse content domains. MC items purportedly allow for collecting richer diagnostic information. The effectiveness and economy of administering MC items may have further contributed to their popularity not just in educational assessment. The MC item format…
Descriptors: Multiple Choice Tests, Nonparametric Statistics, Test Format, Educational Assessment