Showing all 11 results
Peer reviewed
Martin Steinbach; Carolin Eitemüller; Marc Rodemer; Maik Walpuski – International Journal of Science Education, 2025
The intricate relationship between representational competence and content knowledge in organic chemistry has been widely debated, and the ways in which representations contribute to task difficulty, particularly in assessment, remain unclear. This paper presents a multiple-choice test instrument for assessing individuals' knowledge of fundamental…
Descriptors: Organic Chemistry, Difficulty Level, Multiple Choice Tests, Fundamental Concepts
Peer reviewed
Tobias Lieberei; Leroy Großmann; Virginia Deborah Elaine Welter; Dirk Krüger; Moritz Krell – Research in Science Education, 2025
The use of multiple-choice (MC) instruments to assess pedagogical content knowledge (PCK) has advantages in terms of test economy and objectivity, but it also poses challenges, for example, in terms of adequately capturing the intended construct. To help address these challenges, we developed and evaluated a new instrument to assess science…
Descriptors: Multiple Choice Tests, Pedagogical Content Knowledge, Science Teachers, Logical Thinking
Peer reviewed
Katrin Klingbeil; Fabian Rösken; Bärbel Barzel; Florian Schacht; Kaye Stacey; Vicki Steinle; Daniel Thurm – ZDM: Mathematics Education, 2024
Assessing students' (mis)conceptions is a challenging task for teachers as well as for researchers. While individual assessment, for example through interviews, can provide deep insights into students' thinking, this is very time-consuming and therefore not feasible for whole classes or even larger settings. For those settings, automatically…
Descriptors: Multiple Choice Tests, Formative Evaluation, Mathematics Tests, Misconceptions
Peer reviewed
Papenberg, Martin; Diedenhofen, Birk; Musch, Jochen – Journal of Experimental Education, 2021
Testwiseness may introduce construct-irrelevant variance to multiple-choice test scores. Presenting response options sequentially has been proposed as a potential solution to this problem. In an experimental validation, we determined the psychometric properties of a test based on the sequential presentation of response options. We created a strong…
Descriptors: Test Wiseness, Test Validity, Test Reliability, Multiple Choice Tests
Peer reviewed
Wörner, Salome; Becker, Sebastian; Küchemann, Stefan; Scheiter, Katharina; Kuhn, Jochen – Physical Review Physics Education Research, 2022
Optics is a core field in the curricula of secondary physics education. In this study, we present the development and validation of a test instrument in the field of optics, the ray optics in converging lenses concept inventory (ROC-CI). It was developed for and validated with middle school students, but can also be adapted for use in higher…
Descriptors: Optics, Physics, Science Instruction, Concept Formation
Peer reviewed
Papenberg, Martin; Musch, Jochen – Applied Measurement in Education, 2017
In multiple-choice tests, the quality of distractors may be more important than their number. We therefore examined the joint influence of distractor quality and quantity on test functioning by providing a sample of 5,793 participants with five parallel test sets consisting of items that differed in the number and quality of distractors.…
Descriptors: Multiple Choice Tests, Test Items, Test Validity, Test Reliability
Peer reviewed
Fiedler, Daniela; Tröbst, Steffen; Harms, Ute – CBE - Life Sciences Education, 2017
Students of all ages face severe conceptual difficulties regarding key aspects of evolution -- the central, unifying, and overarching theme in biology. Aspects strongly related to abstract "threshold" concepts like randomness and probability appear to pose particular difficulties. A further problem is the lack of an appropriate instrument…
Descriptors: College Students, Concept Formation, Probability, Evolution
Peer reviewed
Krell, Moritz – Cogent Education, 2017
This study evaluates a 12-item instrument for subjective measurement of mental load (ML) and mental effort (ME) by analysing different sources of validity evidence. The findings of an expert judgement (N = 8) provide "evidence based on test content" that the formulation of the items corresponds to the meaning of ML and ME. An empirical…
Descriptors: Cognitive Processes, Test Validity, Secondary School Students, Multiple Choice Tests
Peer reviewed
Peter, Johannes; Leichner, Nikolas; Mayer, Anne-Kathrin; Krampen, Günter – Psychology Learning and Teaching, 2015
This paper reports the development of a fixed-choice test for the assessment of basic knowledge in psychology, for use with undergraduate as well as graduate students. Test content is selected based on a core concepts approach and includes a sample of concepts which are indexed most frequently in common introductory psychology textbooks. In a…
Descriptors: Tests, Psychology, Knowledge Level, Scores
Peer reviewed
Schwichow, Martin; Christoph, Simon; Boone, William J.; Härtig, Hendrik – International Journal of Science Education, 2016
The so-called control-of-variables strategy (CVS) incorporates the important scientific reasoning skills of designing controlled experiments and interpreting experimental outcomes. As CVS is a prominent component of science standards, appropriate assessment instruments are required to measure these scientific reasoning skills and to evaluate the…
Descriptors: Thinking Skills, Science Instruction, Science Experiments, Science Tests
Peer reviewed
Sparfeldt, Jörn R.; Kimmel, Rumena; Löwenkamp, Lena; Steingräber, Antje; Rost, Detlef H. – Educational Assessment, 2012
Multiple-choice (MC) reading comprehension test items comprise three components: text passage, questions about the text, and MC answers. The construct validity of this format has been repeatedly criticized. In three between-subjects experiments, fourth graders (N1 = 230, N2 = 340, N3 = 194) worked on three…
Descriptors: Test Items, Reading Comprehension, Construct Validity, Grade 4