Showing 1 to 15 of 20 results
Peer reviewed
Direct link
Tugba Uygun; Pinar Guner; Irfan Simsek – International Journal of Mathematical Education in Science and Technology, 2024
This study was conducted to reveal potential sources of students' difficulty and misconceptions about geometrical concepts with the help of eye tracking. In this study, the students' geometrical misconceptions were explored by answering the questions on the geometry test prepared based on the literature and test-taking processes and represented…
Descriptors: Eye Movements, Geometric Concepts, Mathematics Instruction, Misconceptions
Peer reviewed
Direct link
Dirkx, K. J. H.; Skuballa, I.; Manastirean-Zijlstra, C. S.; Jarodzka, H. – Instructional Science: An International Journal of the Learning Sciences, 2021
The use of computer-based tests (CBTs), for both formative and summative purposes, has greatly increased over the past years. One major advantage of CBTs is the easy integration of multimedia. It is unclear, though, how to design such CBT environments with multimedia. The purpose of the current study was to examine whether guidelines for designing…
Descriptors: Test Construction, Computer Assisted Testing, Multimedia Instruction, Eye Movements
Peer reviewed
PDF on ERIC (full text)
Asquith, Steven – TESL-EJ, 2022
Although an accurate measure of vocabulary size is integral to understanding the proficiency of language learners, the validity of multiple-choice (M/C) vocabulary tests for this purpose has been questioned because test takers can guess correct answers, which inflates scores. In this paper the nature of guessing and partial knowledge used when taking the…
Descriptors: Guessing (Tests), English (Second Language), Second Language Learning, Language Tests
Peer reviewed
Direct link
Smith, Mark; Breakstone, Joel; Wineburg, Sam – Cognition and Instruction, 2019
This article reports a validity study of History Assessments of Thinking (HATs), which are short, constructed-response assessments of historical thinking. In particular, this study focuses on aspects of cognitive validity, which is an examination of whether assessments tap the intended constructs. Think-aloud interviews with 26 high school…
Descriptors: History, History Instruction, Thinking Skills, Multiple Choice Tests
Peer reviewed
Direct link
Lehane, Paula; Scully, Darina; O'Leary, Michael – Irish Educational Studies, 2022
In line with the widespread proliferation of digital technology in everyday life, many countries are now beginning to use computer-based exams (CBEs) in their post-primary education systems. To ensure that these CBEs are delivered in a manner that preserves their fairness, validity, utility and credibility, several factors pertaining to their…
Descriptors: Computer Assisted Testing, Secondary School Students, Culture Fair Tests, Test Validity
Peer reviewed
PDF on ERIC (full text)
Scully, Darina – Practical Assessment, Research & Evaluation, 2017
Across education, certification and licensure, there are repeated calls for the development of assessments that target "higher-order thinking," as opposed to mere recall of facts. A common assumption is that this necessitates the use of constructed response or essay-style test questions; however, empirical evidence suggests that this may…
Descriptors: Test Construction, Test Items, Multiple Choice Tests, Thinking Skills
Peer reviewed
Direct link
Mix, Daniel F.; Tao, Shuqin – AERA Online Paper Repository, 2017
Purposes: This study uses think-alouds and cognitive interviews to provide validity evidence for an online formative assessment--i-Ready Standards Mastery (iSM) mini-assessments--which involves a heavy use of innovative items. iSM mini-assessments are intended to help teachers determine student understanding of each of the on-grade-level Common…
Descriptors: Formative Evaluation, Computer Assisted Testing, Test Validity, Student Evaluation
Peer reviewed
Direct link
Vorstenbosch, Marc A. T. M.; Bouter, Shifra T.; van den Hurk, Marianne M.; Kooloos, Jan G. M.; Bolhuis, Sanneke M.; Laan, Roland F. J. M. – Anatomical Sciences Education, 2014
Assessment is an important aspect of medical education because it tests students' competence and motivates them to study. Various assessment methods, with and without images, are used in the study of anatomy. In this study, we investigated the use of extended matching questions (EMQs). To gain insight into the influence of images on the…
Descriptors: Student Evaluation, Anatomy, Medical Students, Visual Aids
Warner, Zachary B. – ProQuest LLC, 2013
This study compared an expert-based cognitive model of domain mastery with student-based cognitive models of task performance for Integrated Algebra. Interpretations of student test results are limited by experts' hypotheses of how students interact with the items. In reality, the cognitive processes that students use to solve each item may be…
Descriptors: Comparative Analysis, Algebra, Test Results, Measurement
Peer reviewed
Direct link
Bonner, Sarah M.; D'Agostino, Jerome V. – Applied Measurement in Education, 2012
We investigated examinees' cognitive processes while they solved selected items from the Multistate Bar Exam (MBE), a high-stakes professional certification examination. We focused on ascertaining those mental processes most frequently used by examinees, and the most common types of errors in their thinking. We compared the relationships between…
Descriptors: Cognitive Processes, Test Items, Problem Solving, Thinking Skills
Peer reviewed
Direct link
Breakstone, Joel – Theory and Research in Social Education, 2014
This article considers the design process for new formative history assessments. Over the course of 3 years, my colleagues from the Stanford History Education Group and I designed, piloted, and revised dozens of "History Assessments of Thinking" (HATs). As we created HATs, we sought to gather information about their cognitive validity,…
Descriptors: History Instruction, Formative Evaluation, Tests, Correlation
Peer reviewed
Direct link
Bonner, Sarah M. – Journal of Experimental Education, 2013
Although test scores from similar tests in multiple choice and constructed response formats are highly correlated, equivalence in rankings may mask differences in substantive strategy use. The author used an experimental design and participant think-alouds to explore cognitive processes in mathematical problem solving among undergraduate examinees…
Descriptors: Scores, Multiple Choice Tests, Correlation, Protocol Analysis
Peer reviewed
Direct link
Leighton, Jacqueline P.; Cui, Ying; Cor, M. Ken – Applied Measurement in Education, 2009
The objective of the present investigation was to compare the adequacy of two cognitive models for predicting examinee performance on a sample of algebra I and II items from the March 2005 administration of the SAT™. The two models included one generated from verbal reports provided by 21 examinees as they solved the SAT™ items, and the…
Descriptors: Test Items, Inferences, Cognitive Ability, Prediction
O'Shea, Mary B. – ProQuest LLC, 2010
Although much is known about how students perform on standardized tests, little research exists concerning how students think and process while taking such tests. This mixed methods action research study was designed to investigate if a constructivist approach to test preparation could yield improved results for 37 English language arts freshmen…
Descriptors: Test Preparation, Test Items, Statistical Analysis, Grade 9
Gierl, Mark J.; Leighton, Jacqueline P.; Wang, Changjiang; Zhou, Jiawen; Gokiert, Rebecca; Tan, Adele – College Board, 2009
The purpose of this study is to present research validating the four algebra cognitive models in Gierl, Wang, et al., using student response data collected with protocol analysis methods to evaluate the knowledge structures and processing skills used by a sample of SAT test takers.
Descriptors: Algebra, Mathematics Tests, College Entrance Examinations, Student Attitudes