Showing 1 to 15 of 37 results
Peer reviewed
Mislevy, Robert J. – Journal of Educational Measurement, 2016
Validity is the sine qua non of properties of educational assessment. While a theory of validity and a practical framework for validation have emerged over the past decades, most of the discussion has addressed familiar forms of assessment and psychological framings. Advances in digital technologies and in cognitive and social psychology have…
Descriptors: Test Validity, Technology, Cognitive Psychology, Social Psychology
Peer reviewed
Mislevy, Robert J. – Measurement: Interdisciplinary Research and Perspectives, 2013
Measurement is a semantic frame, a constellation of relationships and concepts that correspond to recurring patterns in human activity, highlighting typical roles, processes, and viewpoints (e.g., the "commercial event") but not others. One uses semantic frames to reason about unique and complex situations--sometimes intuitively, sometimes…
Descriptors: Educational Assessment, Measurement, Feedback (Response), Evidence
Peer reviewed
Mislevy, Robert J. – Teachers College Record, 2014
Background/Context: This article explains the idea of a neopragmatic postmodernist test theory and offers some thoughts about what changing notions concerning the nature of and meanings assigned to knowledge imply for educational assessment, present and future. Purpose: Advances in the learning sciences--particularly situative and sociocognitive…
Descriptors: Test Theory, Postmodernism, Educational Assessment, Educational Trends
Peer reviewed
PDF on ERIC
Mislevy, Robert J.; Behrens, John T.; Dicerbo, Kristen E.; Levy, Roy – Journal of Educational Data Mining, 2012
"Evidence-centered design" (ECD) is a comprehensive framework for describing the conceptual, computational and inferential elements of educational assessment. It emphasizes the importance of articulating inferences one wants to make and the evidence needed to support those inferences. At first blush, ECD and "educational data…
Descriptors: Educational Assessment, Psychometrics, Evidence, Computer Games
Mislevy, Robert J. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2009
From a contemporary perspective on cognition, the between-persons variables in trait-based arguments in educational assessment are absurd over-simplifications. Yet, for a wide range of applications, they work. Rather than seeing such variables as independently existing characteristics of people, we can view them as summaries of patterns in…
Descriptors: Test Validity, Educational Assessment, Item Response Theory, Logical Thinking
Peer reviewed
Mislevy, Robert J. – Research Papers in Education, 2010
An educational assessment embodies an argument from a handful of observations of what students say, do or make in a handful of particular circumstances, to what they know or can do in what kinds of situations more broadly. This article discusses ways in which research into the nature and development of expertise can help assessment designers…
Descriptors: Educational Assessment, Test Construction, Expertise, Research
Behrens, John T.; Mislevy, Robert J.; DiCerbo, Kristen E.; Levy, Roy – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2010
The world in which learning and assessment must take place is rapidly changing. The digital revolution has created a vast space of interconnected information, communication, and interaction. Functioning effectively in this environment requires so-called 21st century skills such as technological fluency, complex problem solving, and the ability to…
Descriptors: Evidence, Student Evaluation, Educational Assessment, Influence of Technology
Mislevy, Robert J.; Behrens, John T.; Bennett, Randy E.; Demark, Sarah F.; Frezzo, Dennis C.; Levy, Roy; Robinson, Daniel H.; Rutstein, Daisy Wise; Shute, Valerie J.; Stanley, Ken; Winters, Fielding I. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2007
People use external knowledge representations (EKRs) to identify, depict, transform, store, share, and archive information. Learning how to work with EKRs is central to becoming proficient in virtually every discipline. As such, EKRs play central roles in curriculum, instruction, and assessment. Five key roles of EKRs in educational assessment are…
Descriptors: Educational Assessment, Computer Networks, Test Construction, Computer Assisted Testing
Mislevy, Robert J.; Steinberg, Linda S.; Almond, Russell G. – 2003
In educational assessment, educators observe what students say, do, or make in a few particular circumstances and attempt to infer what they know, can do, or have accomplished more generally. A web of inference connects the two. Some connections depend on theories and experience concerning the targeted knowledge in the domain, how it is acquired,…
Descriptors: Educational Assessment, Elementary Secondary Education, Inferences, Models
Mislevy, Robert J. – 2003
Educational assessment is reasoning from observations of what students do or make in a handful of particular circumstances, to what they know or can do more broadly. Practice has changed a great deal over the past century, in response to evolving conceptions of knowledge and its acquisition, views of schooling and its purposes, and technologies…
Descriptors: Educational Assessment, Educational Philosophy, Elementary Secondary Education, Student Evaluation
Mislevy, Robert J.; Steinberg, Linda S.; Almond, Russell G. – 1999
Tasks are the most visible element in an educational assessment. Their purpose, however, is to provide evidence about targets of inference that cannot be directly seen at all: what examinees know and can do, more broadly conceived than can be observed in the context of any particular set of tasks. This paper concerns issues in an assessment design…
Descriptors: Educational Assessment, Evaluation Methods, Higher Education, Models
Mislevy, Robert J.; Wilson, Mark R.; Ercikan, Kadriye; Chudowsky, Naomi – 2002
In educational assessment, what students say, do, and sometimes make is observed, and assessors attempt to infer what students know, can do, or have accomplished more generally. Some links in the chain of inference depend on statistical models and probability-based reasoning, and it is with these links that terms such as validity, reliability, and…
Descriptors: Data Analysis, Data Collection, Educational Assessment, Inferences
Mislevy, Robert J.; Steinberg, Linda S.; Almond, Russell G.; Haertel, Geneva D.; Penuel, William R. – 2001
Advances in cognitive psychology deepen the understanding of how students gain and use knowledge. Advances in technology make it possible to capture more complex performances in assessment settings, by including, for example, simulation, interactivity, collaboration, and constructed response. The challenge is in knowing just how to put this new…
Descriptors: Cognitive Psychology, Educational Assessment, Educational Improvement, Educational Technology
Peer reviewed
Mislevy, Robert J.; Steinberg, Linda S.; Breyer, F. Jay; Almond, Russell G.; Johnson, Lynn – Applied Measurement in Education, 2002
Presents a design framework that incorporates integrated structures for modeling knowledge and skills, designing tasks, and extracting and synthesizing evidence. Illustrates these ideas in the context of a project that assesses problem solving in dental hygiene through computer-based simulations. (SLD)
Descriptors: Computer Simulation, Dental Hygienists, Educational Assessment, Evaluation Utilization
Peer reviewed
Mislevy, Robert J. – Journal of Educational Statistics, 1983
The most familiar models of item response theory are defined at the level of individual subjects. It is possible, however, to define such models for groups of subjects. This paper discusses group-level item response models, their uses, and their relationships to subject-level models. (Author/JKS)
Descriptors: Educational Assessment, Estimation (Mathematics), Group Testing, Item Sampling