Publication Date
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 1 |
| Since 2007 (last 20 years) | 7 |
Author
| Baron, Joan Boykoff | 1 |
| Buckendahl, Chad W. | 1 |
| Burket, George | 1 |
| Chen, Li-Sue | 1 |
| Chia, Mike | 1 |
| Craig, Elaine | 1 |
| Dietel, Ronald | 1 |
| Draper, Stephen W. | 1 |
| Dunlap, Joanna | 1 |
| Everson, Howard | 1 |
| Gao, Furong | 1 |
Publication Type
| Reports - Descriptive | 25 |
| Journal Articles | 10 |
| Speeches/Meeting Papers | 3 |
| Guides - Non-Classroom | 1 |
| Numerical/Quantitative Data | 1 |
| Opinion Papers | 1 |
| Reports - Evaluative | 1 |
Education Level
| Higher Education | 4 |
| Elementary Secondary Education | 3 |
| Postsecondary Education | 3 |
| Elementary Education | 1 |
| Grade 12 | 1 |
| Grade 4 | 1 |
| Grade 8 | 1 |
| High Schools | 1 |
| Secondary Education | 1 |
Laws, Policies, & Programs
| No Child Left Behind Act 2001 | 2 |
| Education Consolidation… | 1 |
Assessments and Surveys
| National Assessment of… | 2 |
| North Carolina End of Course… | 1 |
Wind, Stefanie A. – Educational Measurement: Issues and Practice, 2017
Mokken scale analysis (MSA) is a probabilistic-nonparametric approach to item response theory (IRT) that can be used to evaluate fundamental measurement properties with less strict assumptions than parametric IRT models. This instructional module provides an introduction to MSA as a probabilistic-nonparametric framework in which to explore…
Descriptors: Probability, Nonparametric Statistics, Item Response Theory, Scaling
Wothke, Werner; Burket, George; Chen, Li-Sue; Gao, Furong; Shu, Lianghua; Chia, Mike – Journal of Educational and Behavioral Statistics, 2011
It has been known for some time that item response theory (IRT) models may exhibit a likelihood function of a respondent's ability with multiple modes, flat modes, or both. These conditions, often associated with guessing on multiple-choice (MC) questions, can introduce uncertainty and bias to ability estimation by maximum likelihood…
Descriptors: Educational Assessment, Item Response Theory, Computation, Maximum Likelihood Statistics
Wolf, Kenneth; Dunlap, Joanna; Stevens, Ellen – Journal of Effective Teaching, 2012
This article describes ten key assessment practices for advancing student learning that all professors should be familiar with and strategically incorporate in their classrooms and programs. Each practice or concept is explained with examples and guidance for putting it into practice. The ten are: learning outcomes, performance assessments,…
Descriptors: Educational Assessment, Student Evaluation, Educational Practices, Outcomes of Education
National Assessment Governing Board, 2012
As the ongoing national indicator of what American students know and can do, the National Assessment of Educational Progress (NAEP) in Reading regularly collects achievement information on representative samples of students in grades 4, 8, and 12. Through The Nation's Report Card, the NAEP Reading Assessment reports how well students perform in…
Descriptors: Reading Achievement, National Competency Tests, Reading Comprehension, Grade 4
Draper, Stephen W. – British Journal of Educational Technology, 2009
One technology for education whose adoption is currently expanding rapidly in UK higher education is that of electronic voting systems (EVS). As with all educational technology, whether learning benefits are achieved depends not on the technology but on whether an improved teaching method is introduced with it. EVS inherently relies on the…
Descriptors: Educational Technology, Teaching Methods, Higher Education, Foreign Countries
Herman, Joan L.; Osmundson, Ellen; Dietel, Ronald – Assessment and Accountability Comprehensive Center, 2010
This report describes the purposes of benchmark assessments and provides recommendations for selecting and using benchmark assessments--addressing validity, alignment, reliability, fairness, bias and accessibility, instructional sensitivity, utility, and reporting issues. We also present recommendations on building capacity to support schools'…
Descriptors: Multiple Choice Tests, Test Items, Benchmarking, Educational Assessment
Malamitsa, Katerina; Kokkotas, Panagiotis; Kasoutas, Michael – Science Education International, 2008
In contemporary academic literature and in many national curricula, there is widespread acceptance that critical thinking should be an important dimension of education. Teachers and researchers recognize the importance of developing students' critical thinking, but there are still great difficulties in defining and assessing critical-thinking…
Descriptors: Delphi Technique, Reading Comprehension, Foreign Countries, Critical Thinking
Vacc, Nicholas A.; Loesch, Larry C.; Lubik, Ruth E. – 2001
Multiple-choice tests are widely viewed as the most effective and objective means of assessment. Item development is the central component of creating an effective test, but test developers often lack a background in item development. This document describes recall, application, and analysis, the three cognitive levels of test items. It…
Descriptors: Educational Assessment, Evaluation, Item Analysis, Measures (Individuals)
Radwan, Nizam; Rogers, W. Todd – Alberta Journal of Educational Research, 2006
The recent increase in the use of constructed-response items in educational assessment and the dissatisfaction with the nature of the decision that the judges must make using traditional standard-setting methods created a need to develop new and effective standard-setting procedures for tests that include both multiple-choice and…
Descriptors: Criticism, Cutting Scores, Educational Assessment, Standard Setting (Scoring)
Johnson, Matthew S.; Sinharay, Sandip – 2003
For complex educational assessments, there is an increasing use of "item families," which are groups of related items. However, calibration or scoring for such an assessment requires fitting models that take into account the dependence structure inherent among the items that belong to the same item family. C. Glas and W. van der Linden…
Descriptors: Bayesian Statistics, Constructed Response, Educational Assessment, Estimation (Mathematics)
Baron, Joan Boykoff; And Others – 1981
Connecticut's experience with four different standard-setting methods regarding multiple choice proficiency tests is described. The methods include Angoff, Nedelsky, Borderline Group, and Contrasting Groups Methods. All Connecticut ninth graders were administered proficiency tests in reading, language arts, and mathematics. As soon as final test…
Descriptors: Academic Standards, Basic Skills, Comparative Analysis, Cutting Scores
Buckendahl, Chad W.; Plake, Barbara S.; Impara, James C. – 1999
Many school districts are developing assessments that incorporate both selected response and constructed response formats. Scores on these assessments can be used for a variety of purposes ranging from subject remediation to promotion decisions. These policy decisions are informed by recommendations for Minimum Passing Scores (MPSs) from standard…
Descriptors: Academic Standards, Constructed Response, Cutting Scores, Educational Assessment
Education Commission of the States, Denver, CO. National Assessment of Educational Progress. – 1981
This handbook describes the procedures used to develop, administer, and analyze the results of the 1978-79 art assessment of 9-year-olds and 17-year-olds by the National Assessment of Educational Progress (NAEP). The primary purpose of the handbook is to provide detailed procedural information for people interested in replicating the assessment…
Descriptors: Administration, Art Appreciation, Art Education, Art History
Kentucky State Dept. of Education, Frankfort. Div. of Curriculum and Assessment Development. – 1996
This document presents descriptions of the content that has been identified as essential for all Kentucky students to know and will be included on the state assessment, the Kentucky Instructional Results Information System (KIRIS). The core content in this document is designed to be used with, not instead of, the expectations outlined in…
Descriptors: Academic Achievement, Core Curriculum, Educational Assessment, Elementary Secondary Education
Harris, Robert B.; Kerby, William C. – Journal of Economic Education, 1997
Recommends including essay questions on state economics examinations to prevent misclassification of students. Briefly reviews the literature arguing that certain groups of students do poorly on multiple-choice tests. Discusses California's experience with adopting a combined-format test. (MJP)
Descriptors: Academic Standards, Economics, Economics Education, Educational Assessment