Showing 1 to 15 of 18 results
Peer reviewed
Direct link
Wise, Steven L.; Kingsbury, G. Gage – Journal of Educational Measurement, 2016
This study examined the utility of response time-based analyses in understanding the behavior of unmotivated test takers. For the data from an adaptive achievement test, patterns of observed rapid-guessing behavior and item response accuracy were compared to the behavior expected under several types of models that have been proposed to represent…
Descriptors: Achievement Tests, Student Motivation, Test Wiseness, Adaptive Testing
Peer reviewed
Direct link
Zaidi, Nikki B.; Hwang, Charles; Scott, Sara; Stallard, Stefanie; Purkiss, Joel; Hortsch, Michael – Anatomical Sciences Education, 2017
Bloom's taxonomy was adopted to create a subject-specific scoring tool for histology multiple-choice questions (MCQs). This Bloom's Taxonomy Histology Tool (BTHT) was used to analyze teacher- and student-generated quiz and examination questions from a graduate level histology course. Multiple-choice questions using histological images were…
Descriptors: Taxonomy, Anatomy, Graduate Students, Scoring Formulas
Peer reviewed
PDF on ERIC: Download full text
Jancarík, Antonín; Kostelecká, Yvona – Electronic Journal of e-Learning, 2015
Electronic testing has become a regular part of online courses. Most learning management systems offer a wide range of tools that can be used in electronic tests. With respect to time demands, the most efficient tools are those that allow automatic assessment. This paper focuses on one of these tools: matching questions in which one…
Descriptors: Online Courses, Computer Assisted Testing, Test Items, Scoring Formulas
Peer reviewed
Direct link
Fleet, Wendy – Accounting Education, 2013
As academics we often assume that allocating marks to a task will influence student decision-making when it comes to completing that task. Marks are used by lecturers to indicate the relative importance of each of the criteria used for marking the assessment task and we expect the student to respond to the marks' allocation. This Postcard suggests…
Descriptors: Task Analysis, Decision Making, Evaluation Criteria, Student Attitudes
Green, Bert F., Jr. – 1972
The use of Guttman weights in scoring tests is discussed. Scores of 2,500 men on one subtest of the CEEB SAT-Verbal Test were examined using cross-validated Guttman weights. Several scores were compared, as follows: scores obtained from cross-validated Guttman weights; scores obtained by rounding the Guttman weights to one digit, ranging from 0 to…
Descriptors: Comparative Analysis, Reliability, Scoring Formulas, Test Results
Peer reviewed
Rowley, Glenn L.; Traub, Ross E. – Journal of Educational Measurement, 1977
The consequences of formula scoring versus number right scoring are examined in relation to the assumptions commonly made about the behavior of examinees in testing situations. The choice between the two is shown to depend on whether the goal is reduced error variance or unbiasedness. (Author/JKS)
Descriptors: Error of Measurement, Scoring Formulas, Statistical Bias, Test Wiseness
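For reference alongside the formula-scoring entries in this list (a standard textbook definition, not a result from any single study cited here): with R right answers, W wrong answers, and k response options per item, the number-right score is simply R, while the conventional formula score applies a correction for guessing,

\[ \mathrm{FS} = R - \frac{W}{k-1}, \]

with omitted items counting neither for nor against the examinee. Individual studies in this list may use variants of this rule.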
Love, Gayle A. – 1987
In a review of relevant literature, it is argued that correction for guessing formulas should not be used. It is contended that such formulas correct for guessing that does not really exist in a noticeable amount, penalize those students who have low self-esteem and self-confidence, correct for errors that are not necessarily errors, benefit risk…
Descriptors: Guessing (Tests), Scoring Formulas, Self Esteem, Teacher Made Tests
Peer reviewed
Lord, Frederic M. – Journal of Educational Measurement, 1975
The assumption that examinees either know the answer to a test item or else guess at random is usually totally implausible. A different assumption is outlined, under which formula scoring is found to be clearly superior to number right scoring. (Author)
Descriptors: Guessing (Tests), Multiple Choice Tests, Response Style (Tests), Scoring
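The Lord entry above contrasts the naive assumption that examinees either know an item or guess at random with a more plausible alternative. As a minimal illustrative sketch (not taken from any of the studies listed here; all parameter values are made-up assumptions), the following Python simulation shows how, under that naive model, the number-right score overstates the number of items actually known while the classical formula score R - W/(k-1) estimates it without that inflation:

    # Illustrative sketch only: simulate the "know it or guess at random" model.
    import random

    def simulate_examinee(n_items=50, n_options=4, p_know=0.6, rng=random):
        """Return (rights, wrongs) for one examinee who answers every item,
        guessing at random whenever the item is not known."""
        right = 0
        for _ in range(n_items):
            if rng.random() < p_know:            # item is known: always correct
                right += 1
            elif rng.random() < 1 / n_options:   # lucky guess on an unknown item
                right += 1
        return right, n_items - right

    def formula_score(right, wrong, n_options=4):
        """Classical correction for guessing: R - W/(k-1)."""
        return right - wrong / (n_options - 1)

    if __name__ == "__main__":
        random.seed(1)
        n_items, n_options, p_know, trials = 50, 4, 0.6, 10_000
        nr = fs = 0.0
        for _ in range(trials):
            r, w = simulate_examinee(n_items, n_options, p_know)
            nr += r
            fs += formula_score(r, w, n_options)
        print("items truly known (expected):", n_items * p_know)       # 30
        print("mean number-right score:     ", round(nr / trials, 2))  # about 35
        print("mean formula score:          ", round(fs / trials, 2))  # about 30

With these settings the simulated examinee knows 30 of the 50 items on average; the number-right mean comes out near 35 (inflated by lucky guesses), while the formula-score mean stays near 30.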
Bateman, Leslie Miller – 1977
As an informational aid for students who are planning to take NLN (National League for Nursing) Achievement Tests, the text and accompanying exercises in this module describe NLN testing procedures and fundamental test-taking skills. After introductory material discussing the importance of mastering test-taking skills, the module describes how to…
Descriptors: Achievement Tests, Allied Health Occupations Education, Guessing (Tests), Learning Modules
Peer reviewed
Bliss, Leonard B. – Journal of Educational Measurement, 1980
A mathematics achievement test with instructions to avoid guessing wildly was given to 168 elementary school pupils, who were later asked to complete all the questions using a differently colored pencil. Results showed that examinees, particularly the more able students, tended to omit too many items. (CTM)
Descriptors: Anxiety, Guessing (Tests), Intermediate Grades, Multiple Choice Tests
Plake, Barbara S.; And Others – 1980
Number right and elimination scores were analyzed on a 48-item college level mathematics test that was assembled from pretest data in three forms by varying the item orderings: easy-hard, uniform, or random. Half of the forms contained information explaining the item arrangement and suggesting strategies for taking the test. Several anxiety…
Descriptors: Difficulty Level, Higher Education, Multiple Choice Tests, Quantitative Tests
Peer reviewed
Jacobs, Stanley S. – Journal of Educational Measurement, 1975
Descriptors: Criterion Referenced Tests, Guessing (Tests), Multiple Choice Tests, Response Style (Tests)
Peer reviewed
Cross, Lawrence; Frary, Robert – Journal of Educational Measurement, 1977
Corrected-for-guessing scores on multiple-choice tests depend upon the ability and willingness of examinees to guess when they have some basis for answering, and to avoid guessing when they have no basis. The present study determined the extent to which college students were able and willing to comply with formula-scoring directions. (Author/CTM)
Descriptors: Guessing (Tests), Higher Education, Individual Characteristics, Multiple Choice Tests
Abedi, Jamal; Bruno, James – Journal of Computer-Based Instruction, 1989
Reports the results of several test-reliability experiments that compared a modified confidence-weighted admissible probability measurement (MCW-APM) with conventional forced-choice, binary right-wrong (R-W) test scoring methods. Psychometric properties using G theory and conventional correlational methods are examined, and their implications for…
Descriptors: Ability Grouping, Analysis of Variance, Computer Assisted Testing, Correlation
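Background for the entry above: in admissible probability measurement the examinee reports a probability for each option rather than committing to a single choice, and the response is scored with a proper ("admissible") scoring rule, so that honestly reporting one's actual uncertainty maximizes the expected score. As one standard example of such a rule (an illustration only; the abstract does not specify the exact MCW-APM scoring function used in the study), the quadratic rule awards

\[ S(\mathbf{p}, c) = 2p_c - \sum_{j=1}^{k} p_j^2 \]

for a reported probability vector p over k options when option c turns out to be correct.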
Melican, Gerald; Plake, Barbara S. – 1984
The validity of combining a correction for guessing with the Nedelsky-based cutscore was investigated. A five-option multiple-choice Mathematics Achievement Test was used in the study. Items were selected to meet several criteria. These included: the capability of measuring mathematics concepts related to performance in introductory statistics;…
Descriptors: Cutting Scores, Guessing (Tests), Higher Education, Multiple Choice Tests
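For context on the entry above (a standard description of the Nedelsky procedure, with illustrative notation): judges decide for each item which options a borderline, minimally competent examinee could eliminate as clearly wrong; the item's contribution to the cutscore is the chance probability of answering correctly from the remaining options, and the cutscore is the sum across items,

\[ C = \sum_{i=1}^{n} \frac{1}{k_i - e_i}, \]

where k_i is the number of options on item i and e_i is the number of options judged eliminable. The study above asks whether it is valid to combine such a cutscore with a correction for guessing.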