Showing 241 to 255 of 554 results
Alker, Henry A.; And Others – Journal of Educational Psychology, 1969
Research supported by U.S. Public Health Service Research Grant No. 1 POL-01762-01. Portions presented at the American Psychological Association meeting (Washington, D.C., September 1967).
Descriptors: Academic Ability, Multiple Choice Tests, Rewards, Student Characteristics
Peer reviewed
Hanna, Gerald S.; Johnson, Fred R. – Journal of Educational Research, 1978
After analyzing four methods of selecting distractor items for multiple-choice tests, the authors recommend that classroom teachers use their own judgment in choosing test items. (Ed.)
Descriptors: Multiple Choice Tests, Teacher Responsibility, Test Construction, Test Items
Peer reviewed
Waters, Carrie Wherry; Waters, Lawrence K. – Educational and Psychological Measurement, 1971
Descriptors: Guessing (Tests), Multiple Choice Tests, Response Style (Tests), Scoring Formulas
Levine, Harold G.; And Others – American Educational Research Journal, 1970
Descriptors: Achievement Tests, Factor Analysis, Measurement Instruments, Medicine
Peer reviewed
Frary, Robert B. – Applied Psychological Measurement, 1980
Six scoring methods for assigning weights to right or wrong responses according to various instructions given to test takers are analyzed with respect to expected chance scores and the effect of various levels of information and misinformation. Three of the methods provide feedback to the test taker. (Author/CTM)
Descriptors: Guessing (Tests), Knowledge Level, Multiple Choice Tests, Scores
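The abstract above does not reproduce the scoring formulas it analyzes, but the contrast can be illustrated with the two most common rules. The following minimal Python sketch (the parameter names and the blind-guessing scenario are illustrative assumptions, not taken from the study) shows why the expected chance score is n/k under number-right scoring but zero under rights-minus-wrongs formula scoring.

    def expected_chance_number_right(n_items, n_options):
        # Each blind guess is correct with probability 1/k, so the expected
        # number-right score is n/k.
        return n_items / n_options

    def expected_chance_formula_score(n_items, n_options):
        # Conventional formula scoring: score = rights - wrongs / (k - 1).
        # Under blind guessing the penalty exactly offsets lucky guesses.
        expected_rights = n_items / n_options
        expected_wrongs = n_items - expected_rights
        return expected_rights - expected_wrongs / (n_options - 1)

    if __name__ == "__main__":
        print(expected_chance_number_right(40, 4))   # 10.0
        print(expected_chance_formula_score(40, 4))  # 0.0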
Marrelli, Anne F. – Performance and Instruction, 1995
Discusses the advantages of multiple-choice questions, highlighting the flexibility afforded by different question variations. Item-writing guidelines include information on content, sensitivity, difficulty, irrelevant sources of difficulty, order, misleads, avoidance of clues, and exercises in applying the guidelines. (JKP)
Descriptors: Distractors (Tests), Guidelines, Multiple Choice Tests, Questioning Techniques
Peer reviewed
Perkins, Kyle; And Others – International Journal of Applied Linguistics, 1991
Discusses a concurrent validity study of an indirect measure of English-as-a-Second-Language writing based on information processing: anagram solving, word reordering, paragraph assembly tests, etc. The criterion measure was a direct, holistic measure of writing. Analysis revealed that the indirect measure did not exhibit concurrent validity for…
Descriptors: English (Second Language), Information Processing, Multiple Choice Tests, Test Validity
Peer reviewed
Feigenbaum, David; Costello, Raymond M. – Journal of Personality Assessment, 1975
This study attempts to show that information about one aspect of test rationale affects performance in a predictable manner. (Author/DEP)
Descriptors: Behavior, College Students, Conditioning, Multiple Choice Tests
Peer reviewed
Fairbrother, R. W. – Educational Research, 1975
Twenty-two teachers were asked to give their opinion as to the ability being tested by each item in two multiple-choice examination papers. (Author)
Descriptors: Academic Ability, Educational Research, Multiple Choice Tests, Student Characteristics
Brittain, Clay V.; Brittain, Mary M. – 1981
This study was concerned specifically with the performance of soldiers on Skill Qualification Tests (SQT) in relation to the readability of the tests. The paper also touches upon the effects of soldiers' motivation and training on SQT performance. Regarding readability, the study was designed to address three questions: does the readability of SQT…
Descriptors: Armed Forces, Military Personnel, Motivation, Multiple Choice Tests
Frary, Robert B. – 1980
Ordinal response modes for multiple-choice tests are those under which the examinee marks one or more choices in an effort to identify the correct choice or to include it in a proper subset of the choices. Two ordinal response modes, answer-until-correct and Coombs' elimination of choices that examinees identify as wrong, were analyzed for scoring…
Descriptors: Guessing (Tests), Multiple Choice Tests, Responses, Scoring
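For readers unfamiliar with the answer-until-correct mode mentioned above, the sketch below shows one conventional way such items are scored: full credit for finding the keyed answer on the first attempt, less for each additional attempt. The specific weights here are an illustrative assumption; the weights analyzed in the paper are not given in the abstract.

    def auc_item_score(attempts_needed, n_options):
        # Assumed weighting: n_options - 1 points for a first-attempt success,
        # one point less per extra attempt, zero if the key is found last.
        return max(n_options - attempts_needed, 0)

    def auc_test_score(attempts_per_item, n_options):
        # Total answer-until-correct score across items.
        return sum(auc_item_score(a, n_options) for a in attempts_per_item)

    if __name__ == "__main__":
        # Four 4-option items answered correctly on attempts 1, 2, 4, and 1.
        print(auc_test_score([1, 2, 4, 1], 4))  # 3 + 2 + 0 + 3 = 8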
Schmeiser, Cynthia Board; Whitney, Douglas R. – 1973
Violations of four selected principles of writing multiple-choice items were introduced into an undergraduate religion course mid-term examination. Three of the flaws significantly increased test difficulty. KR-20 values were lower for all of the tests containing the flawed items than for the "good" versions of the items, but significantly so…
Descriptors: Item Analysis, Multiple Choice Tests, Research Reports, Test Construction
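KR-20, cited in the abstract above, is the standard Kuder-Richardson internal-consistency coefficient for dichotomously scored items. As a reference point, here is a minimal Python sketch of its textbook formula applied to a hypothetical 0/1 response matrix; the data and variable names are invented for illustration, not drawn from the study.

    from statistics import pvariance

    def kr20(responses):
        # responses: one list of 0/1 item scores per examinee.
        n_items = len(responses[0])
        n_examinees = len(responses)
        # Sum of item variances p * (1 - p).
        pq_sum = 0.0
        for i in range(n_items):
            p = sum(row[i] for row in responses) / n_examinees
            pq_sum += p * (1 - p)
        # Variance of examinees' total scores (population variance here).
        total_var = pvariance([sum(row) for row in responses])
        return (n_items / (n_items - 1)) * (1 - pq_sum / total_var)

    if __name__ == "__main__":
        data = [[1, 1, 1, 1],
                [1, 1, 1, 0],
                [1, 1, 0, 0],
                [0, 1, 0, 0],
                [0, 0, 0, 0]]
        print(kr20(data))  # approximately 0.8 for this made-up matrix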
O'Reilly, Robert P.; Streeter, Ronald E. – 1976
The results of a series of factor analyses of a new test of literal comprehension using a multiple-choice cloze format are summarized. These analyses were conducted in the validation of a test designed to measure, for the most part, a factor of literal comprehension independent of IQ and inferential reading processes, yet marked by certain related…
Descriptors: Cloze Procedure, Elementary Education, Factor Analysis, Multiple Choice Tests
Scherich, Henry; Hanna, Gerald – 1976
The reading comprehension items for the Nelson Reading Skills Test, a revision of a widely used standardized reading test, were administered to several hundred fourth- and sixth-grade students in order to determine whether the student's ability to answer correctly actually depended on his comprehension of the accompanying passage. All the…
Descriptors: Elementary Education, Multiple Choice Tests, Reading Comprehension, Reading Tests
Peer reviewed
Owen, Steven V.; Froman, Robin D. – Educational and Psychological Measurement, 1987
To test further for efficacy of three-option achievement items, parallel three- and five-option item tests were distributed randomly to college students. Results showed no differences in mean item difficulty, mean discrimination or total test score, but a substantial reduction in time spent on three-option items. (Author/BS)
Descriptors: Achievement Tests, Higher Education, Multiple Choice Tests, Test Format