Showing 1,531 to 1,545 of 3,126 results
Peer reviewed
Hartley, James; Trueman, Mark – Journal of Research in Reading, 1986
Reports on two studies of the effect of different typographic settings on the speed and accuracy of responses to cloze procedure reading tests. Concludes that in-text responding and dashes produce significantly higher scores. (SRT)
Descriptors: Cloze Procedure, Layout (Publications), Reading Comprehension, Reading Research
Peer reviewed
Shaha, Steven H. – Educational and Psychological Measurement, 1984
It was hypothesized that matching-format tests would reduce test anxiety. Three experiments were conducted in which high school juniors and seniors took parallel matching and multiple-choice tests covering topics of prior knowledge or recently learned information. Results showed that matching tests were superior to multiple-choice formats.…
Descriptors: High Schools, Multiple Choice Tests, Objective Tests, Scores
Peer reviewed
Miller, Samuel D.; Smith, Donald E. P. – Journal of Educational Psychology, 1985
Reading test questions were classified as literal or inferential. The kind of question was controlled to determine the influence of test format on comprehension. Analysis of variance indicated no direct effects attributable to test format or kinds of comprehension. Contentions of deficits in automaticity and attentional focus in poor readers were…
Descriptors: Elementary Education, Oral Reading, Reading Ability, Reading Comprehension
Peer reviewed
Sudman, Seymour; Bradburn, Norman – New Directions for Program Evaluation, 1984
Situations in which mailed questionnaires are most appropriate are identified. Population variables, characteristics of questionnaires, and social desirability variables are examined in depth. (Author)
Descriptors: Attitude Measures, Evaluation Methods, Program Evaluation, Research Methodology
Troyka, Lynn Quitman – Writing Program Administration, 1984
Defends the CUNY-WAT against the charges made by Fishman (CS 731 865). Offers suggestions for those wishing to undertake research into the choice of topics for writing assessment tests. (FL)
Descriptors: Essay Tests, Higher Education, Test Format, Test Items
Peer reviewed
Kempa, R. F.; L'Odiaga, J. – Educational Research, 1984
Examines the extent to which grades derived from a conventional norm-referenced examination can be interpreted in terms of criterion-referenced performance assessments of different abilities and skills. Results suggest that performance is affected more by test format and subject matter than by the intellectual abilities the examinations are intended to test. (JOW)
Descriptors: Criterion Referenced Tests, Norm Referenced Tests, Test Construction, Test Format
Peer reviewed
Plake, Barbara S.; And Others – Educational and Psychological Measurement, 1983
The purpose of this study was to investigate further the effect of differential item performance by males and females on tests which have different item arrangements. The study allows for a more accurate evaluation of whether differential sensitivity to reinforcement strategies is a factor in performance discrepancies for males and females.…
Descriptors: Feedback, Higher Education, Performance Factors, Quantitative Tests
Caldwell, Robert M.; Marcel, Marvin – Training, 1985
Examines Southwestern Bell's Interdepartmental Training Center's program of providing objective evaluations of trainers and the training process. Elements that are discussed include the evaluation format, the form of the evaluation instrument and its emphasis, the validation process, and refinements to the system. (CT)
Descriptors: Evaluation Methods, Guidelines, Teacher Evaluation, Test Construction
Peer reviewed
Staver, John R. – Journal of Research in Science Teaching, 1984
Determined effects of various methods and formats on subjects' (N=253) responses to a Piagetian reasoning problem requiring control of variables. Results indicate that format but not method of task administration influences subjects' performance and that the influence is similar for various combinations of methods and format. (Author/JN)
Descriptors: Biological Sciences, Cognitive Processes, College Science, Comparative Testing
Peer reviewed
Kolstad, Rosemarie K.; And Others – Educational Research Quarterly, 1983
Complex multiple choice (CMC) items are frequently used to test knowledge about repetitive information. In two independent comparisons, performance on the CMC items surpassed that of the multiple true-false clusters. Data indicate that performance on CMC items is inflated, and distractors on CMC items fail to prevent guessing. (Author/PN)
Descriptors: Guessing (Tests), Higher Education, Multiple Choice Tests, Objective Tests
Katz, Irvin R.; Xi, Xiaoming; Kim, Hyun-Joo; Cheng, Peter C. H. – Educational Testing Service, 2004
This research applied a cognitive model to identify item features that lead to irrelevant variance on the Test of Spoken English[TM] (TSE[R]). The TSE is an assessment of English oral proficiency and includes an item that elicits a description of a statistical graph. This item type sometimes appears to tap graph-reading skills--an irrelevant…
Descriptors: Test Format, English, Test Items, Language Proficiency
DeMauro, Gerald E. – 2001
Several analyses of the construct validity of the fourth-grade, eighth-grade, and commencement-level English and Mathematics examinations of New York state were performed. The analyses present construct and differential construct elaboration both across tests and within tests. Results show strong relationships among different question types,…
Descriptors: Ability, Achievement Tests, Construct Validity, Elementary Secondary Education
Meyers, Judith N. – 1997
The test-preparation program in this guide covers all forms of test taking to help students deal with real-world problems like test anxiety and insufficient preparation time. The chapters are: (1) "Finding Out about the Tests You Must Take"; (2) "Making a Study Plan"; (3) "Carrying Out Your Study Plan"; (4) "Learning Strategies"; (5) "Coping with…
Descriptors: Guessing (Tests), Higher Education, Secondary Education, Study Skills
Haladyna, Thomas M. – 1999
This book explains how to write effective multiple-choice test items and how to study item responses to evaluate and improve them, two topics central to the development of many cognitive tests. The chapters are: (1) "Providing a Context for Multiple-Choice Testing"; (2) "Constructed-Response and Multiple-Choice Item Formats"; (3)…
Descriptors: Constructed Response, Multiple Choice Tests, Test Construction, Test Format
Peer reviewed
Silverstein, A. B. – Perceptual and Motor Skills, 1982
Estimates of the validity of random short forms can serve as benchmarks against which to appraise the validity of particular short forms. Formulas are presented for estimating the validity of random short forms and illustrated with Wechsler Adult Intelligence Scale-Revised (WAIS-R) and Minnesota Multiphasic Personality Inventory data. (Author/CM)
Descriptors: Evaluation Methods, Intelligence Tests, Mathematical Formulas, Personality Measures