Publication Date
| Period | Results |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 62 |
| Since 2022 (last 5 years) | 388 |
| Since 2017 (last 10 years) | 831 |
| Since 2007 (last 20 years) | 1345 |
Audience
| Audience | Results |
| --- | --- |
| Practitioners | 195 |
| Teachers | 161 |
| Researchers | 93 |
| Administrators | 50 |
| Students | 34 |
| Policymakers | 15 |
| Parents | 12 |
| Counselors | 2 |
| Community | 1 |
| Media Staff | 1 |
| Support Staff | 1 |
Location
| Location | Results |
| --- | --- |
| Canada | 63 |
| Turkey | 59 |
| Germany | 41 |
| United Kingdom | 37 |
| Australia | 36 |
| Japan | 35 |
| China | 33 |
| United States | 32 |
| California | 25 |
| Iran | 25 |
| United Kingdom (England) | 25 |
Hartman, Rhona C.; Redden, Martha Ross – 1985
The fact sheet focuses on considerations when testing adaptations are needed, provides some facts about disability, and identifies a variety of testing-procedure adaptations that have been developed and used successfully in schools, in vocational training programs, and on college campuses. Testing adaptations are discussed in terms of disability…
Descriptors: College Students, Disabilities, Evaluation Methods, Higher Education
Crehan, Kevin; Haladyna, Thomas M. – 1989
This study tested two common multiple-choice item-writing rules. A recent review of research revealed that much of the advice given for writing multiple-choice test items is based on experience and wisdom rather than on empirical research. The rules assessed in this study include: (1) the phrasing of the stem in the form of…
Descriptors: College Students, Higher Education, Multiple Choice Tests, Psychology
Peer reviewed: Trigwell, Keith – Assessment and Evaluation in Higher Education, 1987
Preliminary findings concerning crib card examinations, in which students bring their own notes for use during testing, show that crib card test results were similar to those from traditional examinations, while students' anxiety was reduced and testing time was saved. Claims that the cards may enhance learning were not substantiated. (MSE)
Descriptors: Evaluation Methods, Higher Education, Learning Processes, Notetaking
Peer reviewed: Dawis, Rene V. – Journal of Counseling Psychology, 1987
Discusses design, development, and evaluation of scales used in counseling psychology research. Describes methods of scale construction including the Thurstone, Q-sort, rank-order methods, Likert, semantic differential, Guttman, Rasch, and external criterion methods. Presents ways of evaluating newly developed scales. Discusses measurement versus…
Descriptors: Counseling, Measures (Individuals), Multidimensional Scaling, Psychology
Peer reviewed: Eaves, Ronald C.; Smith, Earl – Journal of Experimental Education, 1986
The effects of examination format and previous experience with microcomputers on the test scores of 96 undergraduate students were investigated. Results indicated no significant differences in the scores obtained on the two types of test administration (microcomputer and traditional paper and pencil). Computer experience was not an important…
Descriptors: College Students, Computer Assisted Testing, Educational Media, Higher Education
Peer reviewed: Lazonby, John N.; And Others – Journal of Chemical Education, 1985
Two tests were administered to 652 students to investigate the effect of structuring questions about the mole. A third test was administered to see whether each step/operation was intrinsically difficult or if it was only difficult when part of a series of steps/operations. Findings are reported and discussed. (JN)
Descriptors: Academic Achievement, Chemistry, Science Education, Science Tests
Dick, Walter – Performance and Instruction, 1986
Addresses the four questions raised by Yelon about the use of pretests with instruction (should I pretest; on what should I pretest; how should I pretest; how should I adjust to results) and answers them from the perspective of instructional designers conducting formative evaluations to improve the quality of instruction under development. (MBR)
Descriptors: Formative Evaluation, Instructional Design, Material Development, Motivation
Peer reviewed: Hartley, James; Trueman, Mark – Journal of Research in Reading, 1986
Reports on two studies of the effect of different typographic settings on the speed and accuracy of responses to cloze procedure reading tests. Concludes that in-text responding and dashes produce significantly higher scores. (SRT)
Descriptors: Cloze Procedure, Layout (Publications), Reading Comprehension, Reading Research
Peer reviewed: Shaha, Steven H. – Educational and Psychological Measurement, 1984
It was hypothesized that matching test formats would reduce test anxiety. Three experiments were conducted in which high school juniors and seniors took parallel matching and multiple-choice tests covering topics of prior knowledge or recently learned information. Results showed that matching tests were superior to multiple-choice formats…
Descriptors: High Schools, Multiple Choice Tests, Objective Tests, Scores
Peer reviewed: Miller, Samuel D.; Smith, Donald E. P. – Journal of Educational Psychology, 1985
Reading test questions were classified as literal or inferential. The kind of question was controlled to determine the influence of test format on comprehension. Analysis of variance indicated no direct effects attributable to test format or kinds of comprehension. Contentions of deficits in automaticity and attentional focus in poor readers were…
Descriptors: Elementary Education, Oral Reading, Reading Ability, Reading Comprehension
Peer reviewed: Sudman, Seymour; Bradburn, Norman – New Directions for Program Evaluation, 1984
Situations in which mailed questionnaires are most appropriate are identified. Population variables, characteristics of questionnaires, and social desirability variables are examined in depth. (Author)
Descriptors: Attitude Measures, Evaluation Methods, Program Evaluation, Research Methodology
Troyka, Lynn Quitman – Writing Program Administration, 1984
Defends the CUNY-WAT against the charges made by Fishman (CS 731 865). Offers suggestions for those wishing to undertake research into the choice of topics for writing assessment tests. (FL)
Descriptors: Essay Tests, Higher Education, Test Format, Test Items
Peer reviewed: Kempa, R. F.; L'Odiaga, J. – Educational Research, 1984
Examines the extent to which grades derived from a conventional norm-referenced examination can be interpreted in terms of criterion-referenced assessments of different abilities and skills. Results suggest that performance is affected more by test format and subject matter than by the intellectual abilities being tested. (JOW)
Descriptors: Criterion Referenced Tests, Norm Referenced Tests, Test Construction, Test Format
Peer reviewed: Plake, Barbara S.; And Others – Educational and Psychological Measurement, 1983
The purpose of this study was to investigate further the effect of differential item performance by males and females on tests which have different item arrangements. The study allows for a more accurate evaluation of whether differential sensitivity to reinforcement strategies is a factor in performance discrepancies for males and females.…
Descriptors: Feedback, Higher Education, Performance Factors, Quantitative Tests
Caldwell, Robert M.; Marcel, Marvin – Training, 1985
Examines Southwestern Bell's Interdepartmental Training Center's program of providing objective evaluations of trainers and the training process. Elements that are discussed include the evaluation format, the form of the evaluation instrument and its emphasis, the validation process, and refinements to the system. (CT)
Descriptors: Evaluation Methods, Guidelines, Teacher Evaluation, Test Construction