Peer reviewed: Dillman, Don A.; And Others – New Directions for Program Evaluation, 1984
The empirical effects of adherence to the details of the Total Design Method (TDM) approach to mail survey design are discussed, based on the implementation of a common survey in 11 different states. The results suggest that greater adherence results in higher response rates, especially in the later stages of the TDM. (BW)
Descriptors: Questionnaires, Research Methodology, Research Problems, State Surveys
Peer reviewed: Norris, Stephen P. – New Directions for Program Evaluation, 1983
Construct validation theory is founded upon conflicting metaphysical principles, methodological approaches, and standards of adequacy. This paper explores these unrecognized conflicts and indicates some consequences that these conflicts have for construct validation theory. (Author/PN)
Descriptors: Evaluation Methods, Realism, Research Needs, Research Problems
Peer reviewed: Nilsson, Neil; Hogben, Donald – New Directions for Program Evaluation, 1983
The authors criticize the value-free notion of social science and evaluation. They particularly assail relativists, those who confuse the making of reliable value judgments with how these value judgments are used. (Author/PN)
Descriptors: Evaluation Needs, Meta Evaluation, Research Needs, Research Problems
Peer reviewed: Scriven, Michael – New Directions for Program Evaluation, 1983
The author argues that one of the worst aspects of logical positivism is its attempt to construct a value-free science and to exclude values from the scientific process. (Author/PN)
Descriptors: Evaluation Methods, Evaluation Needs, Measurement Objectives, Research Needs
Peer reviewed: Murray, Charles A. – New Directions for Program Evaluation, 1983
The author argues that the stakeholder approach unavoidably pushes the evaluation toward technical compromises and accommodations that diminish long-term gains in knowledge. (PN)
Descriptors: Case Studies, Educational Researchers, Evaluation Methods, Measurement Techniques
Peer reviewed: Yeaton, William H.; Wortman, Paul M. – New Directions for Program Evaluation, 1984
Solutions are presented to methodological problems in research synthesis of medical technologies, relating to temporal change, persons receiving treatment, research design, analytic procedures, and threats to validity. These solutions should also help with the planning and methodology of research synthesis in other areas. (BS)
Descriptors: Medical Research, Meta Analysis, Patients, Research Design
Peer reviewed: Gold, Norman – New Directions for Program Evaluation, 1983
Stakeholder-based studies modify the relationship between evaluator and user, producing conflict that can cause evaluators to regress to familiar, traditional patterns that leave interactive evaluation strategies only partially implemented. (Author/PN)
Descriptors: Case Studies, Evaluation Methods, Evaluation Utilization, Program Evaluation
Peer reviewed: Conner, Ross F. – New Directions for Program Evaluation, 1980
Is it ethical to select clients at random for a beneficial social service, then deny the benefits to a control group for the sake of science? Participation of control groups in planning, implementation and evaluation of social programs may resolve ethical issues. (Author/CP)
Descriptors: Control Groups, Ethics, Evaluation Methods, Program Evaluation
Peer reviewed: Hedges, Larry V. – New Directions for Program Evaluation, 1984
The adequacy of traditional effect size measures for research synthesis is challenged. Analogues to analysis of variance and multiple regression analysis for effect sizes are presented. The importance of tests for the consistency of effect sizes in interpreting results, and problems in obtaining well-specified models for meta-analysis are…
Descriptors: Analysis of Variance, Effect Size, Mathematical Models, Meta Analysis
Peer reviewed: Weiss, Carol H. – New Directions for Program Evaluation, 1983
Analysis of the assumptions underlying the stakeholder approach to evaluation, combined with the limited experience in testing the approach reported in this volume, suggests that some claims are cogent and others problematic. (Author)
Descriptors: Case Studies, Decision Making, Evaluation Criteria, Evaluation Methods
Peer reviewed: Bryant, Fred B.; Wortman, Paul M. – New Directions for Program Evaluation, 1984
Methods for selecting relevant and appropriate quasi-experimental studies for inclusion in research synthesis using the threats-to-validity approach are presented. Effects of including and excluding studies are evaluated. (BS)
Descriptors: Evaluation Criteria, Meta Analysis, Quasiexperimental Design, Research Methodology
Peer reviewed: Merryfield, Merry M. – New Directions for Program Evaluation, 1985
Interviews with 26 experienced international evaluators are reported to identify problems and possible solutions in cross-cultural evaluations (evaluations of programs or projects in developing countries by persons from the industrialized West). It is concluded that understanding cultural norms and values increases an evaluator's ability to…
Descriptors: Cultural Differences, Developing Nations, Ethnocentrism, Evaluation Methods
Peer reviewed: Cuthbert, Marlene – New Directions for Program Evaluation, 1985
Variations in perceptions of evaluation, research methods, time, social class, and priorities are only some of the reasons why evaluation work in a Third World setting calls evaluators' basic assumptions about people and society into question. Evaluation experiences in the Caribbean are used to illustrate practical problems that may arise.…
Descriptors: Cross Cultural Studies, Cultural Differences, Developing Nations, Evaluation Methods
Peer reviewed: Farrar, Eleanor; House, Ernest R. – New Directions for Program Evaluation, 1983
The evaluators of Push/Excel assumed that it was a systematically developed program with measurable outcomes, not a charismatically inspired movement whose effects would be hard to pin down. As a result, neither the program nor the evaluation approach was adequately tested. (Author)
Descriptors: Case Studies, Evaluation Methods, Program Development, Program Evaluation
Peer reviewed: Caplan, Nathan – New Directions for Program Evaluation, 1980
Ethical issues, institutional characteristics and organizational arrangements, and the roles and skills of the researcher must be considered if the government is to apply social science research findings more widely for the public good. (Available from: Jossey-Bass, Inc., 433 California St., San Francisco, CA 94104, single issue, $6.95.) (Author/GDC)
Descriptors: Government Role, Information Utilization, Program Evaluation, Research Needs