Publication Date
| Date Range | Results |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 5 |
| Since 2007 (last 20 years) | 6 |
Descriptor
| Descriptor | Results |
| --- | --- |
| Credibility | 8 |
| Program Evaluation | 8 |
| Evaluation Methods | 4 |
| Extension Education | 4 |
| Data Collection | 3 |
| Evidence | 3 |
| Capacity Building | 2 |
| Cooperative Education | 2 |
| Decision Making | 2 |
| Needs Assessment | 2 |
| Academic Education | 1 |
Source
| Source | Results |
| --- | --- |
| Journal of Human Sciences &… | 3 |
| American Journal of Evaluation | 1 |
| Educational Evaluation and… | 1 |
| Journal of Applied Testing… | 1 |
| Journal of Extension | 1 |
| PS: Political Science and… | 1 |
Author
| Author | Results |
| --- | --- |
| Allison M. Teeter | 1 |
| Chazdon, Scott | 1 |
| Chelsea Hetherington | 1 |
| Cheryl Eschbach | 1 |
| Cook, James R. | 1 |
| Courtney Cuthbertson | 1 |
| Eugenia P. Gwynn | 1 |
| Felix, Joseph L. | 1 |
| Kenneth R. Jones | 1 |
| Lowry, Robert C. | 1 |
| Marc T. Braverman | 1 |
Publication Type
| Publication Type | Results |
| --- | --- |
| Journal Articles | 8 |
| Reports - Descriptive | 8 |
Location
| Location | Results |
| --- | --- |
| Minnesota | 1 |
| Ohio (Cincinnati) | 1 |
Laws, Policies, & Programs
| Elementary and Secondary… | 1 |
Marc T. Braverman – Journal of Human Sciences & Extension, 2019
This article examines the concept of credible evidence in Extension evaluations with specific attention to the measures and measurement strategies used to collect and create data. Credibility depends on multiple factors, including data quality and methodological rigor, characteristics of the stakeholder audience, stakeholder beliefs about the…
Descriptors: Extension Education, Program Evaluation, Evaluation Methods, Planning
Kenneth R. Jones; Eugenia P. Gwynn; Allison M. Teeter – Journal of Human Sciences & Extension, 2019
This article provides insight into how an adequate approach to selecting methods can establish credible and actionable evidence. The authors offer strategies to effectively support Extension professionals, including program developers and evaluators, in being more deliberate when selecting appropriate qualitative and quantitative methods. In…
Descriptors: Evaluation Methods, Credibility, Evidence, Evaluation Criteria
Chazdon, Scott; Meyer, Nathan; Mohr, Caryn; Troschinetz, Alexis – Journal of Extension, 2017
The public value poster session is a new tool for effectively demonstrating and reporting the public value of Extension programming. Akin to the research posters that have long played a critical role in the sharing of findings from academic studies, the public value poster provides a consistent format for conveying the benefits to society of…
Descriptors: Extension Education, Program Evaluation, Educational Benefits, Capacity Building
Chelsea Hetherington; Cheryl Eschbach; Courtney Cuthbertson – Journal of Human Sciences & Extension, 2019
Evaluation capacity building (ECB) is an essential element for generating credible and actionable evidence on Extension programs. This paper offers a discussion of ECB efforts in Cooperative Extension and how such efforts enable Extension professionals to collect and use credible and actionable evidence on the quality and impacts of programs.…
Descriptors: Cooperative Education, Extension Education, Capacity Building, Program Evaluation
Schuwirth, Lambert W. T.; Van Der Vleuten, Cees P. M. – Journal of Applied Testing Technology, 2019
Programmatic assessment is both a philosophy and a method for assessment. It was developed in medical education in response to the limitations of the dominant testing and measurement approaches and to better align with changes in how medical competence was conceptualised. It is based on continual collection of assessment and feedback…
Descriptors: Program Evaluation, Medical Education, Competency Based Education, Feedback (Response)
Cook, James R. – American Journal of Evaluation, 2015
Program evaluation is generally viewed as a set of mechanisms for collecting and using information to learn about projects, policies and programs, to understand their effects as well as the manner in which they are implemented. AEA has espoused principles for evaluation that place emphasis on competent, honest inquiry that respects the security,…
Descriptors: Program Evaluation, Social Justice, Community Change, Change Strategies
Felix, Joseph L. – Educational Evaluation and Policy Analysis, 1979 (Peer reviewed)
Evaluation procedures and programs in the Cincinnati, Ohio school system, the role of local school evaluators, and the models for school evaluation--based on high, moderate, or low trust--are described. Evaluators serve local schools in formative and summative evaluation projects, in assessing needs, and in meeting them. (MH)
Descriptors: Credibility, Evaluation Methods, Evaluation Needs, Evaluators
Lowry, Robert C.; Silver, Brian D. – PS: Political Science and Politics, 1996 (Peer reviewed)
Asserts that variance between a university's reputation as an institution and its commitment to research has a greater impact on political science department rankings than any internal factors within the department. Includes several tables showing statistical variables of department and university rankings. (MJP)
Descriptors: Academic Education, Achievement Rating, Analysis of Variance, Credibility

