Sathasivam, Renuka V.; Daniel, Esther G. S. – Asia-Pacific Forum on Science Learning and Teaching, 2016
There is an accumulating research base that supports the effectiveness of formative assessment practices in enhancing the quality of educational outcomes, yet research findings seem to indicate sluggish implementation of these formative assessment strategies in the classrooms. Many factors influence teachers' formative assessment practices…
Descriptors: Science Teachers, Science Instruction, Formative Evaluation, Qualitative Research
Incikabi, Lutfi; Sancar Tokmak, Hatice – Educational Media International, 2012
This case study examined the educational software evaluation processes of pre-service teachers who attended either expertise-based training (XBT) or traditional training in conjunction with a Software-Evaluation checklist. Forty-three mathematics teacher candidates and three experts participated in the study. All participants evaluated educational…
Descriptors: Foreign Countries, Novices, Check Lists, Mathematics Education
Machell, Joan; Saunders, Murray – Aspects of Educational and Training Technology Series, 1992
The MEDA (Methodologie d'Evaluation des Didacticiels pour les Adultes) tool is a generic instrument to evaluate training courseware. It was developed for software designers to improve products, for instructors to select appropriate courseware, and for distributors and consultants to match software to client needs. Describes software evaluation…
Descriptors: Comparative Analysis, Competence, Computer Assisted Instruction, Computer Software Development
MacKnight, Carol B.; Balagopalan, Santosh – Journal of Educational Technology Systems, 1989 (Peer reviewed)
Compares the strengths and weaknesses of four authoring systems that can be used for courseware development: (1) Quest; (2) PCD3; (3) IconAuthor; and (4) Course of Action. Evaluation procedures used to assess their power, ease of use, and productivity aids are explained; menu and icon structures are described; and interactive design implications…
Descriptors: Authoring Aids (Programing), Comparative Analysis, Courseware, Evaluation Methods
Tucker, Richard N. – 1989
The International Council for Educational Media (ICEM) was asked to conduct a study based on reports from a selected number of the ICEM member countries representing different styles of educational organization (from centralized to decentralized), different organizational methods for the production of educational software (from governmental to…
Descriptors: Comparative Analysis, Computer Assisted Instruction, Computer Software, Courseware
Owston, Ronald D.; Dudley-Marling, Curt – Journal of Research on Computing in Education, 1988 (Peer reviewed)
Reviews current educational software evaluation methods, highlights problems, and describes the York Educational Software Evaluation Scales (YESES), an alternative criterion based model. Panel evaluation used by YESES is explained and YESES results are compared with evaluations from the Educational Products Information Exchange (EPIE) to indicate…
Descriptors: Comparative Analysis, Computer Assisted Instruction, Correlation, Courseware
Dudley-Marling, Curt; And Others – Computers in the Schools, 1988 (Peer reviewed)
Discussion of the evaluation of educational software highlights two evaluation methods: (1) the York Educational Software Evaluation Scales (YESES), a criterion-based evaluation, and (2) a field-testing approach developed at York University. Guidelines for the field-testing approach include an emphasis on the observation of students. (22…
Descriptors: Comparative Analysis, Computer Assisted Instruction, Courseware, Elementary Secondary Education
Zahner, Jane E.; And Others – Educational Technology, Research and Development, 1992 (Peer reviewed)
Discusses the evaluation of instructional software, describes two versions of a software evaluation model, and reports the results of two studies that compared evaluative decisions based upon subjective software ratings and data collected using both versions of the evaluation model. Teacher use of student data for decision making is also examined.…
Descriptors: Comparative Analysis, Computer Assisted Instruction, Computer Software Evaluation, Courseware
Guilar, Joshua – Journal of Instruction Delivery Systems, 1994
Describes the development of an instructional technology-based training course for a Hewlett Packard manufacturing plant. The environment, an appropriately trained staff, and a development strategy with measurable goals are discussed; a comparison of conventional training with self-paced courseware is described; and learning effectiveness and…
Descriptors: Comparative Analysis, Computer Assisted Instruction, Conventional Instruction, Courseware
Curlette, William L.; And Others – 1991
The systematic evaluation process used in Georgia's DeKalb County School System to purchase comprehensive instructional software--an integrated learning system (ILS)--is described, and the decision-making model for selection is presented. Selection and implementation of an ILS were part of an instructional technology plan for the DeKalb schools…
Descriptors: Comparative Analysis, Computer Assisted Instruction, Computer Software Evaluation, Computer Software Selection
Smith, Richard Alan – Computing Teacher, 1988
Discusses how to examine and evaluate claims of improved academic performance in advertisements for computer-assisted instruction. Highlights include the proper use of comparison groups; types of statistical analyses; the Hawthorne effect; the interpretation of scores; interpreting graphic presentations; tests of significance; and cost…
Descriptors: Academic Achievement, Achievement Gains, Advertising, Comparative Analysis
Van Reeuwijk, Martin – Computing Teacher, 1994
Discussion of differences in elementary and secondary school mathematics instruction between the United States and The Netherlands focuses on a Dutch project brought to the United States that uses print materials and software products. Topics addressed include integrating technology into the curriculum; cultural differences; designing assessment;…
Descriptors: Comparative Analysis, Computer Assisted Instruction, Courseware, Cultural Differences
Wade, Vincent P.; Lyng, Mary – 2000
This paper proposes an automated, third party World Wide Web (WWW)-based evaluation service (i.e., an online questionnaire) that focuses on usability issues of WWW-based courseware and can be used by any WWW course instructor/student. Requirements related to the usability of educational systems are summarized, including the learning environment,…
Descriptors: Comparative Analysis, Computer Software Evaluation, Computer Uses in Education, Courseware
Byrum, David C. – Journal of Educational Computing Research, 1992 (Peer reviewed)
Discusses formative evaluation of courseware and describes a study of undergraduates that compared the effects of two methods of formative evaluation on the revision of a computer-assisted instruction (CAI) program written in HyperCard. Use of the one-to-one method and the small group method of evaluation are explained, and posttest scores are…
Descriptors: Analysis of Variance, Comparative Analysis, Computer Assisted Instruction, Computer Software Development
Wills, Sandra; McNaught, Carmel – Journal of Computing in Higher Education, 1996 (Peer reviewed)
Discussion of evaluation of computer-based learning (CBL) looks at six different approaches: use of quantitative data for assessment; student and staff attitudes and perceptions of computer use for learning; formative evaluation for CBL design; effectiveness of CBL in comparison to conventional instruction; evidence of whether CBL enables students…
Descriptors: College Instruction, Comparative Analysis, Computer Assisted Instruction, Computer Software Evaluation