Golovachyova, Viktoriya N.; Menlibekova, Gulbakhyt Zh.; Abayeva, Nella F.; Ten, Tatyana L.; Kogaya, Galina D. – International Journal of Environmental and Science Education, 2016
Using computer-based monitoring systems that rely on tests may be the most effective way to evaluate knowledge. The problem of objective knowledge assessment by means of testing takes on a new dimension in the context of new paradigms in education. The analysis of the existing test methods enabled us to conclude that tests with selected…
Descriptors: Expertise, Computer Assisted Testing, Student Evaluation, Knowledge Level
Herman, Joan L.; La Torre, Deborah; Epstein, Scott; Wang, Jia – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2016
This report presents the results of expert panels' item-by-item analysis of the 2015 PISA Reading Literacy and Mathematics Literacy assessments and compares study findings on PISA's representation of deeper learning with that of other related studies. Results indicate that about 11% to 14% of PISA's total raw score value for reading and…
Descriptors: Achievement Tests, International Assessment, Foreign Countries, Secondary School Students
Acuña, Tina Botwright; Kelder, Jo-Anne; Able, Amanda J.; Guisard, Yann; Bellotti, William D.; McDonald, Glenn; Doyle, Richard; Wormell, Paul; Meinke, Holger – Journal of Learning Design, 2014
This paper reports on the perspective of industry stakeholders in a national project to develop a Learning and Teaching Academic Standards (LTAS) Statement for the Agriculture discipline. The AgLTAS Statement will be aligned with the Science LTAS Statement published in 2011 and comprise a discourse on the nature and extent of the Agriculture…
Descriptors: Academic Standards, Agricultural Education, Vocational Aptitude, Benchmarking
Arsad, Norhana; Kamal, Noorfazila; Ayob, Afida; Sarbani, Nizaroyani; Tsuey, Chong Sheau; Misran, Norbahiah; Husain, Hafizah – International Education Studies, 2013
This paper discusses the effectiveness of early evaluation questions used to determine the academic ability of new students in the Department of Electrical, Electronics and Systems Engineering. The questions are knowledge-based, covering what the students learned at the pre-university level. The results show students have…
Descriptors: Foreign Countries, Academic Ability, Engineering Education, Item Response Theory
Wolfe, Edward W.; Myford, Carol M.; Engelhard, George, Jr.; Manalo, Jonathan R. – College Board, 2007
In this study, we investigated a variety of Reader effects that may influence the validity of ratings assigned to AP® English Literature and Composition essays. Specifically, we investigated whether Readers exhibit changes in their levels of severity and accuracy, and their use of individual scale categories over time. We refer to changes in these…
Descriptors: Advanced Placement Programs, Essays, English Literature, Writing (Composition)