Publication Date
| Date range | Records |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 89 |
| Since 2022 (last 5 years) | 457 |
| Since 2017 (last 10 years) | 1245 |
| Since 2007 (last 20 years) | 2519 |
Audience
| Audience | Records |
| --- | --- |
| Practitioners | 122 |
| Teachers | 105 |
| Researchers | 64 |
| Students | 46 |
| Administrators | 14 |
| Policymakers | 7 |
| Counselors | 3 |
| Parents | 3 |
Location
| Location | Records |
| --- | --- |
| Canada | 134 |
| Turkey | 131 |
| Australia | 123 |
| Iran | 66 |
| Indonesia | 61 |
| Germany | 51 |
| United Kingdom | 51 |
| Taiwan | 46 |
| United States | 43 |
| China | 39 |
| California | 35 |
What Works Clearinghouse Rating
| Rating | Records |
| --- | --- |
| Meets WWC Standards without Reservations | 3 |
| Meets WWC Standards with or without Reservations | 5 |
| Does not meet standards | 6 |
Watson, James, Jr. – Computing Teacher, 1983
Program specifics, problems, modification, and use of a general quiz program that can be used by teachers with essentially no programming experience are described. A printout of the program is included. Written in BASIC for the TRS-80 I/II, it can be adapted to any computer. (MBR)
Descriptors: Computer Assisted Testing, Computer Programs, Elementary Secondary Education, Microcomputers
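The Watson (1983) record above describes a general-purpose quiz program simple enough for teachers with no programming background; the published listing is in BASIC for the TRS-80. As a rough modern illustration of the same idea (a hypothetical sketch, not the published program), a minimal quiz loop might look like this in Python:

```python
# Minimal sketch of a general-purpose quiz program, loosely in the spirit of
# the BASIC listing described in the Watson (1983) record. The question
# format and scoring below are assumptions, not the published program.

QUESTIONS = [
    # (prompt, choices, index of the correct choice)
    ("2 + 2 = ?", ["3", "4", "5"], 1),
    ("Which company sold the TRS-80?", ["Radio Shack", "IBM", "Apple"], 0),
]

def run_quiz(items):
    score = 0
    for prompt, choices, answer in items:
        print(prompt)
        for i, choice in enumerate(choices, start=1):
            print(f"  {i}. {choice}")
        try:
            reply = int(input("Your answer (number): ")) - 1
        except ValueError:
            reply = -1  # non-numeric input counts as wrong
        if reply == answer:
            print("Correct!")
            score += 1
        else:
            print(f"Wrong; the answer was {choices[answer]}.")
    print(f"Score: {score}/{len(items)}")

if __name__ == "__main__":
    run_quiz(QUESTIONS)
```

A teacher would only edit the QUESTIONS list; the quiz engine itself needs no programming changes, which is the property the original abstract emphasizes.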
Peer reviewed: Olsen, Henry D.; Barickowski, Robert S. – Child Study Journal, 1976
Investigated whether students would perceive items arranged in a hard-medium-easy order as being more difficult than the same items arranged in an easy-medium-hard order and whether the arrangements would influence scores. (MS)
Descriptors: College Students, Educational Testing, Elementary Secondary Education, Higher Education
Fitzpatrick, Anne R. – 2002
This study, one of a series designed to answer practical questions about performance based assessment, examined the comparability of school scores on short, nonparallel test forms. The data were obtained from mathematics tests with both multiple choice (MC) and performance assessment (PA) items. The tests were administered in a statewide testing…
Descriptors: Comparative Analysis, Mathematics Tests, Multiple Choice Tests, Performance Based Assessment
McCowan, Richard J. – Online Submission, 1999
Item writing is a major responsibility of trainers. Too often, qualified staff who prepare lessons carefully and teach conscientiously use inadequate tests that do not validly reflect the true level of trainee achievement. This monograph describes techniques for constructing multiple-choice items that measure student performance accurately. It…
Descriptors: Multiple Choice Tests, Item Analysis, Test Construction, Test Items
Schulz, E. Matthew; Kolen, Michael J.; Nicewander, W. Alan – 1997
This paper compares modified Guttman and item response theory (IRT) based procedures for classifying examinees in ordered levels when each level is represented by several multiple choice test items. In the modified Guttman procedure, within-level number correct scores are mapped to binary level mastery scores. Examinees are then assigned to levels…
Descriptors: Classification, Comparative Analysis, Item Response Theory, Mathematics Tests
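The abstract above describes the modified Guttman procedure only in outline (the record is truncated). As a hedged sketch, assuming that an examinee masters a level when the within-level number-correct score reaches a cutoff and is then assigned the highest consecutively mastered level, the classification step might look like this; the cutoffs and the assignment rule are assumptions, not necessarily the authors' exact algorithm:

```python
# Hypothetical sketch of a modified-Guttman level classification.
# Each entry of responses_by_level holds the 0/1 item scores for one level,
# ordered from easiest to hardest level. The mastery cutoffs and the
# "highest consecutively mastered level" rule are assumed for illustration.

def classify(responses_by_level, cutoffs):
    mastery = [sum(items) >= cut
               for items, cut in zip(responses_by_level, cutoffs)]
    assigned = 0  # 0 means no level mastered
    for level, mastered in enumerate(mastery, start=1):
        if mastered:
            assigned = level
        else:
            break  # Guttman-style: stop at the first non-mastered level
    return assigned

# Example: three levels with four items each and a cutoff of 3 per level.
examinee = [[1, 1, 1, 1], [1, 1, 1, 0], [0, 1, 0, 0]]
print(classify(examinee, cutoffs=[3, 3, 3]))  # -> 2
```

The IRT-based alternative compared in the paper would typically estimate a latent ability and map it to a level, which is why the two procedures can disagree for the same response pattern.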
Herman, William E. – 2000
This paper examined the test-taking experience of college students; the sample consisted of students completing their first exam of the semester. The study was designed to offer research-based guidance on helping students improve their performance on multiple-choice exams. Student perceptions of the number of items…
Descriptors: Academic Achievement, College Students, Higher Education, Improvement
Peer reviewed: Rowley, Glenn L. – Journal of Educational Measurement, 1974
Descriptors: Achievement Tests, Anxiety, Educational Testing, Guessing (Tests)
Peer reviewed: Frisbie, David A. – Journal of Educational Measurement, 1973
The purpose of this study was to gather empirical evidence to compare the reliabilities and concurrent validities of multiple choice and true-false tests that were written to measure understandings and relationships in the same content areas. (Author)
Descriptors: Achievement Tests, Correlation, High School Students, Measurement
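Reliability comparisons like the one in the Frisbie (1973) record above rest on an internal-consistency coefficient for dichotomously scored items; the abstract does not say which coefficient was used, so the KR-20 computation below is only a generic illustration of the quantity being compared:

```python
# KR-20 internal-consistency reliability for 0/1-scored items.
# A generic illustration; the 1973 study may have used a different estimate.

def kr20(score_matrix):
    """score_matrix: one row per examinee, each a list of 0/1 item scores."""
    n_items = len(score_matrix[0])
    n_people = len(score_matrix)
    # Proportion answering each item correctly (p) and incorrectly (q).
    p = [sum(row[j] for row in score_matrix) / n_people for j in range(n_items)]
    q = [1 - pj for pj in p]
    # Population variance of the total scores.
    totals = [sum(row) for row in score_matrix]
    mean_total = sum(totals) / n_people
    var_total = sum((t - mean_total) ** 2 for t in totals) / n_people
    return (n_items / (n_items - 1)) * (
        1 - sum(pj * qj for pj, qj in zip(p, q)) / var_total
    )

scores = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
]
print(round(kr20(scores), 3))  # about 0.838 for this toy data set
```

Running the same computation on matched multiple-choice and true-false forms gives the kind of side-by-side reliability comparison the abstract describes.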
Peer reviewed: Schrand, Heinrich – Zielsprache Englisch, 1973
Descriptors: English (Second Language), Illustrations, Language Instruction, Language Tests
Peer reviewed: Ward, Charles D. – Journal of College Science Teaching, 1973
Descriptors: Academic Achievement, Multiple Choice Tests, Performance Factors, Psychology
Peer reviewed: Schofield, R. – School Science Review, 1973
Discusses the position of examining boards and science teachers in England with respect to student guessing on objective type test items. The results of a questionnaire indicate that most science teachers advise their students to guess when they do not know the correct answer to an item. (JR)
Descriptors: Educational Research, Evaluation, Guessing (Tests), Multiple Choice Tests
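The advice to guess reported in the Schofield (1973) record above is usually defended with a simple expected-value argument; the arithmetic below is the standard textbook reasoning, not an analysis taken from the article. Under number-right scoring a blind guess on a k-option item gains 1/k of a point on average, while under the conventional formula score S = R - W/(k - 1) the expected gain is zero:

```python
# Standard expected-value arithmetic behind "guess when you don't know."
# Generic textbook reasoning, not results from Schofield (1973).

k = 4  # options per multiple-choice item (assumed for illustration)

# Number-right scoring: a blind guess earns 1 point with probability 1/k.
gain_number_right = (1 / k) * 1 + (1 - 1 / k) * 0
print(gain_number_right)  # 0.25 for k = 4: guessing can only help the score

# Formula scoring S = R - W/(k - 1): a wrong guess costs 1/(k - 1) points.
gain_formula = (1 / k) * 1 + (1 - 1 / k) * (-1 / (k - 1))
print(round(gain_formula, 10))  # 0.0: blind guessing neither helps nor hurts
```

With partial knowledge that rules out even one option, the expected gain becomes positive under either scoring rule, which is consistent with the teachers' advice the questionnaire reports.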
Peer reviewed: Sopher, E. – English Language Teaching, 1973
Descriptors: Answer Keys, Comprehension, Grammar, Idioms
Peer reviewed: Buckley-Sharp, M. D.; Harris, F. T. C. – International Journal of Mathematical Education in Science and Technology, 1972
Descriptors: Computer Programs, Educational Technology, Evaluation, Individualized Instruction
Peer reviewed: Baker, Eva L. – Journal of Educational Measurement, 1971
The expected tendencies toward greater homogeneity among items produced under conditions employing behavioral objectives were not found. (Author/AG)
Descriptors: Behavioral Objectives, Correlation, Current Events, Curriculum Evaluation
Peer reviewed: Akeju, S. A. – Journal of Educational Measurement, 1972
This study evaluated the West African Examinations Council's efforts in terms of the extent to which its marking procedures have ensured high reader reliability for the English Language Essay examination, a test designed to measure writing ability. (Author)
Descriptors: Essay Tests, Examiners, Foreign Countries, Multiple Choice Tests


