Publication Date
| Date range | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 197 |
| Since 2022 (last 5 years) | 1067 |
| Since 2017 (last 10 years) | 2577 |
| Since 2007 (last 20 years) | 4938 |
Audience
| Audience | Count |
| --- | --- |
| Practitioners | 653 |
| Teachers | 563 |
| Researchers | 250 |
| Students | 201 |
| Administrators | 81 |
| Policymakers | 22 |
| Parents | 17 |
| Counselors | 8 |
| Community | 7 |
| Support Staff | 3 |
| Media Staff | 1 |
Location
| Location | Count |
| --- | --- |
| Turkey | 225 |
| Canada | 223 |
| Australia | 155 |
| Germany | 116 |
| United States | 99 |
| China | 90 |
| Florida | 86 |
| Indonesia | 82 |
| Taiwan | 78 |
| United Kingdom | 73 |
| California | 65 |
What Works Clearinghouse Rating
| Rating | Count |
| --- | --- |
| Meets WWC Standards without Reservations | 4 |
| Meets WWC Standards with or without Reservations | 4 |
| Does not meet standards | 1 |
Peer reviewed: Enger, Rolf C. – Journal of College Science Teaching, 1981
Describes the purpose, construction, use, maintenance, and improvement of a basic physics examination file by the U.S. Air Force Academy. Includes nine sample objectives, two multiple-choice items, and an example of computer item analysis. (SK)
Descriptors: College Science, Computer Assisted Testing, Criterion Referenced Tests, Filing
Peer reviewed: Barnett-Foster, Debora; Nagy, Philip – Higher Education, 1996
A study compared response strategies and error patterns of 272 college freshmen on chemistry test items in multiple choice and constructed response formats. Analysis of test data indicated no significant difference in solution strategies used or types of errors committed across test formats. However, interviews with 21 participants revealed…
Descriptors: Chemistry, College Freshmen, Comparative Analysis, Error Patterns
Peer reviewed: Radocy, Rudolf E. – Music Educators Journal, 1989
Identifies the underlying concepts of student evaluation. Offers suggestions for evaluating musical achievement. Maintains that all evaluations are subjective, and suggests techniques for minimizing subjectivity. Considers various test formats, and discusses objectives for both classroom and performance achievement. (RW)
Descriptors: Academic Achievement, Elementary Secondary Education, Evaluation Criteria, Evaluation Problems
Kingsbury, G. Gage; And Others – Technological Horizons in Education, 1988
Explores adaptive testing, which some consider the best way to determine objectively what a student knows. Adaptive testing has existed since the early 1900s, but only with the advent of computers has it been applied effectively to day-to-day educational management. Cites a pilot study in the Portland, Oregon, public schools. (MVL)
Descriptors: Administration, Computer Uses in Education, Diagnostic Teaching, Individual Needs
Peer reviewed: Willis, John A. – Educational Measurement: Issues and Practice, 1990
The Learning Outcome Testing Program of the West Virginia Department of Education is designed to provide public school teachers/administrators with test questions matching learning outcomes. The approach, software selection, results of pilot tests with teachers in 13 sites, and development of test items for item banks are described. (SLD)
Descriptors: Classroom Techniques, Computer Assisted Testing, Computer Managed Instruction, Elementary Secondary Education
Peer reviewed: Dorton, Ian – Economics, 1989
Examines the organization of the extended project that is part of the General Certificate of Education (GCE) A Level Business Studies examination. Provides a timetable for implementing the project. Includes student evaluations of the project. (LS)
Descriptors: Achievement Tests, Business Education, Economics, Economics Education
Peer reviewed: Bresnock, Anne E.; And Others – Journal of Economic Education, 1989
Investigates the effects on multiple choice test performance of altering the order and placement of questions and responses. Shows that changing the response pattern appears to significantly alter the apparent degree of difficulty, and that response patterns become more dissimilar under certain types of response alterations. (LS)
Descriptors: Cheating, Economics Education, Educational Research, Grading
Peer reviewed: Lukhele, Robert; And Others – Journal of Educational Measurement, 1994
Fitting item response models to data from 2 Advanced Placement exams (18,462 and 82,842 students) demonstrates that constructed response items add little to information provided by multiple choice and that scoring on the basis of student item selection gives almost as much information as scoring on the basis of answers. (SLD)
Descriptors: Achievement Tests, Advanced Placement, Chemistry, Constructed Response
Peer reviewed: Carpenter, Patricia A.; And Others – Psychological Review, 1990
Cognitive processes in the Raven Progressive Matrices Test, a nonverbal test of analytic intelligence, are analyzed in terms of processes distinguishing between high- and low-scoring students and processes common to all subjects and test items. Two experiments with 89 college students identify the abilities distinguishing among individuals. (SLD)
Descriptors: Ability, Cognitive Processes, College Students, Computer Simulation
Peer reviewed: Mislevy, Robert J.; And Others – Journal of Educational Measurement, 1992
Concepts behind plausible values in estimating population characteristics from sparse matrix samples of item responses are discussed. The use of marginal analyses is described in the context of the National Assessment of Educational Progress, and the approach is illustrated with Scholastic Aptitude Test data for 9,075 high school seniors. (SLD)
Descriptors: College Entrance Examinations, Educational Assessment, Equations (Mathematics), Estimation (Mathematics)
Peer reviewed: Martinez, Michael E.; Bennett, Randy Elliot – Applied Measurement in Education, 1992
New developments in the use of automatically scorable constructed response item types for large-scale assessment are reviewed for five domains: (1) mathematical reasoning; (2) algebra problem solving; (3) computer science; (4) architecture; and (5) natural language. Ways in which these technologies are likely to shape testing are considered. (SLD)
Descriptors: Algebra, Architecture, Automation, Computer Science
Peer reviewed: Fowler, Robert L.; Clingman, Joy M. – Educational and Psychological Measurement, 1992
Monte Carlo techniques are used to examine the power of the "B" statistic of R. L. Brennan (1972) to detect negatively discriminating items drawn from a variety of nonnormal population distributions. A simplified procedure is offered for conducting an item-discrimination analysis on typical classroom objective tests. (SLD)
Descriptors: Classroom Techniques, Elementary Secondary Education, Equations (Mathematics), Item Analysis
Peer reviewed: Braxton, John M. – Journal of Higher Education, 1993
A study investigated the relationship between undergraduate admissions selectivity at 40 research universities and academic rigor of course examination questions, as determined by the level of understanding required. Results suggest that more selective institutions do not provide more academically rigorous instruction than less selective ones.…
Descriptors: Academic Standards, Admission Criteria, College Admission, Comparative Analysis
Peer reviewed: Kerkman, Dennis D.; And Others – Teaching of Psychology, 1994
Reports on a study of 96 undergraduate developmental psychology students and their performance on student-developed "pop quizzes." Students who participated in writing test items had significantly higher scores than students who did not. Calls for more research into the effectiveness of other student-developed evaluation methods. (CFR)
Descriptors: Academic Achievement, Course Content, Educational Strategies, Higher Education
Peer reviewed: Beller, Michael – Applied Psychological Measurement, 1990
Geometric approaches to representing interrelations among tests and items are compared with an additive tree model (ATM), using 2,644 examinees and 2 other data sets. The ATM's close fit to the data and its coherence of presentation indicate that it is the best means of representing tests and items. (TJH)
Descriptors: College Students, Comparative Analysis, Factor Analysis, Foreign Countries