Publication Date
In 2025 | 42
Since 2024 | 165
Since 2021 (last 5 years) | 588
Since 2016 (last 10 years) | 1225
Since 2006 (last 20 years) | 2731
Audience
Researchers | 169
Practitioners | 49
Teachers | 32
Administrators | 8
Policymakers | 8
Counselors | 4
Students | 4
Media Staff | 1
Location
Turkey | 172
Australia | 81
Canada | 79
China | 70
United States | 55
Germany | 43
Taiwan | 43
Japan | 40
United Kingdom | 38
Iran | 36
Spain | 33
What Works Clearinghouse Rating
Meets WWC Standards without Reservations | 1
Meets WWC Standards with or without Reservations | 1
Does not meet standards | 1

Dudycha, Arthur L.; Carpenter, James B. – Journal of Applied Psychology, 1973
In this study, three structural characteristics--stem format, inclusive versus specific distracters, and stem orientation--were selected for experimental manipulation, while the number of alternatives, the number of correct answers, and the order of items were experimentally controlled. (Author)
Descriptors: Discriminant Analysis, Item Analysis, Multiple Choice Tests, Test Construction

Dreger, Ralph Mason – Educational and Psychological Measurement, 1973
Study refers to J. A. Bowers' "A note on Gaylord's 'Estimating test reliability from the item-test correlations,'" EJ 041 295. (CB)
Descriptors: Correlation, Item Analysis, Mathematical Applications, Statistical Analysis

Miller, Marvin M. – NASSP Bull, 1970
Computers can be used to score tests in order to reduce the workload of the teacher. (CK)
Descriptors: Computers, Item Analysis, Test Interpretation, Test Scoring Machines

Lee, Young B.; And Others – Educational and Psychological Measurement, 1972
Descriptors: Computer Programs, Data Analysis, Item Analysis, Multiple Choice Tests

Kohr, Richard L. – Educational and Psychological Measurement, 1971
Descriptors: Attitude Measures, Computer Programs, Item Analysis, Rating Scales

Webster, William J.; McLeod, Gordon K. – Educ, 1970
A system designed to plan and empirically validate an advanced program for elementary and secondary education is presented. (CK)
Descriptors: Curriculum Design, Educational Objectives, Educational Programs, Item Analysis

Dujovne, Beatriz E.; Levy, Bernard I. – Journal of Clinical Psychology, 1971
Descriptors: Cognitive Processes, Item Analysis, Memory, Rating Scales

Clark, William H.; Margolis, Bruce L. – Educational and Psychological Measurement, 1971
Descriptors: Biographical Inventories, Data Analysis, Item Analysis, Scoring

Fischer, Frederick E. – Journal of Educational Measurement, 1970
The personal biserial index is a correlation which measures the relationship between the difficulty of the items in a test for the person, as evidenced by his passes and failures, and the difficulty of the items as evidenced by group-determined item difficulties. Reliability and predictive validity are studied. (Author/RF)
Descriptors: Guessing (Tests), Item Analysis, Predictive Measurement, Predictor Variables
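
As a rough illustration of the kind of person-level index Fischer describes, the sketch below correlates one examinee's item outcomes with group-determined item difficulties. The function name, the data, and the use of a simple Pearson/point-biserial computation are illustrative assumptions, not Fischer's exact formulation.

```python
import numpy as np

def personal_index(person_responses, group_p_values):
    """Correlate one examinee's item outcomes (1 = pass, 0 = fail) with
    group-level item difficulties (proportion correct per item).
    A Pearson correlation on a 0/1 vector reduces to a point-biserial
    coefficient; this is an illustrative stand-in, not Fischer's exact
    personal biserial formula."""
    x = np.asarray(person_responses, dtype=float)
    p = np.asarray(group_p_values, dtype=float)
    return np.corrcoef(x, p)[0, 1]

# Example: eight items; this examinee tends to pass the easy items
# (high proportion correct) and fail the hard ones.
responses = [1, 1, 1, 0, 1, 0, 0, 0]
p_values = [0.95, 0.90, 0.80, 0.70, 0.60, 0.40, 0.30, 0.15]
print(round(personal_index(responses, p_values), 3))
```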

Menne, John W.; Tolsma, Robert J. – Journal of Educational Measurement, 1971
Descriptors: Discriminant Analysis, Group Testing, Item Analysis, Psychometrics

Begley, Carl E.; And Others – J Clin Psychol, 1970
The results of a questionnaire suggested that therapists and patients do not view therapy within the same frame of reference. (CK)
Descriptors: Attitudes, Cluster Grouping, Item Analysis, Patients

Ingils, Chester R. – Clearing House, 1970
Evaluation of the learner's attainment of educational objectives and of his behavioral development would be a sufficient evaluation of the quality of the teaching he received. (CK)
Descriptors: Administrators, Educational Objectives, Evaluation Criteria, Item Analysis

Rozeboom, William W. – Psychometrika, 1982
Bounds for the multiple correlation of common factors with the items which comprise those factors are developed. It is then shown, under broad but not completely general conditions, when an infinite item domain does or does not perfectly determine selected subsets of its common factors. (Author/JKS)
Descriptors: Factor Analysis, Item Analysis, Multiple Regression Analysis, Test Items
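
For context on the quantity being bounded: under the standard common factor model, the multiple correlation of a factor with the items is the usual factor-score determinacy. A minimal statement is sketched below, assuming unit-variance factors (an assumption made for the sketch, not stated in the abstract); Rozeboom's specific bounds are not reproduced here.

```latex
% Common factor model and the multiple correlation (determinacy) of
% factor j with the items; unit-variance factors assumed for the sketch.
\[
  \mathbf{x} = \boldsymbol{\Lambda}\mathbf{f} + \mathbf{e},
  \qquad
  \boldsymbol{\Sigma} = \boldsymbol{\Lambda}\boldsymbol{\Phi}\boldsymbol{\Lambda}^{\top} + \boldsymbol{\Psi},
  \qquad
  \rho_j^{2} \;=\; (\boldsymbol{\Lambda}\boldsymbol{\Phi})_{\cdot j}^{\top}\,
                   \boldsymbol{\Sigma}^{-1}\,
                   (\boldsymbol{\Lambda}\boldsymbol{\Phi})_{\cdot j}.
\]
```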

Cressie, Noel; Holland, Paul W. – Psychometrika, 1983
The problem of characterizing the manifest probabilities of a latent trait model is considered. The approach taken here differs from the standard approach in that a population of examinees is being considered as opposed to a single examinee. Particular attention is given to the Rasch model. (Author/JKS)
Descriptors: Guessing (Tests), Item Analysis, Latent Trait Theory, Mathematical Models
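
To make the population view concrete, the sketch below computes the manifest probability of a response pattern under the Rasch model by integrating the pattern likelihood over an ability distribution. The N(0, 1) ability distribution, the grid quadrature, and the item parameters are illustrative assumptions, not anything specified in the article.

```python
import numpy as np

def rasch_prob(theta, b):
    """Rasch item response function P(X = 1 | theta, b)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def manifest_pattern_prob(pattern, b, n_nodes=61):
    """Manifest probability of a response pattern: the Rasch likelihood
    averaged over an assumed N(0, 1) ability distribution, approximated
    on a simple quadrature grid."""
    nodes = np.linspace(-4, 4, n_nodes)
    weights = np.exp(-0.5 * nodes**2)
    weights /= weights.sum()                       # discretized N(0, 1)
    p = rasch_prob(nodes[:, None], np.asarray(b))  # shape (nodes, items)
    lik = np.prod(np.where(np.asarray(pattern), p, 1 - p), axis=1)
    return float(np.sum(weights * lik))

# Example: probability of passing items 1 and 2 but failing item 3.
print(round(manifest_pattern_prob([1, 1, 0], b=[-0.5, 0.0, 0.5]), 4))
```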

Melzer, Charles W.; And Others – Educational and Psychological Measurement, 1981
The magnitude of statistical bias for the phi coefficient was investigated, using computer-simulated examinations in which all the students had equal knowledge. Several modifications of phi were tested, but when applied to real examinations, none succeeded in improving its reproducibility when items were re-used on equivalent student groups.…
Descriptors: Correlation, Item Analysis, Mathematical Models, Multiple Choice Tests
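
A minimal sketch of the kind of simulation described, assuming a setup in which every student has the same probability of answering each item correctly; the sample sizes, the median-split criterion, and the item-total pairing are illustrative assumptions, not the authors' design.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi_coefficient(a, b):
    """Phi coefficient between two 0/1 vectors (Pearson r on dichotomies)."""
    return np.corrcoef(a, b)[0, 1]

def simulate_mean_phi(n_students=200, n_items=40, p_correct=0.6, reps=200):
    """Average item-total phi when all students have equal knowledge, so
    any nonzero value reflects chance plus part-whole overlap rather
    than real discrimination."""
    phis = []
    for _ in range(reps):
        scores = (rng.random((n_students, n_items)) < p_correct).astype(int)
        totals = scores.sum(axis=1)
        passed = (totals >= np.median(totals)).astype(int)  # median split
        phis.append(phi_coefficient(scores[:, 0], passed))
    return float(np.mean(phis))

print(round(simulate_mean_phi(), 3))
```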