Publication Date
| Date range | Records |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 75 |
| Since 2022 (last 5 years) | 510 |
| Since 2017 (last 10 years) | 1085 |
| Since 2007 (last 20 years) | 2604 |
Audience
| Audience | Records |
| --- | --- |
| Researchers | 169 |
| Practitioners | 49 |
| Teachers | 32 |
| Administrators | 8 |
| Policymakers | 8 |
| Counselors | 4 |
| Students | 4 |
| Media Staff | 1 |
Location
| Country or region | Records |
| --- | --- |
| Turkey | 174 |
| Australia | 81 |
| Canada | 79 |
| China | 72 |
| United States | 56 |
| Taiwan | 44 |
| Germany | 43 |
| Japan | 41 |
| United Kingdom | 39 |
| Iran | 37 |
| Indonesia | 35 |
What Works Clearinghouse Rating
| Rating | Records |
| --- | --- |
| Meets WWC Standards without Reservations | 1 |
| Meets WWC Standards with or without Reservations | 1 |
| Does not meet standards | 1 |
Peer reviewed: McDonald, Roderick P. – Educational and Psychological Measurement, 1978
It is shown that if a behavior domain can be described by the common factor model with a finite number of factors, the squared correlation between the sum of a selection of items and the domain total score is actually greater than coefficient alpha. (Author/JKS)
Descriptors: Factor Analysis, Item Analysis, Mathematical Models, Measurement
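For the McDonald (1978) entry above, the standard formula for coefficient alpha and the inequality the abstract refers to can be written as follows; the notation is ordinary classical test theory notation, not anything quoted from the article.

```latex
% Coefficient alpha for a k-item composite X = \sum_{i=1}^{k} Y_i
\alpha \;=\; \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)

% The abstract's claim concerns \rho^{2}_{XT}, the squared correlation
% between the composite X and the domain total score T; under a common
% factor model for the domain this quantity is bounded below by alpha:
\rho^{2}_{XT} \;\geq\; \alpha
```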
Peer reviewed: Rozeboom, William W. – Educational and Psychological Measurement, 1978
A strict equivalence presupposed by Kaiser and Michael to derive the coefficient of "domain validity" is defensible only as a biased approximation. But then, it is far from clear what psychometric significance this coefficient has in the first place. (Author)
Descriptors: Criterion Referenced Tests, Item Analysis, Item Banks, Test Validity
Peer reviewed: Shye, Samuel – Multivariate Behavioral Research, 1978
Facet technique is used to analyze contents of achievement motive questionnaire items proposed in recent years and to construct an explicit definition for achievement motive. (Author/JKS)
Descriptors: Academic Achievement, Achievement Need, Item Analysis, Motivation
Peer reviewed: Vegelius, Jan – Educational and Psychological Measurement, 1977
The G index of agreement does not permit the use of unequal weights for its items. The weighted G index described here makes it possible to use unequal weights. An example of the procedure is provided. (Author/JKS)
Descriptors: Correlation, Item Analysis, Multidimensional Scaling, Test Items
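A minimal sketch of a weighted agreement index in the spirit of the Vegelius (1977) entry above; the function name and the specific weighting scheme are illustrative assumptions, not the article's own definition.

```python
import numpy as np

def weighted_g(x, y, w):
    """Weighted agreement index in the spirit of the G index.

    x, y : arrays of dichotomous (0/1) responses for two raters/variables
    w    : nonnegative item weights
    Returns a value in [-1, 1]: +1 = agreement on every item, -1 = none.
    (Illustrative only; Vegelius's weighted G may be defined differently.)
    """
    x, y, w = map(np.asarray, (x, y, w))
    s = np.where(x == y, 1.0, -1.0)   # +1 for agreement, -1 for disagreement
    return float(np.sum(w * s) / np.sum(w))

# Example: unequal weights emphasize the first two items
print(weighted_g([1, 0, 1, 1], [1, 0, 0, 1], w=[2, 2, 1, 1]))  # 0.666...
```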
Peer reviewed: Samejima, Fumiko – Psychometrika, 1977
A new concept of weakly parallel tests, in contrast to strongly parallel tests in latent trait theory, is proposed. Some criticisms of the fundamental concepts in classical test theory, such as the reliability of a test and the standard error of estimation, are given. (Author)
Descriptors: Item Analysis, Latent Trait Theory, Measurement, Test Construction
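As a rough gloss on the Samejima (1977) entry above (my reading, not a quotation from the article): weak parallelism can be stated in terms of the test information function alone.

```latex
% Test information as the sum of item information functions
I(\theta) \;=\; \sum_{j=1}^{n} I_j(\theta)

% Two tests of the same ability are weakly parallel (on this reading)
% when their test information functions coincide,
I^{(1)}(\theta) \;=\; I^{(2)}(\theta) \quad \text{for all } \theta,
% without the item-by-item equivalence that strong parallelism requires.
```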
Peer reviewed: Kestenbaum, Joel M. – Journal of Personality Assessment, 1976
Subjects rated each item in Rotter's I-E Scale for its social desirability value. Social desirability scale values (SDSV) of paired items were compared with one another. Results indicate that paired items are not similar in their SDSV, thus enabling subjects to respond on the basis of social desirability. (Author/DEP)
Descriptors: Item Analysis, Locus of Control, Personality Measures, Social Values
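A minimal sketch of one way to compare the social desirability scale values of paired items, as in the Kestenbaum (1976) entry above; the numbers and the paired t-test are illustrative assumptions, not the study's data or analysis.

```python
import numpy as np
from scipy import stats

# Hypothetical mean social desirability scale values (SDSV) for the two
# members of each forced-choice pair (one row per item pair).
sdsv_internal = np.array([6.1, 5.4, 6.8, 5.9, 6.3])   # e.g., internal-locus options
sdsv_external = np.array([4.2, 4.9, 3.8, 4.5, 4.1])   # e.g., external-locus options

# Paired comparison: if the pairs were matched on social desirability,
# the mean within-pair difference would be near zero.
diff = sdsv_internal - sdsv_external
t, p = stats.ttest_rel(sdsv_internal, sdsv_external)
print(f"mean within-pair SDSV difference = {diff.mean():.2f}, t = {t:.2f}, p = {p:.3f}")
```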
Peer reviewed: Thorne, Frederick C. – Journal of Clinical Psychology, 1977
The Personal Development Study (PDS) is one of the eight subtests of the Integration Level Test Series, which objectively measures different hierarchical levels of factors that organize the integration of psychological states. The PDS consists of a 200-item questionnaire devised to discover whether Freudian mechanisms were operating across a wide…
Descriptors: Factor Analysis, Item Analysis, Measurement Instruments, Psychological Studies
Peer reviewed: Cook, Linda L.; And Others – Journal of Educational Statistics, 1988
First- and second-order factor analyses were conducted, using the LISREL model, on correlation matrices among item parcels of verbal items of the Scholastic Aptitude Test. Focus was on determining whether statistical dependence among item scores can be explained by a single ability dimension. Results suggest future research possibilities related…
Descriptors: Factor Analysis, Item Analysis, Latent Trait Theory, Verbal Tests
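The Cook et al. (1988) entry above asks whether a single ability dimension can account for dependence among item parcels. The sketch below is only a crude eigenvalue screen for a dominant dimension, not the LISREL first- and second-order factor analysis the study reports; the correlation matrix is hypothetical.

```python
import numpy as np

def dominant_dimension_share(R):
    """Fraction of total variance in a correlation matrix R carried by its
    largest eigenvalue -- a crude check on whether one dimension could
    account for the parcel intercorrelations. (Rough screen only; not a
    substitute for a fitted factor model.)"""
    eigvals = np.linalg.eigvalsh(np.asarray(R, dtype=float))  # ascending order
    return float(eigvals[-1] / eigvals.sum())

# Hypothetical correlation matrix among four verbal item parcels
R = np.array([[1.00, 0.62, 0.58, 0.60],
              [0.62, 1.00, 0.55, 0.57],
              [0.58, 0.55, 1.00, 0.53],
              [0.60, 0.57, 0.53, 1.00]])
print(f"{dominant_dimension_share(R):.2f} of total variance on the first eigenvalue")
```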
Peer reviewed: Downey, Ronald G.; Stockdale, Margaret S. – Educational and Psychological Measurement, 1987
Lord's method for detecting subgroup bias at the item level uses a three-parameter item characteristic curve model. A chi square statistic is computed on the multivariate differences between the parameter estimates of item discrimination and difficulty. The LOGIST program and additional programs written in BASIC are used. (Author/GDC)
Descriptors: Computer Software, Item Analysis, Latent Trait Theory, Statistical Bias
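The statistic described in the Downey and Stockdale (1987) entry above is usually written as follows for a single item j, with reference (R) and focal (F) groups and the lower asymptote typically held fixed across groups; this is the standard textbook form rather than anything quoted from the article.

```latex
% Between-group differences in the item's estimated discrimination (a)
% and difficulty (b) parameters
\mathbf{v}_j \;=\; \begin{pmatrix} \hat{a}_{jR} - \hat{a}_{jF} \\[2pt] \hat{b}_{jR} - \hat{b}_{jF} \end{pmatrix},
\qquad
\chi^{2}_{j} \;=\; \mathbf{v}_j^{\top}\,\boldsymbol{\Sigma}_j^{-1}\,\mathbf{v}_j

% \Sigma_j is the sum of the estimated covariance matrices of
% (\hat{a}_j, \hat{b}_j) in the two groups; under the hypothesis of no
% difference, \chi^{2}_{j} is referred to a chi-square distribution with 2 df.
```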
Peer reviewed: Wainer, Howard – Journal of Educational Measurement, 1986
An example demonstrates and explains that summary statistics commonly used to measure test quality can be seriously misleading and that summary statistics for the whole test are not sufficient for judging the quality of the test. (Author/LMO)
Descriptors: Correlation, Item Analysis, Statistical Bias, Statistical Studies
Peer reviewed: Jensema, Carl – Educational and Psychological Measurement, 1976
A simple and economical method for estimating initial parameter values for the normal ogive or logistic latent trait mental test model is outlined. The accuracy of the method in comparison with maximum likelihood estimation is investigated through the use of Monte-Carlo data. (Author)
Descriptors: Guessing (Tests), Item Analysis, Latent Trait Theory, Measurement Techniques
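For the Jensema (1976) entry above, heuristics of this general kind yield starting values from ordinary item statistics; the formulas below are the familiar normal-ogive relations, not a reproduction of his procedure, and any correction for guessing is omitted.

```latex
% p_j    : proportion of examinees answering item j correctly
% \rho_j : biserial correlation of item j with the criterion (e.g., total score)
% \gamma_j = \Phi^{-1}(1 - p_j), the normal deviate matching the item difficulty
a_j \;\approx\; \frac{\rho_j}{\sqrt{1 - \rho_j^{2}}},
\qquad
b_j \;\approx\; \frac{\gamma_j}{\rho_j}
```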
Smith, Everett V., Jr.; Brown, Scott W.; Silver, Bethany B.; Garry, Maryanne; Loftus, Elizabeth – 1998
Beliefs about memories and about the ability to recall memories may affect the individual's recollection of facts and "events." What people believe about memory is investigated using previously analyzed and reported responses to the Beliefs about Memory Survey (BMS). The survey sample (N=1046) was randomly divided for calibration and…
Descriptors: Factor Analysis, Item Analysis, Memory, Metacognition
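A minimal sketch of the random calibration/validation split mentioned in the Smith et al. (1998) entry above; the sample size 1046 comes from the abstract, and everything else (seed, even split) is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(seed=0)   # fixed seed so the split is reproducible

n_respondents = 1046                  # sample size reported in the abstract
indices = rng.permutation(n_respondents)
half = n_respondents // 2

calibration_idx = indices[:half]      # used to calibrate the model
validation_idx = indices[half:]       # held out to check the solution

print(len(calibration_idx), len(validation_idx))   # 523 523
```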
Peer reviewed: Dudycha, Arthur L.; Carpenter, James B. – Journal of Applied Psychology, 1973
In this study, three structural characteristics--stem format, inclusive versus specific distracters, and stem orientation--were selected for experimental manipulation, while the number of alternatives, the number of correct answers, and the order of items were experimentally controlled. (Author)
Descriptors: Discriminant Analysis, Item Analysis, Multiple Choice Tests, Test Construction
Peer reviewed: Dreger, Ralph Mason – Educational and Psychological Measurement, 1973
Study refers to J. A. Bowers' "A Note on Gaylord's 'Estimating Test Reliability from the Item-Test Correlations,'" EJ 041 295. (CB)
Descriptors: Correlation, Item Analysis, Mathematical Applications, Statistical Analysis
Miller, Marvin M. – NASSP Bulletin, 1970
Computers can be used to score tests in order to reduce the workload of the teacher. (CK)
Descriptors: Computers, Item Analysis, Test Interpretation, Test Scoring Machines
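In the spirit of the Miller (1970) note above, a minimal sketch of computer scoring plus a basic item analysis (item difficulty and point-biserial discrimination); the answer key and responses are hypothetical.

```python
import numpy as np

key = np.array(list("BDACB"))                       # hypothetical answer key
responses = np.array([list("BDACB"),                # one row per student
                      list("BDBCA"),
                      list("ADACB"),
                      list("BCABB")])

scored = (responses == key).astype(int)             # 1 = correct, 0 = incorrect
total = scored.sum(axis=1)                          # each student's raw score

p_values = scored.mean(axis=0)                      # item difficulty (proportion correct)
# Point-biserial discrimination: correlation of each item with the total score
discrimination = np.array([np.corrcoef(scored[:, j], total)[0, 1]
                           for j in range(scored.shape[1])])

print("scores:        ", total)
print("p-values:      ", p_values)
print("discrimination:", np.round(discrimination, 2))
```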