Publication Date
| Date Range | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 75 |
| Since 2022 (last 5 years) | 510 |
| Since 2017 (last 10 years) | 1085 |
| Since 2007 (last 20 years) | 2604 |
Audience
| Audience | Count |
| --- | --- |
| Researchers | 169 |
| Practitioners | 49 |
| Teachers | 32 |
| Administrators | 8 |
| Policymakers | 8 |
| Counselors | 4 |
| Students | 4 |
| Media Staff | 1 |
Location
| Location | Count |
| --- | --- |
| Turkey | 174 |
| Australia | 81 |
| Canada | 79 |
| China | 72 |
| United States | 56 |
| Taiwan | 44 |
| Germany | 43 |
| Japan | 41 |
| United Kingdom | 39 |
| Iran | 37 |
| Indonesia | 35 |
What Works Clearinghouse Rating
| Rating | Count |
| --- | --- |
| Meets WWC Standards without Reservations | 1 |
| Meets WWC Standards with or without Reservations | 1 |
| Does not meet standards | 1 |
Van Petegem, Peter; Deneire, Alexia; De Maeyer, Sven – Studies in Educational Evaluation, 2008
This paper describes the validation of a self-evaluation instrument for teachers in secondary education to solicit feedback from their pupils regarding specific aspects of the teacher's practice in class. This 92-item instrument--Teachers Learn from Pupils-Secondary Education (TLP-SE)--assesses 10 relevant classroom environment dimensions:…
Descriptors: Feedback (Response), Foreign Countries, Psychometrics, Classroom Environment
Phelan, Julia; Choi, Kilchan; Vendlinski, Terry; Baker, Eva L.; Herman, Joan L. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2009
This report describes results from field-testing of POWERSOURCE[C] formative assessment alongside professional development and instructional resources. The researchers at the National Center for Research on Evaluation, Standards, and Student Testing (CRESST) employed a randomized, controlled design to address the following question: Does the…
Descriptors: Mathematical Concepts, Middle School Students, Grade 6, Educational Strategies
Gierl, Mark J.; Leighton, Jacqueline P.; Wang, Changjiang; Zhou, Jiawen; Gokiert, Rebecca; Tan, Adele – College Board, 2009
The purpose of the study is to present research focused on validating the four algebra cognitive models in Gierl, Wang, et al., using student response data collected with protocol analysis methods to evaluate the knowledge structures and processing skills used by a sample of SAT test takers.
Descriptors: Algebra, Mathematics Tests, College Entrance Examinations, Student Attitudes
Peer reviewed: Hofmann, Richard J. – Educational and Psychological Measurement, 1975
A new item analysis index, e, is derived as a function of difficulty and discrimination to represent item efficiency. (Author/RC)
Descriptors: Item Analysis, Probability, Response Style (Tests), Statistical Analysis
Peer reviewed: Krus, David J.; Bart, William M. – Educational and Psychological Measurement, 1974
Descriptors: Item Analysis, Matrices, Multidimensional Scaling, Response Style (Tests)
Peer reviewed: Sainty, Geoffrey E. – Journal of Vocational Behavior, 1974
An empirical validation of the 114 Worker Trait Groups of the Dictionary of Occupational Titles was performed by comparing the factor structure of the worker trait components of the 114 WTG's with the factor structure of a random sample of 800 of the 4000 jobs used as the basis for DOT. (Author)
Descriptors: Employment, Item Analysis, Occupations, Test Theory
Dunn-Rankin, Peter; King, F. J. – Educational and Psychological Measurement, 1969
Descriptors: Evaluation Methods, Item Analysis, Scaling, Statistical Significance
Irwin, Tom J. – 1968
The program briefly described in this paper represents an attempt to have the computer provide the counselor with a descriptive, item interpretation of the Edwards Personal Preference Schedule (EPPS). The rationale of the item analysis approach to a descriptive interpretation is that each of the 135 statements (nine for each of the 15 EPPS scales)…
Descriptors: Career Counseling, Computers, Counseling, Item Analysis
Sells, S.B.; And Others – 1968
The present investigation involves 600 personality questionnaire items. The 300 Guilford items comprise 78 marker clusters for 15 Guilford factors; the 300 Cattell items represent marker items for 17 Cattell factors. The study involved two major analyses. In the first, the 600 x 600 matrix was factor analyzed by the Principal Factor Method,…
Descriptors: Adults, Factor Analysis, Item Analysis, Personality Assessment
Blumenfeld, Warren S.; And Others – San Diego Convention Abstracts, 1973
This is a psychometric hoax paper, the purpose of which is to indicate once again the importance of cross-validation, particularly in the development of specially-keyed inventories. The junior author and the new psychometric method play critical roles in the study. Appropriate credit and references are present. (Author)
Descriptors: Answer Keys, Item Analysis, Psychometrics, Scoring
Forster, Fred – 1976
Various factors which influence the relationship between the Rasch item characteristic curve and the actual performance of an item are identified. The Rasch item characteristic curve is a new concept in test design and analysis. The Rasch test model provides information concerning the percent of students with a specified achievement level who…
Descriptors: Goodness of Fit, Item Analysis, Mathematical Models, Probability
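The Rasch item characteristic curve that Forster's paper examines gives the probability that a student of a given ability answers an item of a given difficulty correctly. A minimal sketch of the standard Rasch formula (general background, not Forster's specific analysis):

```python
import math

def rasch_prob(theta: float, b: float) -> float:
    """Rasch model: probability of a correct response for an
    examinee with ability `theta` on an item with difficulty `b`."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals item difficulty, the curve passes through 0.5;
# comparing this predicted curve to the observed percent correct at
# each ability level is the goodness-of-fit check the abstract describes.
```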
Peer reviewed: Colonius, Hans – Psychometrika, 1977
Parameter estimation for Keats generalization of the Rasch model that takes account of guessing behavior is investigated. It is shown that no minimal sufficient statistics for the ability parameters independent of the difficulty parameters exist. (Author/JKS)
Descriptors: Guessing (Tests), Item Analysis, Test Construction, Test Reliability
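The guessing behavior Colonius's abstract refers to is typically modeled by adding a lower asymptote to the Rasch curve. A hedged illustration of that general form (the parameterization below is illustrative and not necessarily Keats' exact model):

```python
import math

def guessing_rasch_prob(theta: float, b: float, c: float) -> float:
    """Correct-response probability with a guessing floor `c`:
    even very low-ability examinees succeed with probability c."""
    return c + (1.0 - c) / (1.0 + math.exp(-(theta - b)))

# With c = 0 this reduces to the plain Rasch model; with c > 0 the
# ability and difficulty parameters no longer separate cleanly, which
# is why no minimal sufficient statistics exist in the result above.
```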
Peer reviewed: Callender, John C.; Osburn, H. G. – Educational and Psychological Measurement, 1977
A FORTRAN program for maximizing and cross-validating split-half reliability coefficients is described. Externally computed arrays of item means and covariances are used as input for each of two samples. The user may select a number of subsets from the complete set of items for analysis in a single run. (Author/JKS)
Descriptors: Computer Programs, Item Analysis, Test Reliability, Test Validity
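The split-half reliability coefficient that Callender and Osburn's program maximizes is conventionally computed by correlating the two half-test scores and applying the Spearman-Brown correction. A minimal Python sketch of that computation (their FORTRAN program's item-subset selection logic is not reproduced here):

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation between two score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def split_half_reliability(half1_scores, half2_scores):
    """Correlate the two half-test totals, then apply the
    Spearman-Brown correction to estimate full-test reliability."""
    r = pearson_r(half1_scores, half2_scores)
    return 2 * r / (1 + r)
```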
Peer reviewed: Cooper, Merri-Ann; Fiske, Donald W. – Educational and Psychological Measurement, 1976
Construct validity patterns of test-criteria and item-criteria correlations are shown to be inconsistent across samples. The results of an investigation of construct validity patterns on two published personality scales are presented. (JKS)
Descriptors: Correlation, Item Analysis, Personality Measures, Reliability
French, Christine L. – 2001
Item analysis is a very important consideration in the test development process. It is a statistical procedure to analyze test items that combines methods used to evaluate the important characteristics of test items, such as difficulty, discrimination, and distractibility of the items in a test. This paper reviews some of the classical methods for…
Descriptors: Item Analysis, Item Response Theory, Selection, Test Items
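Two of the classical item statistics French's paper reviews, difficulty and discrimination, can be computed directly from a 0/1 response matrix. A minimal sketch using the usual definitions (proportion correct and point-biserial correlation with the total score); the data layout is an illustrative assumption:

```python
from statistics import mean

def item_statistics(responses):
    """`responses`: one row per examinee, one 0/1 column per item.
    Returns a list of (difficulty, discrimination) pairs, where
    difficulty is the proportion correct (p-value) and discrimination
    is the point-biserial correlation with the total test score."""
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    mt = mean(totals)
    stats = []
    for j in range(n_items):
        item = [row[j] for row in responses]
        p = mean(item)  # item difficulty (higher = easier)
        # point-biserial = Pearson correlation of a 0/1 item with totals
        cov = sum((i - p) * (t - mt) for i, t in zip(item, totals))
        si = sum((i - p) ** 2 for i in item) ** 0.5
        st = sum((t - mt) ** 2 for t in totals) ** 0.5
        rpb = cov / (si * st) if si and st else 0.0
        stats.append((p, rpb))
    return stats
```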

