Showing 2,626 to 2,640 of 5,131 results
Durovic, Jerry J. – 1975
A test bias definition, applicable at the item level of a test, is presented. The definition conceptually equates test bias with measuring different things in different groups, and operationally equates test bias with a between-group difference in item fit to the Rasch model greater than one. It is suggested that the proposed definition avoids…
Descriptors: Content Analysis, Definitions, Item Analysis, Mathematical Models
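The operational criterion in the Durovic record above lends itself to a small illustration. The Python sketch below flags items whose Rasch item-fit statistics differ by more than one between two groups; it assumes dichotomous (0/1) responses and externally supplied ability and difficulty estimates, and every function name and input is illustrative rather than taken from the paper.

```python
import numpy as np

def rasch_prob(theta, b):
    """Probability of a correct response under the Rasch model for ability theta
    and item difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def infit_mean_square(responses, theta, b):
    """Information-weighted (infit) mean-square fit for one item: squared residuals
    summed over persons, divided by the summed response variances."""
    p = rasch_prob(theta, b)
    return np.sum((responses - p) ** 2) / np.sum(p * (1.0 - p))

def flag_biased_items(resp_a, resp_b, theta_a, theta_b, difficulties, threshold=1.0):
    """Flag items whose infit statistics differ between two groups by more than the
    threshold; resp_a and resp_b are persons x items matrices of 0/1 responses."""
    flags = []
    for j, b in enumerate(difficulties):
        fit_a = infit_mean_square(resp_a[:, j], theta_a, b)
        fit_b = infit_mean_square(resp_b[:, j], theta_b, b)
        flags.append(abs(fit_a - fit_b) > threshold)
    return np.array(flags)
```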
Peer reviewed
Krus, David J.; Ney, Robert G. – Educational and Psychological Measurement, 1978
An algorithm for item analysis in which item discrimination indices have been defined for the distractors as well as the correct answer is presented. Also, the concept of convergent and discriminant validity is applied to items instead of tests, and is discussed as an aid to item analysis. (Author/JKS)
Descriptors: Algorithms, Item Analysis, Multiple Choice Tests, Test Items
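As a rough illustration of option-level item analysis in the spirit of the Krus and Ney record above, the sketch below computes a point-biserial discrimination index for every response option of an item, distractors included. It is not the authors' algorithm; the function name and inputs are assumptions.

```python
import numpy as np

def option_discrimination(choices, total_scores):
    """Point-biserial discrimination index for every response option of one item:
    the correlation between endorsing that option (0/1) and the total test score."""
    choices = np.asarray(choices)
    total_scores = np.asarray(total_scores, dtype=float)
    indices = {}
    for option in np.unique(choices):
        endorsed = (choices == option).astype(float)
        # Pearson correlation with a 0/1 indicator is the point-biserial correlation.
        indices[option] = np.corrcoef(endorsed, total_scores)[0, 1]
    return indices
```

For a well-behaved item, the keyed answer should carry a positive index and each distractor a negative one.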
Peer reviewed
Pine, Steven M.; Wattawa, Scott – Educational and Psychological Measurement, 1978
A computer program for a comparative evaluation of the extent of item bias between two subgroups in a test population is described. The program calculates an index of bias based on Angoff's elliptical distance measure, and provides statistics for determining the similarity of intergroup item parameters. (Author/JKS)
Descriptors: Comparative Analysis, Computer Programs, Item Analysis, Test Bias
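One common reading of the Angoff elliptical distance measure named in the Pine and Wattawa record above is the perpendicular distance of each item from the major axis of the delta-plot ellipse. The sketch below follows that reading only; it is not the program described in the article, and the function names and inputs are assumptions.

```python
import numpy as np
from scipy.stats import norm

def delta_values(p):
    """ETS delta scale: delta = 13 + 4z, where z is the normal deviate cutting off
    the upper p proportion (harder items get larger deltas)."""
    return 13.0 + 4.0 * norm.ppf(1.0 - np.asarray(p, dtype=float))

def delta_plot_distances(p_group_1, p_group_2):
    """Perpendicular distance of each item from the major axis of the bivariate
    ellipse fitted to the two groups' deltas; large distances suggest items that
    behave differently across groups."""
    x, y = delta_values(p_group_1), delta_values(p_group_2)
    sx, sy, r = x.std(ddof=1), y.std(ddof=1), np.corrcoef(x, y)[0, 1]
    # Slope and intercept of the major (principal) axis of the ellipse.
    slope = ((sy**2 - sx**2) + np.sqrt((sy**2 - sx**2) ** 2 + 4.0 * (r * sx * sy) ** 2)) / (
        2.0 * r * sx * sy
    )
    intercept = y.mean() - slope * x.mean()
    return (slope * x - y + intercept) / np.sqrt(slope**2 + 1.0)
```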
Peer reviewed
Tzeng, Oliver C. S.; Landis, Dan – Multivariate Behavioral Research, 1978
Two popular models for performing multidimensional scaling, Tucker and Messick's points-of-view model and Tucker's three-mode model, are combined into a single analytic procedure, the 3M-POV model. The procedure is described and its strengths are discussed. Carroll and Chang's INDSCAL model is also mentioned. (JKS)
Descriptors: Correlation, Item Analysis, Mathematical Models, Multidimensional Scaling
Peer reviewed
Green, Samuel B.; And Others – Educational and Psychological Measurement, 1977
Confusion in the literature between the concepts of internal consistency and homogeneity has led to a misuse of coefficient alpha as an index of item homogeneity. This misuse is discussed and several indices of item homogeneity derived from the model of common factor analysis are offered as alternatives. (Author/JKS)
Descriptors: Factor Analysis, Item Analysis, Test Interpretation, Test Items
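For reference alongside the Green et al. record above, the sketch below computes coefficient alpha from a persons-by-items score matrix using the standard formula. It illustrates the index the article argues is misused as a homogeneity measure, not the factor-analytic homogeneity indices the authors propose; the function name is illustrative.

```python
import numpy as np

def coefficient_alpha(item_scores):
    """Cronbach's coefficient alpha from a persons x items score matrix:
    alpha = k / (k - 1) * (1 - sum of item variances / variance of total scores)."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    return (k / (k - 1.0)) * (1.0 - x.var(axis=0, ddof=1).sum() / x.sum(axis=1).var(ddof=1))
```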
Peer reviewed
Johns, Jerry L. – Journal of Reading, 1978
Research indicates that many questions on reading comprehension tests can be answered from previous knowledge and are not dependent upon the passages in the tests. (MKM)
Descriptors: Item Analysis, Reading Comprehension, Reading Research, Reading Tests
Peer reviewed
Callahan, Leroy G. – Arithmetic Teacher, 1977
Some speculations regarding the response tendencies of a group of first graders are examined for ten test items. (JT)
Descriptors: Elementary Education, Elementary School Mathematics, Instruction, Item Analysis
Peer reviewed
Samejima, Fumiko – Psychometrika, 1977
A method of estimating item characteristic functions is proposed, in which a set of test items whose operating characteristics are known, and which give a constant test information function over a wide range of ability, is used. The method is based on maximum likelihood estimation procedures. (Author/JKS)
Descriptors: Item Analysis, Latent Trait Theory, Measurement, Test Construction
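The Samejima record above describes estimation that leans on anchor items with known operating characteristics. The sketch below is only a crude illustration of that general idea, not the paper's estimator: abilities are estimated by grid-search maximum likelihood from Rasch-like anchor items, and a new item's characteristic function is approximated by proportions correct within ability bins. The Rasch-like anchor model, function names, and bin counts are all assumptions.

```python
import numpy as np

def ml_ability(responses, anchor_difficulties, grid=np.linspace(-4.0, 4.0, 161)):
    """Grid-search maximum-likelihood ability estimate for one examinee, using
    anchor items with known (here Rasch-like) operating characteristics."""
    p = 1.0 / (1.0 + np.exp(-(grid[:, None] - np.asarray(anchor_difficulties)[None, :])))
    loglik = (responses * np.log(p) + (1 - responses) * np.log(1.0 - p)).sum(axis=1)
    return grid[np.argmax(loglik)]

def empirical_icc(new_item_responses, abilities, n_bins=10):
    """Crude empirical item characteristic function for a new item: proportion
    correct within equal-count ability bins."""
    edges = np.quantile(abilities, np.linspace(0.0, 1.0, n_bins + 1))
    bins = np.digitize(abilities, edges[1:-1])
    return np.array(
        [new_item_responses[bins == b].mean() if np.any(bins == b) else np.nan
         for b in range(n_bins)]
    )
```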
Hayes, Leola G. – Pointer, 1976
Outlined is a method for teaching reading through naming and reviewing the names of various foods. (SBH)
Descriptors: Disabilities, Food, General Education, Item Analysis
Peer reviewed
Masters, Geofferey N. – Journal of Educational Measurement, 1988
High item discrimination can indicate a special kind of measurement disturbance introduced by an item that gives high-ability persons a special advantage. The measurement disturbance is described; it occurs when an item is sensitive to individual differences on a second, undesired dimension that is correlated with the variable intended to be measured.…
Descriptors: Academically Gifted, Item Analysis, Test Bias, Test Wiseness
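The point in the Masters record above can be made concrete with a small simulation, sketched below: an item that also loads on a nuisance dimension correlated with the intended variable shows a higher discrimination index than an otherwise identical unidimensional item. The loadings, sample size, and use of the true trait in place of a total score are arbitrary assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_persons = 5000
theta = rng.normal(size=n_persons)                         # intended variable
nuisance = 0.6 * theta + 0.8 * rng.normal(size=n_persons)  # correlated second dimension

def item_discrimination(loading_on_nuisance):
    """Simulate a dichotomous item that depends on the intended variable and,
    optionally, on the correlated nuisance dimension; return its correlation with
    the intended variable (a stand-in for an item-total discrimination index)."""
    logit = theta + loading_on_nuisance * nuisance
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))
    return np.corrcoef(y, theta)[0, 1]

print("unidimensional item:  ", round(item_discrimination(0.0), 3))
print("two-dimensional item: ", round(item_discrimination(1.0), 3))
```

Under these arbitrary loadings the two-dimensional item typically comes out with the higher index, which is the disturbance the article describes.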
Peer reviewed
Hovanitz, Christine A.; And Others – Journal of Clinical Psychology, 1985
Evaluated the hypothesis that a response set accounts for relationships between obvious or subtle statements and criteria by assessing the discriminant validity of two Minnesota Multiphasic Personality Inventory (MMPI) scales subdivided on the obvious-subtle dimension. The full scales appear to possess more discriminant validity than their obvious…
Descriptors: College Students, Discriminant Analysis, Higher Education, Item Analysis
Peer reviewed
Miller, Harold R.; Streiner, David L. – Journal of Clinical Psychology, 1985
Examined the subjectivity of the Harris-Lingoes Minnesota Multiphasic Personality Inventory (MMPI) content subscales by asking expert judges (N=13) to group items from appropriate clinical scales that represented similar content, attitudes or traits. Results showed nine replicated subscales were highly similar and nine were moderately similar to…
Descriptors: Foreign Countries, Item Analysis, Personality Traits, Psychologists
Peer reviewed
O'Grady, Kevin E. – Journal of Consulting and Clinical Psychology, 1983
Analyzed the intercorrelations among the 11 subtests of the Wechsler Adult Intelligence Scale-Revised (WAIS-R) in the nine age groups in the normative sample. Results suggested that the factor structure underlying the WAIS-R is complex and that a large proportion of WAIS-R performance can be explained by a general intellectual factor. (LLL)
Descriptors: Cognitive Ability, Factor Analysis, Intelligence Tests, Item Analysis
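As a companion to the O'Grady record above, the sketch below shows one simple way to gauge the dominance of a general factor from a subtest correlation matrix: the share of total variance carried by the first principal component. It is a rough proxy, not the analysis reported in the article, and no WAIS-R data are included; the function name is illustrative.

```python
import numpy as np

def general_factor_share(corr_matrix):
    """Share of total subtest variance carried by the first principal component of a
    correlation matrix: a quick, rough gauge of how dominant a general factor is."""
    eigenvalues = np.linalg.eigvalsh(np.asarray(corr_matrix, dtype=float))
    return eigenvalues[-1] / eigenvalues.sum()  # eigenvalues sum to the number of subtests
```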
Peer reviewed
Klosner, Naomi Certner; Gellman, Estelle Klittnick – Educational and Psychological Measurement, 1973
Study compared three item arrangements on a classroom test: arrangement by subject matter, arrangement by ascending order of difficulty, and arrangement by ascending order of difficulty within subject-matter subtests. (Authors/CB)
Descriptors: Achievement Tests, Item Analysis, Performance Factors, Tables (Data)
Peer reviewed
Wahlstrom, M. W.; Hunka, S. M. – Journal of Experimental Education, 1972
An algorithm, based upon factor analytic theory, is presented to illustrate one solution to the problem of selecting items for a test. (Authors)
Descriptors: Algorithms, Criteria, Factor Analysis, Item Analysis
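The record above does not spell out the algorithm here, so the sketch below shows only a common factor-analytic selection heuristic for comparison: keep the items with the largest loadings on the first principal factor of the inter-item correlation matrix. The function name, inputs, and the heuristic itself are assumptions, not the authors' procedure.

```python
import numpy as np

def select_items_by_loading(item_scores, n_keep):
    """Keep the n_keep items with the largest loadings on the first principal factor of
    the inter-item correlation matrix (expects a persons x items score matrix)."""
    corr = np.corrcoef(np.asarray(item_scores, dtype=float), rowvar=False)
    eigenvalues, eigenvectors = np.linalg.eigh(corr)
    loadings = np.abs(eigenvectors[:, -1]) * np.sqrt(eigenvalues[-1])
    return np.argsort(loadings)[::-1][:n_keep]
```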