Showing all 7 results
Peer reviewed
PDF on ERIC
Attali, Yigal – ETS Research Report Series, 2014
Previous research on calculator use in standardized assessments of quantitative ability focused on the effect of calculator availability on item difficulty and on whether test developers can predict these effects. With the introduction of an on-screen calculator on the Quantitative Reasoning measure of the "GRE"® revised General Test, it…
Descriptors: College Entrance Examinations, Graduate Study, Calculators, Test Items
Peer reviewed
PDF on ERIC
Sheehan, Kathleen M.; Kostin, Irene; Futagi, Yoko – ETS Research Report Series, 2007
This paper explores alternative approaches for facilitating efficient, evidence-centered item development for a new type of verbal reasoning item developed for use on the GRE® General Test. Results obtained in two separate studies are reported. The first study documented the development and validation of a fully automated approach for locating the…
Descriptors: College Entrance Examinations, Graduate Study, Test Items, Item Analysis
Peer reviewed
Chalifour, Clark L.; Powers, Donald E. – Journal of Educational Measurement, 1989
Content characteristics of 1,400 Graduate Record Examination (GRE) analytical reasoning items were coded for item difficulty and discrimination. The results provide content characteristics for consideration in extending specifications for analytical reasoning items and a better understanding of the construct validity of these items. (TJH)
Descriptors: College Entrance Examinations, Construct Validity, Content Analysis, Difficulty Level
Peer reviewed
Direct link
Gorin, Joanna S.; Embretson, Susan E. – Applied Psychological Measurement, 2006
Recent assessment research joining cognitive psychology and psychometric theory has introduced a new technology, item generation. In algorithmic item generation, items are systematically created based on specific combinations of features that underlie the processing required to correctly solve a problem. Reading comprehension items have been more…
Descriptors: Difficulty Level, Test Items, Modeling (Psychology), Paragraph Composition
Peer reviewed
PDF on ERIC
Sinharay, Sandip; Johnson, Matthew – ETS Research Report Series, 2005
"Item models" (LaDuca, Staples, Templeton, & Holzman, 1986) are classes from which it is possible to generate/produce items that are equivalent/isomorphic to other items from the same model (e.g., Bejar, 1996; Bejar, 2002). They have the potential to produce large number of high-quality items at reduced cost. This paper introduces…
Descriptors: Item Analysis, Test Items, Scoring, Psychometrics
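To make the item-model idea in the entry above concrete, the following is a minimal illustrative sketch (not from the paper): a single template, or item model, from which many isomorphic items are generated by varying surface values while the structure and solution procedure stay fixed. The template, stem wording, and value ranges are invented for illustration.

```python
import random

def generate_item(seed):
    """Generate one arithmetic item from a fixed item model (template).

    Every generated item is isomorphic to the others: same structure and
    same solution procedure, differing only in the surface numbers.
    """
    rng = random.Random(seed)
    a = rng.randint(2, 9)  # number of rows (varied surface feature)
    b = rng.randint(2, 9)  # pencils per row (varied surface feature)
    stem = f"A box holds {a} rows of {b} pencils. How many pencils are in the box?"
    key = a * b            # scoring key follows from the fixed structure
    return stem, key

stem, key = generate_item(0)
```

Each seed yields a distinct instance, but all instances share one answer procedure (multiply the two quantities), which is what makes them candidates for treatment as psychometrically equivalent.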
Peer reviewed Peer reviewed
PDF on ERIC Download full text
Graf, Edith Aurora; Peterson, Stephen; Steffen, Manfred; Lawless, René – ETS Research Report Series, 2005
We describe the item modeling development and evaluation process as applied to a quantitative assessment with high-stakes outcomes. In addition to expediting the item-creation process, a model-based approach may reduce pretesting costs, if the difficulty and discrimination of model-generated items may be predicted to a predefined level of…
Descriptors: Psychometrics, Accuracy, Item Analysis, High Stakes Tests
Kingston, Neal M. – 1985
Birnbaum's three-parameter logistic item response model was used to study guessing behavior of low-ability examinees on the Graduate Record Examinations (GRE) General Test, Verbal Measure. GRE scoring procedures had recently changed from a scoring formula that corrected for guessing to number-right scoring. The three-parameter theory was used…
Descriptors: Academic Aptitude, Analysis of Variance, College Entrance Examinations, Difficulty Level
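As background for the Birnbaum three-parameter logistic (3PL) model referenced in the entry above, a short sketch of the 3PL item characteristic curve, P(θ) = c + (1 − c) / (1 + exp(−a(θ − b))), where a is discrimination, b is difficulty, and c is the pseudo-guessing lower asymptote. The parameter values below are illustrative, not taken from the study.

```python
import math

def p_correct(theta, a, b, c):
    """3PL probability that an examinee with ability theta answers correctly.

    a: discrimination, b: difficulty, c: pseudo-guessing (lower asymptote).
    """
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# For a very low-ability examinee, the probability approaches the
# guessing floor c; for a high-ability examinee it approaches 1.
p_low = p_correct(theta=-3.0, a=1.2, b=0.5, c=0.2)
p_high = p_correct(theta=3.0, a=1.2, b=0.5, c=0.2)
```

The c parameter is why the 3PL model is a natural tool for studying guessing: even examinees far below an item's difficulty retain a nonzero chance of answering correctly.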