Showing 1 to 15 of 20 results
Peer reviewed
Liu, Yuming; Robin, Frédéric; Yoo, Hanwook; Manna, Venessa – ETS Research Report Series, 2018
The "GRE"® Psychology test is an achievement test that measures core knowledge in 12 content domains that represent the courses commonly offered at the undergraduate level. Currently, a total score and 2 subscores, experimental and social, are reported to test takers as well as graduate institutions. However, the American Psychological…
Descriptors: College Entrance Examinations, Graduate Study, Psychological Testing, Scores
Peer reviewed
Swiggett, Wanda D.; Kotloff, Laurie; Ezzo, Chelsea; Adler, Rachel; Oliveri, Maria Elena – ETS Research Report Series, 2014
The computer-based "Graduate Record Examinations"® ("GRE"®) revised General Test includes interactive item types and testing environment tools (e.g., test navigation, on-screen calculator, and help). How well do test takers understand these innovations? If test takers do not understand the new item types, these innovations may…
Descriptors: College Entrance Examinations, Graduate Study, Usability, Test Items
Peer reviewed
Albano, Anthony D. – Journal of Educational Measurement, 2013
In many testing programs it is assumed that the context or position in which an item is administered does not have a differential effect on examinee responses to the item. Violations of this assumption may bias item response theory estimates of item and person parameters. This study examines the potentially biasing effects of item position. A…
Descriptors: Test Items, Item Response Theory, Test Format, Questioning Techniques
Peer reviewed
Ward, William C.; And Others – Journal of Educational Measurement, 1980
Free response and machine-scorable versions of a test called Formulating Hypotheses were compared with respect to construct validity. Results indicate that the different forms involve different cognitive processes and measure different qualities. (Author/JKS)
Descriptors: Cognitive Processes, Cognitive Tests, Higher Education, Personality Traits
Kingston, Neal M.; McKinley, Robert L. – 1988
Confirmatory multidimensional item response theory (CMIRT) was used to assess the structure of the Graduate Record Examination General Test, whose factorial structure has been extensively documented, using a sample of 1,001 psychology majors taking the test in 1984 or 1985. Results supported previous findings that, for this population, there…
Descriptors: College Students, Factor Analysis, Higher Education, Item Analysis
Powers, Donald E.; And Others – 1978
Much of the effort involved in a major restructuring of the Graduate Record Examinations (GRE) Aptitude Test was intended to result in the creation of an analytical module to supplement the verbal and quantitative sections of the test, thus providing broadened measurement. Factor extension analysis was used in the present study to investigate…
Descriptors: College Entrance Examinations, Factor Analysis, Factor Structure, Graduate Study
Peer reviewed
Vogel, Lora Ann – Journal of Educational Computing Research, 1994
Reports on a study conducted to evaluate how individual differences in anxiety levels affect performance on computer versus paper-and-pencil forms of verbal sections of the Graduate Record Examination. Contrary to the research hypothesis, analysis of scores revealed that extroverted and less computer-anxious subjects scored significantly lower on…
Descriptors: Comparative Analysis, Computer Anxiety, Computer Assisted Testing, Computer Attitudes
Peer reviewed
Swinton, Spencer S.; Powers, Donald E. – Journal of Educational Psychology, 1983
A special preparation curriculum for the analytical section of the Graduate Record Examinations (GRE) Aptitude Test was developed and administered to self-selected GRE candidates. Analyses revealed an effect that stemmed from improved performance on two of the three analytical item types formerly included in the analytical section. (Author/PN)
Descriptors: College Entrance Examinations, Higher Education, Intentional Learning, Predictive Measurement
Kobrin, Jennifer L. – 2000
The comparability of computerized and paper-and-pencil tests was examined from a cognitive perspective, using verbal protocols, rather than psychometric methods, as the primary mode of inquiry. Reading comprehension items from the Graduate Record Examinations were completed by 48 college juniors and seniors, half of whom took the computerized test…
Descriptors: Cognitive Processes, College Students, Computer Assisted Testing, Higher Education
Kingston, Neal; Turner, Nancy – 1984
This investigation examines the impact the 1981 Graduate Record Examination (GRE) General Test Format Revision had on the stability over time of the verbal, quantitative, and analytical scores. Scores were used from the self-selected group of repeaters who took the GRE General Test twice between October 1980 and June 1982. Examinees were divided…
Descriptors: College Entrance Examinations, Graduate Study, Higher Education, Multiple Regression Analysis
Wild, Cheryl L.; And Others – 1982
The research leading to the decisions to revise the Graduate Record Examination Aptitude Test (GRE) (beginning in October 1981) is reviewed. The issues discussed include the format of the test (the timing of each section and the number of sections, the content of the sections--especially the analytical section), the scoring procedure for the GRE,…
Descriptors: Aptitude Tests, College Entrance Examinations, Equated Scores, Graduate Study
Schaeffer, Gary A.; And Others – 1993
This report contains results of a field test conducted to determine the relationship between a Graduate Record Examinations (GRE) linear computer-based test (CBT) and a paper-and-pencil (P&P) test with the same items. Recent GRE examinees participated in the field test by taking either a CBT or the P&P test. Data from the field test…
Descriptors: Attitudes, College Graduates, Computer Assisted Testing, Equated Scores
Bennett, Randy Elliot; And Others – 1991
This study investigated the convergent validity of expert-system scores for four mathematical constructed-response item formats. A five-factor model was proposed, comprising four constructed-response format factors and a Graduate Record Examinations (GRE) General Test quantitative factor. Subjects were drawn from examinees taking a single form of…
Descriptors: College Students, Constructed Response, Correlation, Expert Systems
Wild, Cheryl; Durso, Robin – 1979
This study investigates the effects of increasing the test time to reduce the speededness of the verbal and quantitative experimental sections of the Graduate Record Examinations (GRE) Aptitude Test. In December 1976, at approximately 550 domestic test centers, 20- and 30-minute versions of a verbal experimental test and of a quantitative…
Descriptors: College Entrance Examinations, Higher Education, Quantitative Tests, Racial Bias
Peer reviewed
Gu, Lixiong; Drake, Samuel; Wolfe, Edward W. – Journal of Technology, Learning, and Assessment, 2006
This study seeks to determine whether item features are related to observed differences in item difficulty (DIF) between computer- and paper-based test delivery media. Examinees responded to 60 quantitative items similar to those found on the GRE general test in either a computer-based or paper-based medium. Thirty-eight percent of the items were…
Descriptors: Test Bias, Test Items, Educational Testing, Student Evaluation