Author

| Author | Records |
| --- | --- |
| Bridgeman, Brent | 2 |
| Drake, Samuel | 1 |
| Freedle, Roy | 1 |
| Gerritz, Kalle | 1 |
| Gu, Lixiong | 1 |
| Kostin, Irene | 1 |
| Rock, Donald A. | 1 |
| Scheuneman, Janice Dowd | 1 |
| Wolfe, Edward W. | 1 |

Publication Type

| Publication Type | Records |
| --- | --- |
| Journal Articles | 5 |
| Reports - Research | 5 |

Education Level

| Education Level | Records |
| --- | --- |
| Higher Education | 1 |
| Postsecondary Education | 1 |
Assessments and Surveys

| Assessment | Records |
| --- | --- |
| Graduate Record Examinations | 5 |
| SAT (College Admission Test) | 2 |
Bridgeman, Brent; Rock, Donald A. – Journal of Educational Measurement, 1993 (Peer reviewed)
Exploratory and confirmatory factor analyses were used to examine relationships among existing item types and three new computer-administered item types for the analytical scale of the Graduate Record Examination General Test. Results from 349 students indicate which constructs the item types measure. (SLD)
Descriptors: College Entrance Examinations, College Students, Comparative Testing, Computer Assisted Testing
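To make the analysis concrete, a minimal sketch of an exploratory factor analysis over item-type scores might look like the following. The Python code, the six item types, and the two-factor structure are illustrative assumptions, not the authors' actual data or procedure:

```python
import numpy as np

# Hypothetical data: 349 examinees (the sample size reported above) answering
# six item types that load on two underlying constructs.
rng = np.random.default_rng(0)
f = rng.normal(size=(349, 2))                   # two latent constructs
load = np.array([[.8, 0], [.7, 0], [.6, 0],    # item types 1-3 -> factor 1
                 [0, .8], [0, .7], [0, .6]])   # item types 4-6 -> factor 2
scores = f @ load.T + rng.normal(scale=0.5, size=(349, 6))

# Exploratory step via a principal-component approximation: factor the
# correlation matrix and take the leading eigenvectors as loadings.
R = np.corrcoef(scores, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]               # sort by explained variance
loadings = eigvecs[:, order[:2]] * np.sqrt(eigvals[order[:2]])
print(np.round(loadings, 2))  # item types 1-3 and 4-6 should separate onto two factors
```

A confirmatory step would instead fit a prespecified loading structure and test its fit, which is beyond this sketch.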
Freedle, Roy; Kostin, Irene – Journal of Educational Measurement, 1990 (Peer reviewed)
The importance of item difficulty (equated delta) was explored as a predictor of differential item functioning for Black versus White examinees across four verbal item types, using 13 Graduate Record Examination forms and 11 Scholastic Aptitude Test forms. Several significant racial differences were found. (TJH)
Descriptors: Black Students, College Bound Students, College Entrance Examinations, Comparative Testing
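For reference, the equated delta mentioned above is based on the ETS delta index, which maps an item's proportion correct onto a normal-deviate difficulty scale centered at 13 with a spread of 4. A minimal sketch follows; the proportions are made up, and the equating step that links deltas across forms is omitted:

```python
from scipy.stats import norm

def delta(p_correct: float) -> float:
    """ETS delta index: higher delta = harder item.

    Delta = 13 + 4 * z, where z = Phi^{-1}(1 - p_correct) is the normal
    deviate cutting off the upper p_correct proportion of the distribution.
    """
    return 13.0 + 4.0 * norm.ppf(1.0 - p_correct)

# Illustrative values: an easy item (85% correct) vs. a hard one (30% correct).
print(round(delta(0.85), 1))  # ~8.9  (easy)
print(round(delta(0.30), 1))  # ~15.1 (hard)
```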
Gu, Lixiong; Drake, Samuel; Wolfe, Edward W. – Journal of Technology, Learning, and Assessment, 2006
This study seeks to determine whether item features are related to differential item functioning (DIF) between computer- and paper-based test delivery media. Examinees responded to 60 quantitative items similar to those found on the GRE General Test in either a computer-based or paper-based medium. Thirty-eight percent of the items were…
Descriptors: Test Bias, Test Items, Educational Testing, Student Evaluation
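A crude screen for cross-medium difficulty differences can be sketched as below. This is a simple two-proportion comparison on simulated data, not the matched DIF analysis a study like this would typically use:

```python
import numpy as np

# Hypothetical response matrices (1 = correct), one per delivery medium.
rng = np.random.default_rng(1)
cbt = rng.integers(0, 2, size=(500, 60))   # computer-based examinees x items
pbt = rng.integers(0, 2, size=(500, 60))   # paper-based examinees x items

# Per-item difficulty (proportion correct) in each medium, and the gap.
p_cbt, p_pbt = cbt.mean(axis=0), pbt.mean(axis=0)
gap = p_cbt - p_pbt

# Flag items whose gap is large relative to sampling noise (two-proportion SE).
se = np.sqrt(p_cbt * (1 - p_cbt) / cbt.shape[0] + p_pbt * (1 - p_pbt) / pbt.shape[0])
flagged = np.where(np.abs(gap) > 2 * se)[0]

# With pure noise like this, only a handful of the 60 items should be flagged.
print(f"{len(flagged)} of 60 items show a notable cross-medium difficulty gap")
```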
Bridgeman, Brent – Journal of Educational Measurement, 1992 (Peer reviewed)
Examinees in a regular administration of the quantitative portion of the Graduate Record Examination responded to particular items in a machine-scannable multiple-choice format. Volunteers (n=364) used a computer to answer open-ended counterparts of these items. Scores for both formats demonstrated similar correlational patterns. (SLD)
Descriptors: Answer Sheets, College Entrance Examinations, College Students, Comparative Testing
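The "similar correlational patterns" finding can be illustrated with a toy computation; the simulated scores and the external criterion here are assumptions for demonstration only:

```python
import numpy as np

# Hypothetical paired scores for 364 volunteers, as in the study above.
rng = np.random.default_rng(2)
ability = rng.normal(size=364)                          # stand-in criterion
mc_score = ability + rng.normal(scale=0.5, size=364)    # multiple-choice format
oe_score = ability + rng.normal(scale=0.6, size=364)    # open-ended format

# "Similar correlational patterns" = both formats relate to the criterion
# to about the same degree, and agree with each other.
print(round(np.corrcoef(mc_score, ability)[0, 1], 2))
print(round(np.corrcoef(oe_score, ability)[0, 1], 2))
print(round(np.corrcoef(mc_score, oe_score)[0, 1], 2))  # cross-format agreement
```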
Scheuneman, Janice Dowd; Gerritz, Kalle – Journal of Educational Measurement, 1990 (Peer reviewed)
Differential item functioning (DIF) methodology was explored as a way to reveal sources of item difficulty and the performance characteristics of different groups. A total of 150 Scholastic Aptitude Test items and 132 Graduate Record Examination General Test items were analyzed. DIF was evaluated for male versus female and for Black versus White examinees. (SLD)
Descriptors: Black Students, College Entrance Examinations, College Students, Comparative Testing
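One standard tool for such a DIF analysis is the Mantel-Haenszel common odds ratio, sketched below on made-up stratum counts; the source does not say this is the method these authors used:

```python
import numpy as np

def mantel_haenszel_odds(correct_ref, total_ref, correct_foc, total_foc):
    """Mantel-Haenszel common odds ratio for one item.

    Inputs are per-stratum counts, where strata match examinees on total
    test score. A value near 1.0 suggests little DIF; values above 1.0
    favor the reference group, values below 1.0 the focal group.
    """
    a = np.asarray(correct_ref, dtype=float)          # reference correct
    b = np.asarray(total_ref, dtype=float) - a        # reference incorrect
    c = np.asarray(correct_foc, dtype=float)          # focal correct
    d = np.asarray(total_foc, dtype=float) - c        # focal incorrect
    n = a + b + c + d                                 # stratum sizes
    return (a * d / n).sum() / (b * c / n).sum()

# Illustrative counts over 4 score strata (reference vs. focal group).
alpha = mantel_haenszel_odds(
    correct_ref=[30, 45, 60, 70], total_ref=[50, 60, 70, 75],
    correct_foc=[25, 40, 55, 68], total_foc=[50, 60, 70, 75],
)
print(round(alpha, 2))  # ~1.52: this made-up item favors the reference group
```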