Publication Date
| Date Range | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 0 |
| Since 2007 (last 20 years) | 1 |
Descriptor
| Descriptor | Count |
| --- | --- |
| Test Content | 7 |
| Graduate Study | 5 |
| College Entrance Examinations | 4 |
| Higher Education | 4 |
| Test Construction | 4 |
| Test Format | 3 |
| Test Items | 3 |
| Comparative Analysis | 2 |
| Computer Assisted Testing | 2 |
| Difficulty Level | 2 |
| Scoring | 2 |
Source
| Source | Count |
| --- | --- |
| CEA Forum | 1 |
| ETS Research Report Series | 1 |
| History Teacher | 1 |
| Journal of Technology,… | 1 |
| Teaching of Psychology | 1 |
Author
| Author | Count |
| --- | --- |
| Brown, Kevin | 1 |
| Chalifour, Clark | 1 |
| Drake, Samuel | 1 |
| Graf, Edith Aurora | 1 |
| Gu, Lixiong | 1 |
| Kalat, James W. | 1 |
| Lawless, René | 1 |
| Matlin, Margaret W. | 1 |
| Monahan, Thomas C. | 1 |
| Peterson, Stephen | 1 |
| Powers, Donald E. | 1 |
Publication Type
| Publication Type | Count |
| --- | --- |
| Journal Articles | 5 |
| Reports - Descriptive | 3 |
| Reports - Evaluative | 2 |
| Reports - Research | 2 |
Education Level
| Education Level | Count |
| --- | --- |
| Higher Education | 3 |
| Postsecondary Education | 3 |
Audience
| Audience | Count |
| --- | --- |
| Administrators | 1 |
| Practitioners | 1 |
| Teachers | 1 |
Assessments and Surveys
| Assessment/Survey | Count |
| --- | --- |
| Graduate Record Examinations | 7 |
| Praxis Series | 1 |
Brown, Kevin – CEA Forum, 2015
In this article, the author describes his project to take every standardized exam English majors take. During the summer and fall semesters of 2012, the author signed up for and took the GRE General Test, the Praxis Content Area Exam (English Language, Literature, and Composition: Content Knowledge), the Senior Major Field Tests in…
Descriptors: College Faculty, College English, Test Preparation, Standardized Tests
Content Characteristics of GRE Analytical Reasoning Items. GRE Board Professional Report No. 84-14P.
Chalifour, Clark; Powers, Donald E. – 1988
In actual test development practice, the number of test items that must be developed and pretested is typically greater, and sometimes much greater, than the number eventually judged suitable for use in operational test forms. This has proven to be especially true for analytical reasoning items, which currently form the bulk of the analytical…
Descriptors: Coding, Difficulty Level, Higher Education, Test Construction
Peer reviewed: Kalat, James W.; Matlin, Margaret W. – Teaching of Psychology, 2000
Provides an overview of the Graduate Record Examination (GRE) Psychology test, focusing on its scoring system, who prepares the test and how it is prepared, and its usefulness. Explores some future directions for the test. (CMK)
Descriptors: Comparative Analysis, Grade Point Average, Graduate Study, Higher Education
Peer reviewed: Schnucker, Robert V. – History Teacher, 1991
Describes the development of a history assessment test by Northeast Missouri State University (Kirksville) faculty. Discusses content, problems with design and cooperation, and theories of assessment testing. Includes sample questions demonstrating the cube plan, a technique that involves using a single question to measure three learning…
Descriptors: Achievement Tests, Educational Assessment, Evaluation Methods, Higher Education
Monahan, Thomas C. – 1991
The practice of requiring students seeking admission to a graduate program at Glassboro State College to take the Graduate Record Examination (GRE) is discussed. The paper first reviews the GRE Program: the components of the general test itself; scoring; quantitative measurements; and subject areas. Next, the use of the GRE scores and how they are…
Descriptors: Academic Standards, Admission Criteria, College Admission, College Entrance Examinations
Gu, Lixiong; Drake, Samuel; Wolfe, Edward W. – Journal of Technology, Learning, and Assessment, 2006
This study seeks to determine whether item features are related to observed differences in item difficulty (DIF) between computer- and paper-based test delivery media. Examinees responded to 60 quantitative items similar to those found on the GRE general test in either a computer-based or paper-based medium. Thirty-eight percent of the items were…
Descriptors: Test Bias, Test Items, Educational Testing, Student Evaluation
Graf, Edith Aurora; Peterson, Stephen; Steffen, Manfred; Lawless, René – ETS Research Report Series, 2005
We describe the item modeling development and evaluation process as applied to a quantitative assessment with high-stakes outcomes. In addition to expediting the item-creation process, a model-based approach may reduce pretesting costs, if the difficulty and discrimination of model-generated items may be predicted to a predefined level of…
Descriptors: Psychometrics, Accuracy, Item Analysis, High Stakes Tests