Katz, Irvin R.; Bennett, Randy Elliot; Berger, Aliza E. – Journal of Educational Measurement, 2000
Studied the solution strategies of 55 high school students who solved parallel constructed-response and multiple-choice items that differed only in the presence of response options. Differences in difficulty between response formats did not correspond to differences in strategy choice. Interprets results in light of the relative comprehension…
Descriptors: College Entrance Examinations, Constructed Response, Difficulty Level, High School Students

Bennett, Randy Elliot; And Others – Journal of Educational Measurement, 1989
Causes of differential item difficulty for blind students taking the braille edition of the Scholastic Aptitude Test's mathematical section were studied. Data for 261 blind students were compared with data for 8,015 non-handicapped students. Results show an association between selected item categories and differential item functioning. (TJH)
Descriptors: Braille, College Entrance Examinations, Comparative Analysis, Difficulty Level

Bennett, Randy Elliot – 1998
This paper offers a scenario for how educational assessment might change in response to market forces that affect not only the future of large-scale testing but also society in general. The scenario divides into three generations distinguished by the purpose of testing, test format and content, and the extent to which testing capitalizes on new…
Descriptors: Accountability, Computer Assisted Testing, Educational Assessment, Educational Planning

Martinez, Michael E.; Bennett, Randy Elliot – Applied Measurement in Education, 1992
New developments in the use of automatically scorable constructed response item types for large-scale assessment are reviewed for five domains: (1) mathematical reasoning; (2) algebra problem solving; (3) computer science; (4) architecture; and (5) natural language. Ways in which these technologies are likely to shape testing are considered. (SLD)
Descriptors: Algebra, Architecture, Automation, Computer Science

Bennett, Randy Elliot; And Others – 1991
This study investigated the convergent validity of expert-system scores for four mathematical constructed-response item formats. A five-factor model was proposed, comprising four constructed-response format factors and a Graduate Record Examinations (GRE) General Test quantitative factor. Subjects were drawn from examinees taking a single form of…
Descriptors: College Students, Constructed Response, Correlation, Expert Systems

Bennett, Randy Elliot; And Others – Journal of Educational Measurement, 1991
The relationship of multiple-choice and free-response items on the College Board's Advanced Placement Computer Science Examination was studied using confirmatory factor analysis. Results with 2 samples of 1,000 high school students suggested that the most parsimonious fit was achieved using a single factor. Implications for construct validity are…
Descriptors: Chi Square, College Entrance Examinations, Comparative Testing, Computer Science

Bennett, Randy Elliot – 1994
The Educational Testing Service is moving rapidly to computerize its tests for admissions to postsecondary education and occupational licensure/certification. Computerized tests offer important advantages, including immediate score reporting, the convenience of testing when the examinee wishes, and for adaptive tests, equal accuracy throughout the…
Descriptors: Adaptive Testing, College Entrance Examinations, Computer Assisted Testing, Computer Managed Instruction

Katz, Irvin R.; Friedman, Debra E.; Bennett, Randy Elliot; Berger, Aliza E. – College Entrance Examination Board, 1996
This study investigated the strategies subjects adopted to solve stem-equivalent SAT-Mathematics (SAT-M) word problems in constructed-response (CR) and multiple-choice (MC) formats. Parallel test forms of CR and MC items were administered to subjects representing a range of mathematical abilities. Format-related differences in difficulty were more…
Descriptors: Multiple Choice Tests, College Entrance Examinations, Problem Solving, Cognitive Style