Descriptor

| Descriptor | Count |
| --- | --- |
| Problem Solving | 2 |
| Test Format | 2 |
| Test Items | 2 |
| Algebra | 1 |
| Architecture | 1 |
| Automation | 1 |
| Cognitive Style | 1 |
| College Entrance Examinations | 1 |
| Computer Science | 1 |
| Constructed Response | 1 |
| Difficulty Level | 1 |

Author

| Author | Count |
| --- | --- |
| Bennett, Randy Elliot | 2 |
| Berger, Aliza E. | 1 |
| Friedman, Debra E. | 1 |
| Katz, Irvin R. | 1 |
| Martinez, Michael E. | 1 |

Publication Type

| Publication Type | Count |
| --- | --- |
| Information Analyses | 1 |
| Journal Articles | 1 |
| Reports - Research | 1 |
| Tests/Questionnaires | 1 |

Education Level

| Education Level | Count |
| --- | --- |
| High Schools | 1 |
| Higher Education | 1 |
| Postsecondary Education | 1 |
| Secondary Education | 1 |

Location

| Location | Count |
| --- | --- |
| New Jersey | 1 |

Assessments and Surveys

| Assessment | Count |
| --- | --- |
| SAT (College Admission Test) | 1 |

Peer reviewed
Martinez, Michael E.; Bennett, Randy Elliot – Applied Measurement in Education, 1992
New developments in the use of automatically scorable constructed-response item types for large-scale assessment are reviewed for five domains: (1) mathematical reasoning; (2) algebra problem solving; (3) computer science; (4) architecture; and (5) natural language. Ways in which these technologies are likely to shape testing are considered. (SLD)
Descriptors: Algebra, Architecture, Automation, Computer Science

Katz, Irvin R.; Friedman, Debra E.; Bennett, Randy Elliot; Berger, Aliza E. – College Entrance Examination Board, 1996
This study investigated the strategies subjects adopted to solve stem-equivalent SAT-Mathematics (SAT-M) word problems in constructed-response (CR) and multiple-choice (MC) formats. Parallel test forms of CR and MC items were administered to subjects representing a range of mathematical abilities. Format-related differences in difficulty were more…
Descriptors: Multiple Choice Tests, College Entrance Examinations, Problem Solving, Cognitive Style