Showing all 7 results
Peer reviewed
Jonick, Christine; Schneider, Jennifer; Boylan, Daniel – Accounting Education, 2017
The purpose of the research is to examine the effect of different response formats on student performance on introductory accounting exam questions. The study analyzes 1104 accounting students' responses to quantitative questions presented in two formats: multiple-choice and fill-in. Findings indicate that response format impacts student…
Descriptors: Introductory Courses, Accounting, Test Format, Multiple Choice Tests
Peer reviewed
Stanger-Hall, Kathrin F. – CBE - Life Sciences Education, 2012
Learning science requires higher-level (critical) thinking skills that need to be practiced in science classes. This study tested the effect of exam format on critical-thinking skills. Multiple-choice (MC) testing is common in introductory science courses, and students in these classes tend to associate memorization with MC questions and may not…
Descriptors: Multiple Choice Tests, Science Tests, Test Format, Thinking Skills
Joseph, Dane Christian – ProQuest LLC, 2010
Multiple-choice item-writing guideline research is in its infancy. Haladyna (2004) calls for a science of item-writing guideline research, and this study responds to that call. Its purpose was to examine the impact of student ability and method for varying the location of correct answers in classroom multiple-choice…
Descriptors: Evidence, Test Format, Guessing (Tests), Program Effectiveness
Kobrin, Jennifer L.; Kim, Rachel; Sackett, Paul – College Board, 2011
There is much debate on the merits and pitfalls of standardized tests for college admission, with questions regarding the format (multiple-choice versus constructed response), cognitive complexity, and content of these assessments (achievement versus aptitude) at the forefront of the discussion. This study addressed these questions by…
Descriptors: College Entrance Examinations, Mathematics Tests, Test Items, Predictive Validity
Peer reviewed
Trevisan, Michael S.; And Others – Educational and Psychological Measurement, 1991
The reliability and validity of multiple-choice tests were computed as a function of the number of options per item and student ability for 435 parochial high school juniors, who were administered the Washington Pre-College Test Battery. Results suggest the efficacy of the three-option item. (SLD)
Descriptors: Ability, Comparative Testing, Distractors (Tests), Grade Point Average
Peer reviewed
Miller, Deborah A.; And Others – Academic Medicine, 1993
A study correlated the results of 25 medical school course examinations (largely multiple-choice), a standardized critical thinking inventory, undergraduate and medical school grade point averages, and medical college admissions test scores for 196 preclinical medical students. Findings suggest that objective multiple-choice examinations can at…
Descriptors: College Entrance Examinations, Critical Thinking, Grade Point Average, Higher Education
Hendrickson, Amy; Patterson, Brian; Melican, Gerald – College Board, 2008
Presented at the annual meeting of the National Council on Measurement in Education (NCME) in New York in March 2008. This presentation explores how different item weighting can affect the effective weights, validity coefficients, and test reliability of composite scores among test takers.
Descriptors: Multiple Choice Tests, Test Format, Test Validity, Test Reliability