Showing all 8 results
Pawade, Yogesh R.; Diwase, Dipti S. – Journal of Educational Technology, 2016
Item analysis of Multiple Choice Questions (MCQs) is the process of collecting, summarizing and utilizing information from students' responses to evaluate the quality of test items. Difficulty Index (p-value), Discrimination Index (DI) and Distractor Efficiency (DE) are the parameters which help to evaluate the quality of MCQs used in an…
Descriptors: Test Items, Item Analysis, Multiple Choice Tests, Curriculum Development
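The abstract names the three parameters without giving their formulas, but the textbook definitions are standard. Below is a minimal Python sketch under those common definitions; the 27% upper/lower group split and the 5% "functional distractor" threshold are conventional choices, not values taken from this article:

```python
# Classical item-analysis parameters: Difficulty Index (p-value),
# Discrimination Index (DI), and Distractor Efficiency (DE).
# Standard textbook definitions; not drawn from the cited article.

def difficulty_index(correct):
    """p-value: fraction of examinees who answered the item correctly."""
    return sum(correct) / len(correct)

def discrimination_index(correct, total_scores, group_frac=0.27):
    """DI: item p-value among top scorers minus that among bottom scorers.
    The 27% split is the conventional choice for the two groups."""
    n = len(total_scores)
    k = max(1, int(n * group_frac))
    order = sorted(range(n), key=lambda i: total_scores[i])
    low, high = order[:k], order[-k:]
    p_high = sum(correct[i] for i in high) / k
    p_low = sum(correct[i] for i in low) / k
    return p_high - p_low

def distractor_efficiency(choices, key, options, threshold=0.05):
    """DE: share of distractors chosen by at least `threshold` of examinees
    (a distractor meeting the threshold is called "functional")."""
    n = len(choices)
    distractors = [o for o in options if o != key]
    functional = sum(1 for d in distractors
                     if choices.count(d) / n >= threshold)
    return functional / len(distractors)

# Example: one item answered by six examinees; the key is "B".
choices = ["B", "B", "A", "C", "B", "D"]
correct = [c == "B" for c in choices]
total_scores = [9, 8, 4, 3, 7, 2]   # each examinee's overall test score
print(difficulty_index(correct))                    # 0.5
print(discrimination_index(correct, total_scores))  # 1.0
print(distractor_efficiency(choices, "B", "ABCD"))  # 1.0
```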
Peer reviewed
Dudycha, Arthur L.; Carpenter, James B. – Journal of Applied Psychology, 1973
In this study, three structural characteristics (stem format, inclusive versus specific distractors, and stem orientation) were selected for experimental manipulation, while the number of alternatives, the number of correct answers, and the order of items were experimentally controlled. (Author)
Descriptors: Discriminant Analysis, Item Analysis, Multiple Choice Tests, Test Construction
Peer reviewed
Preece, P. F. W. – School Science Review, 1974
Describes the test item analysis used in test construction. (JR)
Descriptors: Discriminant Analysis, Evaluation Methods, Item Analysis, Multiple Choice Tests
Peer reviewed
McMillan, James R.; And Others – Delta Pi Epsilon Journal, 1989
An investigation analyzed difficulty and discrimination statistics for 91 multiple-choice tests written by 46 business administration instructors and administered to 7,511 students. A large percentage of the tests failed to meet the difficulty and discrimination standards proposed by several testing experts, implying that teachers need more preparation in…
Descriptors: Business Administration Education, Difficulty Level, Discriminant Analysis, Higher Education
Peer reviewed
Frary, Robert B. – Applied Measurement in Education, 1991
The use of the "none-of-the-above" option (NOTA) in 20 college-level multiple-choice tests was evaluated for classes with 100 or more students. Eight academic disciplines were represented, and 295 NOTA and 724 regular test items were used. It appears that the NOTA can be compatible with good classroom measurement. (TJH)
Descriptors: College Students, Comparative Testing, Difficulty Level, Discriminant Analysis
Tollefson, Nona; Tripp, Alice – 1986
The item difficulty and item discrimination of three multiple-choice item formats were compared in experimental and non-experimental settings. In the experimental study, 104 graduate students were randomly assigned to complete one of three forms of a multiple-choice test: (1) a complex alternative ("none of the above") as the correct answer; (2) a…
Descriptors: Achievement Tests, Difficulty Level, Discriminant Analysis, Graduate Students
Lancaster, Diana M.; And Others – 1987
Item difficulty and discrimination were compared between multiple-choice and short-answer items in midterm and final examinations for the internal medicine course at the Louisiana State University School of Dentistry. The examinations were administered to 67 sophomore dental students in that course. Additionally, the impact of the source of the…
Descriptors: Dental Schools, Dentistry, Difficulty Level, Discriminant Analysis
Tinari, Frank D. – Improving College and University Teaching, 1979
Computerized analysis of multiple-choice test items is explained. Examples of item analysis applications in an introductory economics course are discussed with respect to three objectives: to evaluate learning; to improve test items; and to help improve classroom instruction. Problems, costs, and benefits of the procedures are identified. (JMD)
Descriptors: College Instruction, Computer Programs, Discriminant Analysis, Economics Education