Showing all 8 results
Peer reviewed
Matejak Cvenic, Karolina; Planinic, Maja; Susac, Ana; Ivanjek, Lana; Jelicic, Katarina; Hopf, Martin – Physical Review Physics Education Research, 2022
A new diagnostic instrument, the Conceptual Survey on Wave Optics (CSWO), was developed and validated on 224 high school students (aged 18-19 years) in Croatia. The process of test construction, which included the testing of 61 items on a total of 712 students, is presented. The final version of the test consists of 26 multiple-choice items which…
Descriptors: Scientific Concepts, Concept Formation, Validity, Physics
Peer reviewed
Ihme, Jan Marten; Senkbeil, Martin; Goldhammer, Frank; Gerick, Julia – European Educational Research Journal, 2017
The combination of different item formats is found quite often in large-scale assessments, and dimensionality analyses often indicate that tests are multidimensional with respect to task format. In ICILS 2013, three different item types (information-based response tasks, simulation tasks, and authoring tasks) were used to measure computer and…
Descriptors: Foreign Countries, Computer Literacy, Information Literacy, International Assessment
Peer reviewed
Funk, Steven C.; Dickson, K. Laurie – Teaching of Psychology, 2011
The authors experimentally investigated the effects of multiple-choice and short-answer exam items on exam performance in a college classroom. They randomly assigned 50 students to take a 10-item short-answer pretest or posttest on two 50-item multiple-choice exams in an introduction to personality course. Students performed significantly…
Descriptors: Test Items, Test Format, Multiple Choice Tests, Validity
Peer reviewed
Haladyna, Thomas M.; Downing, Steven M. – Applied Measurement in Education, 1989
Results of 96 theoretical/empirical studies were reviewed to see if they support a taxonomy of 43 rules for writing multiple-choice test items. The taxonomy is the result of an analysis of 46 textbooks dealing with multiple-choice item writing. For nearly half of the rules, no research was found. (SLD)
Descriptors: Classification, Literature Reviews, Multiple Choice Tests, Test Construction
Haladyna, Thomas M. – 1999
This book explains writing effective multiple-choice test items and studying responses to items to evaluate and improve them, two topics that are very important in the development of many cognitive tests. The chapters are: (1) "Providing a Context for Multiple-Choice Testing"; (2) "Constructed-Response and Multiple-Choice Item Formats"; (3)…
Descriptors: Constructed Response, Multiple Choice Tests, Test Construction, Test Format
Peer reviewed
Crehan, Kevin; Haladyna, Thomas M. – Journal of Experimental Education, 1991
Two item-writing rules were tested: phrasing stems as questions versus partial sentences, and using the "none-of-the-above" option instead of a specific content option. Results with 228 college students do not support the use of either stem type and provide limited evidence to caution against the "none-of-the-above" option.…
Descriptors: College Students, Higher Education, Multiple Choice Tests, Test Construction
Peer reviewed
Dudley, Albert – Language Testing, 2006
This study examined the multiple true-false (MTF) test format in second language testing by comparing multiple-choice (MCQ) and MTF test formats in two areas of general English: vocabulary and reading. Two counterbalanced experimental designs, one for each language area, were examined in terms of the number of MCQ…
Descriptors: Second Language Learning, Test Format, Validity, Testing
Maihoff, N. A.; Mehrens, Wm. A. – 1985
A comparison is presented of alternate-choice and true-false item forms used in an undergraduate natural science course. The alternate-choice item is a modified two-choice multiple-choice item in which the two responses are included within the question stem. This study (1) compared the difficulty level, discrimination level, reliability, and…
Descriptors: Classroom Environment, College Freshmen, Comparative Analysis, Comparative Testing