Showing 226 to 240 of 526 results
Pechenizkiy, Mykola; Trcka, Nikola; Vasilyeva, Ekaterina; van der Aalst, Wil; De Bra, Paul – International Working Group on Educational Data Mining, 2009
Traditional data mining techniques have been extensively applied to find interesting patterns and to build descriptive and predictive models from large volumes of data accumulated through the use of different information systems. The results of data mining can be used for gaining a better understanding of the underlying educational processes, for…
Descriptors: Data Analysis, Methods, Computer Software, Computer Assisted Testing
Peer reviewed
Rezaee, Abbas Ali; Shoar, Neda Sharbaf – English Language Teaching, 2011
In recent years, improvements in technology have enhanced the possibilities of teaching and learning various subjects. This is especially the case in foreign language instruction. The use of technology and multimedia brings new opportunities for learning different areas of language. In this regard, the present study attempts to find out if the use…
Descriptors: Reading Comprehension, English (Second Language), Second Language Learning, Vocabulary Development
Peer reviewed
Kim, Sooyeon; Walker, Michael E.; McHale, Frederick – ETS Research Report Series, 2008
This study examined variations of a nonequivalent groups equating design used with constructed-response (CR) tests to determine which design was most effective in producing equivalent scores across the two tests to be equated. Using data from a large-scale exam, the study investigated the use of anchor CR item rescoring in the context of classical…
Descriptors: Equated Scores, Comparative Analysis, Test Format, Responses
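Since this and several nearby entries concern observed-score equating, a brief reminder of the classical linear form may be useful; this is standard background only, not the specific anchor-rescoring design studied in the report above:

$$\mathrm{lin}_Y(x) \;=\; \mu_Y + \frac{\sigma_Y}{\sigma_X}\,\bigl(x - \mu_X\bigr)$$

where a score $x$ on form X is placed on the scale of form Y, and, in a nonequivalent-groups (NEAT) design, the means and standard deviations of X and Y are estimated for a synthetic population through the common (anchor) items.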
Hagge, Sarah Lynn – ProQuest LLC, 2010
Mixed-format tests containing both multiple-choice and constructed-response items are widely used on educational tests. Such tests combine the broad content coverage and efficient scoring of multiple-choice items with the assessment of higher-order thinking skills thought to be provided by constructed-response items. However, the combination of…
Descriptors: Test Format, True Scores, Equated Scores, Psychometrics
Peer reviewed
DiBattista, David – Collected Essays on Learning and Teaching, 2008
Multiple-choice questions are widely used in higher education and have some important advantages over constructed-response test questions. It seems, however, that many teachers underestimate the value of multiple-choice questions, believing them to be useful only for assessing how well students can memorize information, but not for assessing…
Descriptors: Foreign Countries, Multiple Choice Tests, Retention (Psychology), Cognitive Tests
Peer reviewed
Yanagawa, Kozo; Green, Anthony – System: An International Journal of Educational Technology and Applied Linguistics, 2008
The purpose of this study is to examine whether the choice between three multiple-choice listening comprehension test formats results in any difference in listening comprehension test performance. The three formats entail (a) allowing test takers to preview both the question stem and answer options prior to listening; (b) allowing test takers to…
Descriptors: Listening Comprehension, Test Construction, Listening Comprehension Tests, Multiple Choice Tests
Peer reviewed
Graf, Edith Aurora – ETS Research Report Series, 2008
Quantitative item models are item structures that may be expressed in terms of mathematical variables and constraints. An item model may be developed as a computer program from which large numbers of items are automatically generated. Item models can be used to produce large numbers of items for use in traditional, large-scale assessments. But…
Descriptors: Test Items, Models, Diagnostic Tests, Statistical Analysis
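The abstract above describes a quantitative item model as a structure defined by mathematical variables and constraints, implemented as a program that generates items automatically. A minimal sketch of that idea follows; the stem template, variable ranges, constraint, and distractor rules are hypothetical illustrations, not taken from the ETS report.

```python
import random

def generate_subtraction_item(rng: random.Random) -> dict:
    """Instantiate one item from a toy quantitative item model: two integer
    variables with ranges, a constraint (a > b), a stem template, and
    distractors derived from plausible errors."""
    while True:
        a = rng.randint(20, 99)   # variable 1 and its allowed range
        b = rng.randint(10, 89)   # variable 2 and its allowed range
        if a > b:                 # constraint linking the variables
            break
    stem = f"A tank holds {a} liters of water. {b} liters are drained off. How many liters remain?"
    key = a - b
    distractors = {a + b, key + 10, key + 1} - {key}   # hypothetical error-based foils
    return {"stem": stem, "key": key, "options": sorted({key, *distractors})}

if __name__ == "__main__":
    rng = random.Random(7)
    for _ in range(3):
        print(generate_subtraction_item(rng))
```

Each run yields surface-different items that share one underlying model, which is the property that makes automatic generation attractive for large-scale assessment.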
Peer reviewed
von Davier, Alina A.; Wilson, Christine – Applied Psychological Measurement, 2008
Dorans and Holland (2000) and von Davier, Holland, and Thayer (2003) introduced measures of the degree to which an observed-score equating function is sensitive to the population on which it is computed. This article extends the findings of Dorans and Holland and of von Davier et al. to item response theory (IRT) true-score equating methods that…
Descriptors: Advanced Placement, Advanced Placement Programs, Equated Scores, Calculus
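For readers unfamiliar with the population-sensitivity measures this entry builds on, one commonly cited form of the root-mean-square difference introduced by Dorans and Holland (2000) compares subpopulation equating functions with the total-population function (the notation here is generic, not necessarily that of the article):

$$\mathrm{RMSD}(x) \;=\; \frac{\sqrt{\sum_j w_j\,\bigl[e_{Y_j}(x) - e_Y(x)\bigr]^2}}{\sigma_Y}$$

where $e_{Y_j}$ is the equating function computed on subpopulation $j$, $e_Y$ the function computed on the full population, $w_j$ the subpopulation weights, and $\sigma_Y$ a standard deviation of Y used to express the differences on the score scale.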
Peer reviewed
Balch, William R. – Teaching of Psychology, 2007
Undergraduates studied the definitions of 16 psychology terms, expecting either a multiple-choice (n = 132) or short-answer (n = 122) test. All students then received the same multiple-choice test, requiring them to recognize the definitions as well as novel examples of the terms. Compared to students expecting a multiple-choice test, those…
Descriptors: Expectation, Definitions, Multiple Choice Tests, Undergraduate Students
Peer reviewed
Wang, Tzu-Hua – Computers & Education, 2008
This research aims to develop a multiple-choice Web-based quiz-game-like formative assessment system, named GAM-WATA. The unique design of "Ask-Hint Strategy" turns the Web-based formative assessment into an online quiz game. "Ask-Hint Strategy" is composed of "Prune Strategy" and "Call-in Strategy".…
Descriptors: Formative Evaluation, Foreign Countries, Grade 5, Internet
Peer reviewed
Ascalon, M. Evelina; Meyers, Lawrence S.; Davis, Bruce W.; Smits, Niels – Applied Measurement in Education, 2007
This article examined two item-writing guidelines: the format of the item stem and homogeneity of the answer set. Answering the call of Haladyna, Downing, and Rodriguez (2002) for empirical tests of item writing guidelines and extending the work of Smith and Smith (1988) on differential use of item characteristics, a mock multiple-choice driver's…
Descriptors: Guidelines, Difficulty Level, Standard Setting, Driver Education
Peer reviewed
Hertenstein, Matthew J.; Wayand, Joseph F. – Journal of Instructional Psychology, 2008
Many psychology instructors present videotaped examples of behavior at least occasionally during their courses. However, few include video clips during examinations. We provide examples of video-based questions, offer guidelines for their use, and discuss their benefits and drawbacks. In addition, we provide empirical evidence to support the use…
Descriptors: Student Evaluation, Video Technology, Evaluation Methods, Test Construction
Tauber, Robert T. – 1984
A technique is described for reducing the incidence of cheating on multiple choice exams. One form of the test is used and each item is assigned multiple numbers. Depending upon the instructions given to the class, some students will use the first of each pair of numbers to determine where to place their responses on a separate answer sheet, while…
Descriptors: Answer Sheets, Cheating, Higher Education, Multiple Choice Tests
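A small sketch may make the dual-numbering idea concrete; the particular mapping chosen for the second set of positions (a seeded shuffle) is an assumption for illustration, not necessarily Tauber's scheme.

```python
import random

def dual_numbering(n_items: int, seed: int = 0) -> list:
    """Assign each item on a single test form a pair of answer-sheet positions.
    Students told to use the first number bubble item i at pair[0]; students
    told to use the second number bubble it at pair[1].  The shuffled second
    numbering is an illustrative choice only."""
    rng = random.Random(seed)
    first = list(range(1, n_items + 1))     # positions in test order
    second = first[:]
    rng.shuffle(second)                     # scrambled positions for the other group
    return list(zip(first, second))

if __name__ == "__main__":
    for item, (pos_a, pos_b) in enumerate(dual_numbering(10), start=1):
        print(f"Item {item}: group A bubbles position {pos_a}, group B bubbles position {pos_b}")
```

Because the two halves of the class bubble the same items in different answer-sheet positions, copying a neighbor's sheet no longer reproduces the copier's intended answers.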
Peer reviewed
Dodd, David K.; Leal, Linda – Teaching of Psychology, 1988
Discusses answer justification, a technique that allows students to convert multiple-choice items perceived to be "tricky" into short-answer essay questions. Convincing justifications earn students credit for missed items. The procedure is reported to be easy to administer and very popular among students. (Author/GEA)
Descriptors: Guessing (Tests), Higher Education, Multiple Choice Tests, Psychology
Peer reviewed
Wilcox, Rand R. – Educational and Psychological Measurement, 1982
Results in the engineering literature on "k out of n system reliability" can be used to characterize tests based on estimates of the probability of correctly determining whether the examinee knows the correct response. In particular, the minimum number of distractors required for multiple-choice tests can be empirically determined.…
Descriptors: Achievement Tests, Mathematical Models, Multiple Choice Tests, Test Format
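The k-out-of-n reliability idea the abstract refers to has a simple computational reading: treat each of n items as a "component" that succeeds with the examinee's probability of a correct response, and ask how many distractors are needed before a pure guesser is unlikely to reach a pass mark. The pass mark and error tolerance in the sketch below are hypothetical, not Wilcox's.

```python
from math import comb

def at_least_k_of_n(k: int, n: int, p: float) -> float:
    """Probability that at least k of n independent Bernoulli(p) trials succeed
    (the classic k-out-of-n system reliability formula)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def min_distractors(n_items: int, k_pass: int, max_false_pass: float) -> int:
    """Smallest number of distractors per item such that a pure guesser
    (per-item success probability 1 / (distractors + 1)) answers at least
    k_pass of n_items correctly with probability no greater than max_false_pass."""
    d = 1
    while at_least_k_of_n(k_pass, n_items, 1.0 / (d + 1)) > max_false_pass:
        d += 1
    return d

if __name__ == "__main__":
    # e.g. a 5-item subtest, "knows the skill" decided by 4-or-more correct,
    # tolerating at most a 5% chance that a pure guesser passes
    print(min_distractors(n_items=5, k_pass=4, max_false_pass=0.05))
```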