Showing 316 to 330 of 956 results
Peer reviewed
Pae, Hye K. – Educational Assessment, 2014
This study investigated the role of item formats in the performance of 206 nonnative speakers of English on expressive skills (i.e., speaking and writing). Test scores were drawn from the field test of the "Pearson Test of English Academic" for Chinese, French, Hebrew, and Korean native speakers. Four item formats, including…
Descriptors: Test Items, Test Format, Speech Skills, Writing Skills
Peer reviewed
PDF on ERIC: Download full text
Frey, Bruce B.; Ellis, James D.; Bulgren, Janis A.; Hare, Jana Craig; Ault, Marilyn – Electronic Journal of Science Education, 2015
"Scientific argumentation," defined as the ability to develop and analyze scientific claims, support claims with evidence from investigations of the natural world, and explain and evaluate the reasoning that connects the evidence to the claim, is a critical component of current science standards and is consistent with "Common Core…
Descriptors: Test Construction, Science Tests, Persuasive Discourse, Science Process Skills
Wang, Wei – ProQuest LLC, 2013
Mixed-format tests containing both multiple-choice (MC) items and constructed-response (CR) items are now widely used in many testing programs. Mixed-format tests are often considered superior to tests containing only MC items, although the use of multiple item formats leads to measurement challenges in the context of equating conducted under…
Descriptors: Equated Scores, Test Format, Test Items, Test Length
National Assessment Governing Board, 2014
Since 1973, the National Assessment of Educational Progress (NAEP) has gathered information about student achievement in mathematics. Results of these periodic assessments, produced in print and web-based formats, provide valuable information to a wide variety of audiences. They inform citizens about the nature of students' comprehension of the…
Descriptors: National Competency Tests, Mathematics Achievement, Mathematics Skills, Grade 4
Peer reviewed
Ackerman, David S.; DeShields, Oscar – Alberta Journal of Educational Research, 2013
This research examines whether the ordering of exam difficulty can influence student beliefs about their academic abilities and the impact of these beliefs on their performance. The ordering of the difficulty of test items has been shown to affect performance. Study One (n = 91) examined college student differences in reaction to a difficult…
Descriptors: Beliefs, Academic Ability, Academic Achievement, Student Attitudes
Peer reviewed
Tsopanoglou, Antonios; Ypsilandis, George S.; Mouti, Anna – Language Learning in Higher Education, 2014
Multiple-choice (MC) tests are frequently used to measure language competence because they are quick, economical and straightforward to score. While degrees of correctness have been investigated for partially correct responses in combined-response MC tests, degrees of incorrectness in distractors and the role they play in determining the…
Descriptors: Scoring, Pilot Projects, Multiple Choice Tests, Language Tests
Peer reviewed
Williams, Marian E.; Sando, Lara; Soles, Tamara Glen – Journal of Psychoeducational Assessment, 2014
Cognitive assessment of young children contributes to high-stakes decisions because results are often used to determine eligibility for early intervention and special education. Previous reviews of cognitive measures for young children highlighted concerns regarding adequacy of standardization samples, steep item gradients, and insufficient floors…
Descriptors: Intelligence Tests, Decision Making, High Stakes Tests, Eligibility
Lee, Eunjung; Lee, Won-Chan; Brennan, Robert L. – College Board, 2012
In almost all high-stakes testing programs, test equating is necessary to ensure that test scores across multiple test administrations are equivalent and can be used interchangeably. Test equating becomes even more challenging in mixed-format tests, such as Advanced Placement Program® (AP®) Exams, that contain both multiple-choice and constructed…
Descriptors: Test Construction, Test Interpretation, Test Norms, Test Reliability
Peer reviewed
Whittaker, Tiffany A.; Chang, Wanchen; Dodd, Barbara G. – Applied Psychological Measurement, 2012
When tests consist of multiple-choice and constructed-response items, researchers are confronted with the question of which item response theory (IRT) model combination will appropriately represent the data collected from these mixed-format tests. This simulation study examined the performance of six model selection criteria, including the…
Descriptors: Item Response Theory, Models, Selection, Criteria
Peer reviewed
McLean, Stuart; Kramer, Brandon; Beglar, David – Language Teaching Research, 2015
An important gap in the field of second language vocabulary assessment concerns the lack of validated tests measuring aural vocabulary knowledge. The primary purpose of this study is to introduce and provide preliminary validity evidence for the Listening Vocabulary Levels Test (LVLT), which has been designed as a diagnostic tool to measure…
Descriptors: Test Construction, Test Validity, English (Second Language), Second Language Learning
Peer reviewed
Albano, Anthony D. – Journal of Educational Measurement, 2013
In many testing programs it is assumed that the context or position in which an item is administered does not have a differential effect on examinee responses to the item. Violations of this assumption may bias item response theory estimates of item and person parameters. This study examines the potentially biasing effects of item position. A…
Descriptors: Test Items, Item Response Theory, Test Format, Questioning Techniques
Peer reviewed
Plassmann, Sibylle; Zeidler, Beate – Language Learning in Higher Education, 2014
Language testing means taking decisions: about the test taker's results, but also about the test construct and the measures taken in order to ensure quality. This article takes the German test "telc Deutsch C1 Hochschule" as an example to illustrate this decision-making process in an academic context. The test is used for university…
Descriptors: Language Tests, Test Wiseness, Test Construction, Decision Making
Peer reviewed
Kirschner, Sophie; Borowski, Andreas; Fischer, Hans E.; Gess-Newsome, Julie; von Aufschnaiter, Claudia – International Journal of Science Education, 2016
Teachers' professional knowledge is assumed to be a key variable for effective teaching. Because teacher education aims to enhance the professional knowledge of current and future teachers, this knowledge should be described and assessed. Nevertheless, only a limited number of studies quantitatively measure physics teachers' professional…
Descriptors: Evaluation Methods, Tests, Test Format, Science Instruction
Chen, Jing – ProQuest LLC, 2012
Learning progressions are used to describe how students' understanding of a topic progresses over time and to classify the progress of students into steps or levels. This study applies Item Response Theory (IRT) based methods to investigate how to design learning progression-based science assessments. The research questions of this study are: (1)…
Descriptors: Science Tests, Test Items, Item Response Theory, Secondary School Students
Peer reviewed
Kim, Sooyeon; Walker, Michael – Applied Measurement in Education, 2012
This study examined the appropriateness of the anchor composition in a mixed-format test, which includes both multiple-choice (MC) and constructed-response (CR) items, using subpopulation invariance indices. Linking functions were derived in the nonequivalent groups with anchor test (NEAT) design using two types of anchor sets: (a) MC only and (b)…
Descriptors: Multiple Choice Tests, Test Format, Test Items, Equated Scores