Showing 1 to 15 of 26 results
Peer reviewed
Rovšek, Barbara – Physics Teacher, 2021
At a conference on educational physics, someone described a series of multiple-choice problems to test students' ideas about various mechanical phenomena. One of the problems questioned students' conceptions about the shape of the Earth's orbit in the solar system. The question was as follows: "Which of the following schematic illustrations…
Descriptors: Physics, Science Tests, Multiple Choice Tests, Astronomy
Peer reviewed
Lazenby, Katherine; Balabanoff, Morgan E.; Becker, Nicole M.; Moon, Alena; Barbera, Jack – Journal of Chemical Education, 2021
Identifying effective methods of assessment and developing robust assessments are key areas of research in chemistry education. This research is needed to evaluate instructional innovations and curricular reform. In this primer, we advocate for the use of a type of assessment, ordered multiple-choice (OMC), across postsecondary chemistry. OMC…
Descriptors: Test Construction, Multiple Choice Tests, College Science, STEM Education
Peer reviewed
Tan, Kim Chwee Daniel; Lim, Xin Ying; Talbot, Christopher David – School Science Review, 2021
Online multiple-choice instruments to diagnose students' understanding of science concepts are easily developed and administered to students using Google Forms. This article describes a web-based three-tier multiple-choice test for the formative assessment of students' understanding of chemical bonding. The results obtained can lead to follow-up…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Science Tests, Scientific Concepts
Peer reviewed
Trate, Jaclyn M.; Fisher, Victoria; Blecking, Anja; Geissinger, Peter; Murphy, Kristen L. – Journal of Chemical Education, 2019
Assessment and evaluation tools and instruments are developed to measure many things from content knowledge to misconceptions to student affect. The standard validation processes for these are regularly conducted and provide strong evidence for the validity of the measurements that are made. As part of the suite of validation tools available to…
Descriptors: Test Validity, Multiple Choice Tests, Chemistry, Science Tests
Peer reviewed
van de Heyde, Valentino; Siebrits, Andre – Physics Teacher, 2019
In any science field, including physics, it is important to remain abreast of new assessment methods to cater to the 21st-century student. The rationale of this paper is to argue for a move away from the use of lower-order thinking skills (LOTS) in e-assessment in favor of higher-order thinking skills (HOTS), in line with Bloom's Revised Taxonomy.…
Descriptors: Thinking Skills, Physics, Science Instruction, Computer Assisted Testing
Peer reviewed
Prud'homme-Généreux, Annie – Journal of College Science Teaching, 2017
Misconceptions are sometimes called "alternative conceptions" in acknowledgement of the fact that although these concepts are inaccurate, they are congruent with prior experiences. The idea that misconceptions must be addressed to improve learning is helpful to remember when developing a case study. Students will bring their existing…
Descriptors: Case Studies, Misconceptions, Science Instruction, Science Curriculum
Morgan, Kevin – Journal of Chemical Education, 2023
Practical classes are an essential part of undergraduate programs in Chemical Engineering, as each experiment provides an opportunity to reinforce the theory of discrete unit operations taught elsewhere in the course. Although an expensive pedagogical method, practical sessions, when delivered well, can be one of the…
Descriptors: Student Experience, Chemical Engineering, Science Laboratories, Laboratory Training
Peer reviewed
Goodhead, Lauren K.; MacMillan, Frances M. – Advances in Physiology Education, 2019
The authors have experienced increasing demand from undergraduate students, particularly those in the early years of study, to be able to access more "test-style" material to help with revision, as well as guidance on how to approach their university assessments. With increased use of multiple-choice questions (MCQs) in university…
Descriptors: Science Tests, Science Instruction, Physiology, Audience Response Systems
Peer reviewed
Labudde, Peter – Contributions from Science Education Research, 2019
In this chapter, eight foci relevant to science educators are presented, illustrated with paradigmatic examples, and discussed in two parts. The first part, "from science education to practice and authorities," discusses four foci: (1) developing concepts for instruction; (2) responding to the needs of schools; (3) connecting…
Descriptors: Science Education, Educational Practices, Educational Policy, Science Curriculum
Haladyna, Thomas M. – IDEA Center, Inc., 2018
Writing multiple-choice test items to measure student learning in higher education is a challenge. Based on extensive scholarly research and experience, the author describes various item formats, offers guidelines for creating these items, and provides many examples of both good and bad test items. He also suggests some shortcuts for developing…
Descriptors: Test Construction, Multiple Choice Tests, Test Items, Higher Education
Peer reviewed
Dick-Perez, Marilu; Luxford, Cynthia J.; Windus, Theresa L.; Holme, Thomas – Journal of Chemical Education, 2016
A 14-item, multiple-choice diagnostic assessment tool, the quantum chemistry concept inventory or QCCI, is presented. Items were developed based on published student misconceptions and content coverage and then piloted and used in advanced physical chemistry undergraduate courses. In addition to the instrument itself, data from both a pretest,…
Descriptors: Chemistry, Science Instruction, Multiple Choice Tests, Undergraduate Students
McSparrin-Gallagher, Brenda; Tang, Yun; Niemeier, Brian; Zoblotsky, Todd – Center for Research in Educational Policy (CREP), 2015
In August 2010, the Smithsonian Science Education Center (SSEC) received a grant of more than $25 million from the U.S. Department of Education's Investing in Innovation (i3) program for a five-year study to validate its Leadership Assistance for Science Education Reform (LASER) model in three very diverse regions of the United States: rural North…
Descriptors: Multiple Choice Tests, Science Tests, Elementary School Students, Middle School Students
Peer reviewed
Borda, Emily J.; Boudreaux, Andrew; Fackler-Adams, Ben; Frazey, Paul; Julin, Sara; Pennington, Gregory; Ogle, Jared – Journal of College Science Teaching, 2017
Passive, lecture-based forms of instruction are often ineffective in helping students develop deep conceptual understanding of scientific concepts. Student-centered forms of instruction based in a constructivist framework, where students are guided toward actively constructing their understanding, have been met with greater success. However,…
Descriptors: Student Centered Curriculum, Chemistry, Science Instruction, Scores
Peer reviewed
Domyancich, John M. – Journal of Chemical Education, 2014
Multiple-choice questions are an important part of large-scale summative assessments, such as the Advanced Placement (AP) chemistry exam. However, past AP chemistry exam items often failed to test conceptual understanding and higher-order cognitive skills. The redesigned AP chemistry exam shows a distinctive shift in item types toward…
Descriptors: Multiple Choice Tests, Science Instruction, Chemistry, Summative Evaluation
Peer reviewed
Hartman, JudithAnn R.; Lin, Shirley – Journal of Chemical Education, 2011
The percentage of students choosing the correct answer (PSCA) on 17 multiple-choice algorithmic questions taken from general chemistry exams is analyzed. PSCAs for these questions varied from 47% to 93%, and a decrease of 4.5% in PSCA was observed with each additional step in the algorithm required for solving the problem (R² = 0.80).…
Descriptors: Chemistry, Science Tests, Multiple Choice Tests, Student Evaluation