Showing 31 to 45 of 492 results
Peer reviewed
Direct link
Emery-Wetherell, Meaghan; Wang, Ruoyao – Assessment & Evaluation in Higher Education, 2023
Over four semesters of a large introductory statistics course, we found students engaging in contract cheating on Chegg.com during multiple-choice examinations. In this paper we describe our methodology for identifying, addressing, and eventually eliminating the cheating. We successfully identified 23 out of 25 students using a combination…
Descriptors: Computer Assisted Testing, Multiple Choice Tests, Cheating, Identification
Peer reviewed
Direct link
Ugray, Zsolt; Dunn, Brian K. – Journal of Information Systems Education, 2022
As Information Systems courses have become more data-focused and student numbers have increased, a greater need has emerged to assess technical and analytical skills more efficiently and effectively. Multiple-choice examinations provide a means for accomplishing this, though creating effective multiple-choice assessment items within a…
Descriptors: Quality Assurance, Information Systems, Computer Science Education, Student Evaluation
Peer reviewed
Direct link
Joshua B. Gilbert; Luke W. Miratrix; Mridul Joshi; Benjamin W. Domingue – Journal of Educational and Behavioral Statistics, 2025
Analyzing heterogeneous treatment effects (HTEs) plays a crucial role in understanding the impacts of educational interventions. A standard practice for HTE analysis is to examine interactions between treatment status and preintervention participant characteristics, such as pretest scores, to identify how different groups respond to treatment.…
Descriptors: Causal Models, Item Response Theory, Statistical Inference, Psychometrics
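The interaction approach summarized in the abstract above can be illustrated with a minimal sketch. Everything below is hypothetical: the variable names (posttest, treatment, pretest), the simulated data, and the effect sizes are placeholders, not values from the Gilbert et al. study.

    # Minimal sketch of a standard HTE analysis: regress the outcome on
    # treatment, a pretest covariate, and their interaction (simulated data).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    pretest = rng.normal(0, 1, n)
    treatment = rng.integers(0, 2, n)
    # The treatment effect is made to grow with pretest score, i.e. it is heterogeneous.
    posttest = 0.5 * pretest + treatment * (0.3 + 0.2 * pretest) + rng.normal(0, 1, n)

    df = pd.DataFrame({"posttest": posttest, "treatment": treatment, "pretest": pretest})
    fit = smf.ols("posttest ~ treatment * pretest", data=df).fit()
    print(fit.params)  # the treatment:pretest coefficient estimates the heterogeneity

In this simulated example the treatment:pretest coefficient should land near the 0.2 used to generate the data; a coefficient of zero would indicate a homogeneous effect.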
Peer reviewed
Direct link
Eder Hernandez; Esmeralda Campos; Pablo Barniol; Genaro Zavala – Physical Review Physics Education Research, 2025
This study presents the development and validation of a novel multiple-choice test designed to assess university students' conceptual understanding of electric field, force, and flux. The test of understanding of electric field, force, and flux was constructed based on the results of previous studies using a phenomenographic approach to classify…
Descriptors: Physics, Scientific Concepts, Science Tests, Multiple Choice Tests
Peer reviewed
Direct link
Musa Adekunle Ayanwale; Jamiu Oluwadamilare Amusa; Adekunle Ibrahim Oladejo; Funmilayo Ayedun – Interchange: A Quarterly Review of Education, 2024
The study focuses on assessing the proficiency levels of higher education students, specifically those taking the physics achievement test (PHY 101) at the National Open University of Nigeria (NOUN). This test, like others, evaluates various aspects of knowledge and skills simultaneously. However, relying on traditional models for such tests can result in…
Descriptors: Item Response Theory, Difficulty Level, Item Analysis, Test Items
Peer reviewed
Direct link
Yiting Wang; Xiumei Feng; Yuchen Jiang; Li Xie; Min Xia; Lei Bao – Physical Review Physics Education Research, 2025
Understanding particle motion in force fields (PMFF), which encompasses the nature of forces and the relationship between force and motion, is fundamental to mastering mechanics and electromagnetism. Effectively solving PMFF-related problems requires advanced reasoning skills and the ability to apply knowledge across diverse contexts. Despite…
Descriptors: Physics, Difficulty Level, Science Instruction, Scientific Concepts
Peer reviewed
Direct link
Musa Adekunle Ayanwale – Discover Education, 2023
Examination scores obtained by students from the West African Examinations Council (WAEC) and the National Business and Technical Examinations Board (NABTEB) may not be directly comparable due to differences in examination administration, item characteristics of the subject in question, and student abilities. For more accurate comparisons, scores…
Descriptors: Equated Scores, Mathematics Tests, Test Items, Test Format
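The abstract above does not state which equating method was applied, so the following is only a generic illustration of linear (mean-sigma) equating, with made-up score distributions standing in for the WAEC and NABTEB data.

    # Illustrative linear (mean-sigma) equating sketch with simulated scores;
    # not the method or data reported in the study above.
    import numpy as np

    rng = np.random.default_rng(1)
    scores_x = rng.normal(52, 9, 2000)   # hypothetical scores on exam X
    scores_y = rng.normal(58, 11, 2000)  # hypothetical scores on exam Y

    def linear_equate(x_score, x, y):
        """Map a score from exam X onto the scale of exam Y."""
        return y.mean() + y.std(ddof=1) / x.std(ddof=1) * (x_score - x.mean())

    print(round(linear_equate(60, scores_x, scores_y), 1))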
Peer reviewed
PDF on ERIC
Saatcioglu, Fatima Munevver; Atar, Hakan Yavuz – International Journal of Assessment Tools in Education, 2022
This study aims to examine the effects of mixture item response theory (IRT) models on item parameter estimation and classification accuracy under different conditions. The manipulated variables of the simulation study are set as mixture IRT models (Rasch, 2PL, 3PL); sample size (600, 1000); the number of items (10, 30); the number of latent…
Descriptors: Accuracy, Classification, Item Response Theory, Programming Languages
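For readers unfamiliar with the models named in the abstract above, the sketch below writes out the Rasch, 2PL, and 3PL item response functions; the parameter values are arbitrary illustrations, not the simulation conditions of the study.

    # Item response functions for the three IRT models compared above.
    # theta = ability, a = discrimination, b = difficulty, c = guessing.
    import numpy as np

    def p_rasch(theta, b):
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    def p_2pl(theta, a, b):
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    def p_3pl(theta, a, b, c):
        return c + (1.0 - c) * p_2pl(theta, a, b)

    theta = np.linspace(-3, 3, 7)
    print(p_3pl(theta, a=1.2, b=0.0, c=0.2))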
Peer reviewed
PDF on ERIC
Dina Kamber Hamzic; Mirsad Trumic; Ismar Hadžalic – International Electronic Journal of Mathematics Education, 2025
Trigonometry is an important part of secondary school mathematics, but it is usually challenging for students to understand and learn. Since trigonometry is learned and used at the university level in many fields, such as physics or geodesy, it is important to have insight into students' trigonometry knowledge before the beginning of the university…
Descriptors: Trigonometry, Mathematics Instruction, Prior Learning, Outcomes of Education
Peer reviewed
Direct link
Ferrara, Steve; Steedle, Jeffrey T.; Frantz, Roger S. – Applied Measurement in Education, 2022
Item difficulty modeling studies involve (a) hypothesizing item features, or item response demands, that are likely to predict item difficulty with some degree of accuracy; and (b) entering the features as independent variables into a regression equation or other statistical model to predict difficulty. In this review, we report findings from 13…
Descriptors: Reading Comprehension, Reading Tests, Test Items, Item Response Theory
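The two-step procedure described in the abstract above (hypothesize item features, then regress difficulty on them) can be sketched as follows; the feature names and simulated values are hypothetical and do not come from the 13 reviewed studies.

    # Minimal item difficulty modeling sketch: regress empirical item
    # difficulty on hypothesized item features (simulated placeholder data).
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(2)
    n_items = 200
    # Hypothetical response-demand features, e.g. passage length and inference load.
    features = rng.normal(size=(n_items, 2))
    difficulty = 0.8 * features[:, 0] - 0.4 * features[:, 1] + rng.normal(0, 0.5, n_items)

    model = LinearRegression().fit(features, difficulty)
    print(model.coef_, model.score(features, difficulty))  # feature weights and R^2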
Peer reviewed
Direct link
Berger, Stéphanie; Verschoor, Angela J.; Eggen, Theo J. H. M.; Moser, Urs – Journal of Educational Measurement, 2019
Calibration of an item bank for computer adaptive testing requires substantial resources. In this study, we investigated whether the efficiency of calibration under the Rasch model could be enhanced by improving the match between item difficulty and student ability. We introduced targeted multistage calibration designs, a design type that…
Descriptors: Simulation, Computer Assisted Testing, Test Items, Difficulty Level
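One way to see why matching item difficulty to student ability improves calibration efficiency under the Rasch model: an item's Fisher information p(1 - p) is largest when ability equals difficulty. The sketch below only illustrates that fact; it is not the targeted multistage design proposed in the study above.

    # Rasch item information peaks where ability (theta) equals difficulty (b),
    # which is why well-targeted items yield more precise calibration.
    import numpy as np

    def rasch_information(theta, b):
        p = 1.0 / (1.0 + np.exp(-(theta - b)))
        return p * (1.0 - p)

    theta = np.linspace(-3, 3, 13)
    print(rasch_information(theta, b=0.0))  # maximal (0.25) at theta == b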
Peer reviewed
PDF on ERIC
Tim Jacobbe; Bob delMas; Brad Hartlaub; Jeff Haberstroh; Catherine Case; Steven Foti; Douglas Whitaker – Numeracy, 2023
The development of assessments as part of the funded LOCUS project is described. The assessments measure students' conceptual understanding of statistics as outlined in the GAISE PreK-12 Framework. Results are reported from a large-scale administration to 3,430 students in grades 6 through 12 in the United States. Items were designed to assess…
Descriptors: Statistics Education, Common Core State Standards, Student Evaluation, Elementary School Students
Ali Türkdogan – Online Submission, 2023
This study was carried out to determine how third-year students in the Department of Elementary Mathematics Education structured "if and only if" propositions. The data were obtained by examining the students' answers to the midterm exam questions and by discussing the solutions with the students in the classroom.…
Descriptors: Mathematics Instruction, Teaching Methods, Difficulty Level, Questioning Techniques
Joshua B. Gilbert; Luke W. Miratrix; Mridul Joshi; Benjamin W. Domingue – Annenberg Institute for School Reform at Brown University, 2024
Analyzing heterogeneous treatment effects (HTE) plays a crucial role in understanding the impacts of educational interventions. A standard practice for HTE analysis is to examine interactions between treatment status and pre-intervention participant characteristics, such as pretest scores, to identify how different groups respond to treatment.…
Descriptors: Causal Models, Item Response Theory, Statistical Inference, Psychometrics
Peer reviewed
Direct link
Arikan, Serkan; Aybek, Eren Can – Educational Measurement: Issues and Practice, 2022
Many scholars have compared various item discrimination indices in real or simulated data. Item discrimination indices, such as the item-total correlation, the item-rest correlation, and the IRT item discrimination parameter, provide information about individual differences among all participants. However, there are tests that aim to select a very limited number…
Descriptors: Monte Carlo Methods, Item Analysis, Correlation, Individual Differences
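As a quick reference for two of the classical indices named in the abstract above, the sketch below computes the item-total and item-rest (corrected item-total) correlations from a simulated 0/1 response matrix; the data are random placeholders.

    # Item-total and item-rest correlations for one item of a simulated
    # persons-by-items 0/1 response matrix.
    import numpy as np

    rng = np.random.default_rng(3)
    responses = rng.integers(0, 2, size=(300, 20))

    def item_total_corr(resp, i):
        return np.corrcoef(resp[:, i], resp.sum(axis=1))[0, 1]

    def item_rest_corr(resp, i):
        rest = resp.sum(axis=1) - resp[:, i]  # total score excluding item i
        return np.corrcoef(resp[:, i], rest)[0, 1]

    print(item_total_corr(responses, 0), item_rest_corr(responses, 0))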