Showing 1,741 to 1,755 of 9,530 results
Peer reviewed
PDF on ERIC Download full text
Sahin, Muhittin; Aydin, Furkan; Sulak, Sema; Müftüoglu, Cennet Terzi; Tepgeç, Mustafa; Yilmaz, Gizem Karaoglan; Yilmaz, Ramazan; Yurdugül, Halil – International Association for Development of the Information Society, 2021
The use of technology for teaching and learning has created a paradigm shift in learning environments and learning processes, and this shift has also affected assessment processes. In addition, online environments provide more opportunities to assess learners. In this study, the Adaptive Mastery Testing (AMT)…
Descriptors: Teaching Methods, Learning Processes, Adaptive Testing, Computer Assisted Testing
Peer reviewed
PDF on ERIC Download full text
Paul J. Walter; Edward Nuhfer; Crisel Suarez – Numeracy, 2021
We introduce an approach for making a quantitative comparison of the item response curves (IRCs) of any two populations on a multiple-choice test instrument. In this study, we employ simulated and actual data. We apply our approach to a dataset of 12,187 participants on the 25-item Science Literacy Concept Inventory (SLCI), which includes ample…
Descriptors: Item Analysis, Multiple Choice Tests, Simulation, Data Analysis
Lina Anaya; Nagore Iriberri; Pedro Rey-Biel; Gema Zamarro – Annenberg Institute for School Reform at Brown University, 2021
Standardized assessments are widely used to determine access to educational resources with important consequences for later economic outcomes in life. However, many design features of the tests themselves may lead to psychological reactions influencing performance. In particular, the level of difficulty of the earlier questions in a test may…
Descriptors: Test Construction, Test Wiseness, Test Items, Difficulty Level
Yanan Feng – ProQuest LLC, 2021
This dissertation aims to investigate the effect size measures of differential item functioning (DIF) detection in the context of cognitive diagnostic models (CDMs). A variety of DIF detection techniques have been developed in the context of CDMs. However, most of the DIF detection procedures focus on the null hypothesis significance test. Few…
Descriptors: Effect Size, Item Response Theory, Cognitive Measurement, Models
Peer reviewed
PDF on ERIC Download full text
Nicky Roberts; Qetelo M. Moloi; Thelma Mort – South African Journal of Childhood Education, 2024
Background: In Initial Teacher Education (ITE) programmes, there are concerns about student teachers' English language proficiency. Aim: To discuss the first iteration of the PrimTEd English language and literacy test and analyse the results for information about the test instrument and about student teacher attainment. Setting: South African…
Descriptors: Curriculum Design, Teacher Education Curriculum, Teacher Education Programs, Program Design
Peer reviewed
Direct link
Schweizer, Karl; Troche, Stefan – Educational and Psychological Measurement, 2018
In confirmatory factor analysis, quite similar measurement models serve to detect the difficulty factor and the factor due to the item-position effect. The item-position effect refers to the increasing dependency among responses to successively presented items of a test, whereas the difficulty factor is ascribed to the wide range of…
Descriptors: Investigations, Difficulty Level, Factor Analysis, Models
Peer reviewed
Direct link
Wang, Benchi; Theeuwes, Jan; Olivers, Christian N. L. – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2018
Evidence shows that visual working memory (VWM) is strongly served by attentional mechanisms, whereas other evidence shows that VWM representations readily survive when attention is being taken away. To reconcile these findings, we tested the hypothesis that directing attention away makes a memory representation vulnerable to interference from the…
Descriptors: Short Term Memory, Interference (Learning), Test Items, Foreign Countries
Peer reviewed
Direct link
Babcock, Sarah E.; Wilson, Claire A.; Lau, Chloe – Canadian Journal of School Psychology, 2018
This article describes and reviews The School Motivation and Learning Strategies Inventory (SMALSI™; Stroud & Reynolds, 2006), published by Western Psychological Services, a self-report inventory designed to assess academic motivation, as well as learning and study strategies. The test identifies 10 primary constructs, referred to broadly as…
Descriptors: Motivation, Measures (Individuals), Test Anxiety, Test Wiseness
Peer reviewed
Direct link
Marek, Keith A.; Raker, Jeffrey R.; Holme, Thomas A.; Murphy, Kristen L. – Journal of Chemical Education, 2018
The American Chemical Society-Examinations Institute (ACS-EI) released the first ACS Foundations of Inorganic Chemistry Examination in 2016 to better address the two generalized types of inorganic chemistry courses (i.e., at the foundation or second/third-year level and the in-depth or third/fourth-year level). The ACS Inorganic Chemistry Exam that…
Descriptors: Science Tests, Inorganic Chemistry, Test Items, Concept Mapping
Peer reviewed
Direct link
Holmes, Stephen D.; Meadows, Michelle; Stockford, Ian; He, Qingping – International Journal of Testing, 2018
The relationship of expected and actual difficulty of items on six mathematics question papers designed for 16-year-olds in England was investigated through paired comparison using experts and through testing with students. A variant of the Rasch model was applied to the comparison data to establish a scale of expected difficulty. In testing, the papers…
Descriptors: Foreign Countries, Secondary School Students, Mathematics Tests, Test Items
Peer reviewed
Direct link
Adedokun, Omolola A. – Journal of Extension, 2018
This article provides an illustrative description of the pre-post difference index (PPDI), a simple, nontechnical yet robust tool for examining the instructional sensitivity of assessment items. Extension educators often design pretest-posttest instruments to assess the impact of their curricula on participants' knowledge and understanding of the…
Descriptors: Extension Education, Extension Agents, Pretests Posttests, Curriculum Evaluation
McBrien, Sarah B. – ProQuest LLC, 2018
The sentiment that there is more work to be done than there is time to do it is pervasive among faculty members at most academic institutions. At health science centers, faculty members often balance teaching responsibilities, clinical loads, and research endeavors. Creative use of educational support staff may provide institutions an avenue for…
Descriptors: Multiple Choice Tests, Test Items, Psychometrics, Item Banks
Peer reviewed
PDF on ERIC Download full text
Teneqexhi, Romeo; Kuneshka, Loreta; Naço, Adrian – International Association for Development of the Information Society, 2018
Organizing exams or competitions with multiple-choice questions and technology-based assessment is today common practice in many educational institutions around the world. These kinds of exams or tests are, as a rule, completed by answering questions on a so-called answer sheet form. In this form, each student or participant in the exam is obliged to…
Descriptors: Foreign Countries, Competition, Multiple Choice Tests, Computer Assisted Testing
Peer reviewed
PDF on ERIC Download full text
Herrmann-Abell, Cari F.; Hardcastle, Joseph; DeBoer, George E. – Grantee Submission, 2018
This paper describes the development and validation of a set of three assessment instruments that can be used to assess students' progress on the energy concept (ASPECt) from fourth through twelfth grade. Rasch analysis techniques were used throughout the development process to guide the construction of an item bank and the selection of items for…
Descriptors: Energy, Test Content, Test Items, Program Validation
Peer reviewed
Direct link
Halpern-Manners, Andrew; Warren, John Robert; Torche, Florencia – Sociological Methods & Research, 2017
Does participation in one wave of a survey have an effect on respondents' answers to questions in subsequent waves? In this article, we investigate the presence and magnitude of "panel conditioning" effects in one of the most frequently used data sets in the social sciences: the General Social Survey (GSS). Using longitudinal records…
Descriptors: Surveys, Participation, Conditioning, Test Wiseness