Showing 1 to 15 of 56 results
Peer reviewed
Cui, Ying; Chen, Fu; Lutsyk, Alina; Leighton, Jacqueline P.; Cutumisu, Maria – Assessment in Education: Principles, Policy & Practice, 2023
With the exponential increase in the volume of data available in the 21st century, data literacy skills have become vitally important in work places and everyday life. This paper provides a systematic review of available data literacy assessments targeted at different audiences and educational levels. The results can help researchers and…
Descriptors: Data, Information Literacy, 21st Century Skills, Competence
Peer reviewed
Celeste Combrinck – SAGE Open, 2024
We have less time and focus than ever before, while the demand for attention is increasing. Therefore, it is no surprise that when answering questionnaires, we often choose to strongly agree or be neutral, producing problematic and unusable data. The current study investigated forced-choice (ipsative) format compared to the same questions on a…
Descriptors: Likert Scales, Test Format, Surveys, Design
Peer reviewed
Jang, Jung Un; Kim, Eun Joo – Journal of Curriculum and Teaching, 2022
This study examines the validity of pen-and-paper and smart-device-based versions of the optician's examination. The questions developed for each medium were based on the national optician's simulation test. The subjects of this study were 60 students enrolled in E University. Data analysis was performed to verify the equivalence of the two…
Descriptors: Optometry, Licensing Examinations (Professions), Test Format, Test Validity
Peer reviewed
Duru, Erdinc; Ozgungor, Sevgi; Yildirim, Ozen; Duatepe-Paksu, Asuman; Duru, Sibel – International Journal of Assessment Tools in Education, 2022
The aim of this study is to develop a valid and reliable measurement tool for the critical thinking skills of university students. The Pamukkale Critical Thinking Skills Scale was developed in two separate forms: multiple-choice and open-ended. The validity and reliability studies of the multiple-choice form were constructed on two different…
Descriptors: Critical Thinking, Cognitive Measurement, Test Validity, Test Reliability
Crystal Uminski – ProQuest LLC, 2023
The landscape of undergraduate biology education has been shaped by decades of reform efforts calling for instruction to integrate core concepts and scientific skills as a means of helping students become proficient in the discipline. Assessments can be used to make inferences about how these reform efforts have translated into changes in…
Descriptors: Undergraduate Students, Biology, Science Instruction, Science Tests
Peer reviewed
Stefan O'Grady – International Journal of Listening, 2025
Language assessment is increasingly computer-mediated. This development presents opportunities in the form of new task formats, and equally a need for renewed scrutiny of established conventions. Recent recommendations to increase integrated skills assessment in lecture comprehension tests are premised on empirical research that demonstrates enhanced construct…
Descriptors: Language Tests, Lecture Method, Listening Comprehension Tests, Multiple Choice Tests
Peer reviewed
David Bell; Vikki O'Neill; Vivienne Crawford – Practitioner Research in Higher Education, 2023
We compared the influence of an open-book, extended-duration format versus a closed-book, time-limited format on the reliability and validity of written assessments of pharmacology learning outcomes within our medical and dental courses. Our dental cohort undertakes a mid-year test (30 x free-response short answer to a question, SAQ) and an end-of-year paper (4 x SAQ,…
Descriptors: Undergraduate Students, Pharmacology, Pharmaceutical Education, Test Format
Peer reviewed
Simic, Nataša; Marušic Jablanovic, Milica; Grbic, Sanja – Journal of Education for Teaching: International Research and Pedagogy, 2022
The aim of this study was to validate the structure of the "FIT-Choice scale" on a Serbian sample of pre-service teachers, as well as to determine the motivations and beliefs about the teaching profession, and test if motivation differs across different groups of pre-service teachers. After prospective class and subject teachers…
Descriptors: Foreign Countries, Likert Scales, Factor Structure, Factor Analysis
Peer reviewed
Fitria Lafifa; Dadan Rosana – Turkish Online Journal of Distance Education, 2024
This research aims to develop a multiple-choice closed-ended test to assess and evaluate students' digital literacy skills. The sample in this study consisted of students at MTsN 1 Blitar City, selected using a purposive sampling technique. The test was also validated by experts, namely 2 Doctors of Physics and Science from Yogyakarta State…
Descriptors: Educational Innovation, Student Evaluation, Digital Literacy, Multiple Choice Tests
Peer reviewed
Calderón Carvajal, Carlos; Ximénez Gómez, Carmen; Lay-Lisboa, Siu; Briceño, Mauricio – Journal of Psychoeducational Assessment, 2021
Kolb's Learning Style Inventory (LSI) continues to generate a great debate among researchers, given the contradictory evidence resulting from its psychometric properties. One primary criticism focuses on the artificiality of the results derived from its internal structure because of the ipsative nature of the forced-choice format. This study seeks…
Descriptors: Factor Structure, Psychometrics, Test Format, Test Validity
Peer reviewed
Yangqiuting Li; Chandralekha Singh – Physical Review Physics Education Research, 2025
Research-based multiple-choice questions implemented in class with peer instruction have been shown to be an effective tool for improving students' engagement and learning outcomes. Moreover, multiple-choice questions that are carefully sequenced to build on each other can be particularly helpful for students to develop a systematic understanding…
Descriptors: Physics, Science Instruction, Science Tests, Multiple Choice Tests
Peer reviewed
Wicaksono, Azizul Ghofar Candra; Korom, Erzsébet – Participatory Educational Research, 2022
The accuracy of learning results relies on evaluation and assessment. Learning goals, including problem-solving ability, must be aligned with valid, standardized measurement tools. A study exploring the nature of problem solving, its framework, and its assessment in the Indonesian context will contribute to problem solving…
Descriptors: Problem Solving, Educational Research, Test Construction, Test Validity
Peer reviewed
Papenberg, Martin; Diedenhofen, Birk; Musch, Jochen – Journal of Experimental Education, 2021
Testwiseness may introduce construct-irrelevant variance to multiple-choice test scores. Presenting response options sequentially has been proposed as a potential solution to this problem. In an experimental validation, we determined the psychometric properties of a test based on the sequential presentation of response options. We created a strong…
Descriptors: Test Wiseness, Test Validity, Test Reliability, Multiple Choice Tests
Peer reviewed
Walsh, Cole; Quinn, Katherine N.; Wieman, C.; Holmes, N. G. – Physical Review Physics Education Research, 2019
Introductory physics lab instruction is undergoing a transformation, with increasing emphasis on developing experimentation and critical thinking skills. These changes present a need for standardized assessment instruments to determine the degree to which students develop these skills through instructional labs. In this article, we present the…
Descriptors: Critical Thinking, Physics, Cognitive Tests, Science Experiments
Peer reviewed
Karakolidis, Anastasios; O'Leary, Michael; Scully, Darina – International Journal of Testing, 2021
The linguistic complexity of many text-based tests can be a source of construct-irrelevant variance, as test-takers' performance may be affected by factors that are beyond the focus of the assessment itself, such as reading comprehension skills. This experimental study examined the extent to which the use of animated videos, as opposed to written…
Descriptors: Animation, Vignettes, Video Technology, Test Format