Showing 211 to 225 of 3,123 results
Peer reviewed
Lawrence T. DeCarlo – Educational and Psychological Measurement, 2024
A psychological framework for different types of items commonly used with mixed-format exams is proposed. A choice model based on signal detection theory (SDT) is used for multiple-choice (MC) items, whereas an item response theory (IRT) model is used for open-ended (OE) items. The SDT and IRT models are shown to share a common conceptualization…
Descriptors: Test Format, Multiple Choice Tests, Item Response Theory, Models
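Note: as a rough illustration of the pairing described above, the sketch below contrasts a standard two-parameter logistic (2PL) IRT item with the classic equal-variance Gaussian SDT model for choosing among m alternatives. It is a minimal sketch under those textbook assumptions, not DeCarlo's actual parameterization, and all parameter values are invented.

```python
# Illustrative only: a standard 2PL IRT model for an open-ended item and the
# classic equal-variance Gaussian SDT model for an m-alternative choice item.
# The article's exact models are not reproduced here.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def p_correct_2pl(theta, a, b):
    """2PL IRT probability of a correct open-ended response."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def p_correct_sdt(d_prime, m=4):
    """SDT probability of picking the correct option among m alternatives:
    the correct option's strength ~ N(d', 1) must exceed m-1 distractors ~ N(0, 1)."""
    integrand = lambda x: norm.pdf(x - d_prime) * norm.cdf(x) ** (m - 1)
    value, _ = quad(integrand, -np.inf, np.inf)
    return value

print(p_correct_2pl(theta=0.5, a=1.2, b=0.0))  # ~0.65
print(p_correct_sdt(d_prime=1.0, m=4))         # ~0.51
```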
Peer reviewed
Hryvko, Antonina V.; Zhuk, Yurii O. – Journal of Curriculum and Teaching, 2022
A feature of the presented study is its comprehensive approach to examining the reliability of language testing results, which is affected by several functional and variable factors. Contradictory and ambiguous views among researchers on these issues underscore the relevance of this study. The article highlights the problem of equivalence…
Descriptors: Student Evaluation, Language Tests, Test Format, Test Items
Peer reviewed
Araneda, Sergio; Lee, Dukjae; Lewis, Jennifer; Sireci, Stephen G.; Moon, Jung Aa; Lehman, Blair; Arslan, Burcu; Keehner, Madeleine – Education Sciences, 2022
Students exhibit many behaviors when responding to items on a computer-based test, but only some of these behaviors are relevant to estimating their proficiencies. In this study, we analyzed data from computer-based math achievement tests administered to elementary school students in grades 3 (ages 8-9) and 4 (ages 9-10). We investigated students'…
Descriptors: Student Behavior, Academic Achievement, Computer Assisted Testing, Mathematics Achievement
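Note: the abstract's point that only some test-taking behaviors bear on proficiency is often operationalized in computer-based testing research by flagging rapid-guessing responses with a response-time threshold. The sketch below illustrates that general idea only; the behaviors, threshold, and column names are assumptions, not the study's actual method.

```python
# Hypothetical sketch: flag rapid-guessing responses in test log data using a
# simple response-time threshold, then score only the remaining responses.
import pandas as pd

log = pd.DataFrame({
    "student_id": [101, 101, 102, 102],
    "item_id": ["M1", "M2", "M1", "M2"],
    "response_time_sec": [42.0, 3.1, 18.5, 27.2],
    "correct": [1, 0, 1, 1],
})

THRESHOLD_SEC = 5.0  # responses faster than this are treated as rapid guesses
log["rapid_guess"] = log["response_time_sec"] < THRESHOLD_SEC

# Keep only proficiency-relevant responses before estimating performance.
valid = log[~log["rapid_guess"]]
print(valid.groupby("student_id")["correct"].mean())
```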
Peer reviewed
Betts, Joe; Muntean, William; Kim, Doyoung; Kao, Shu-chuan – Educational and Psychological Measurement, 2022
The multiple response structure can underlie several different technology-enhanced item types. With the increased use of computer-based testing, multiple response items are becoming more common. This response type holds the potential for being scored polytomously for partial credit. However, there are several possible methods for computing raw…
Descriptors: Scoring, Test Items, Test Format, Raw Scores
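Note: to make the "several possible methods for computing raw scores" concrete, the sketch below shows two commonly discussed rules for multiple-response items: all-or-nothing dichotomous scoring and a partial-credit rule that penalizes incorrect selections. These are illustrative conventions, not necessarily the methods compared in the article.

```python
# Illustrative raw-scoring rules for a multiple-response item.

def score_dichotomous(selected, key):
    """All-or-nothing: 1 only if the selected set exactly matches the key."""
    return 1 if set(selected) == set(key) else 0

def score_partial_credit(selected, key):
    """Partial credit: +1 per correctly selected key option, -1 per incorrectly
    selected distractor, floored at 0 and rescaled to the [0, 1] range."""
    selected, key = set(selected), set(key)
    raw = len(selected & key) - len(selected - key)
    return max(raw, 0) / len(key)

key = {"A", "C", "D"}
print(score_dichotomous({"A", "C"}, key))              # 0
print(score_partial_credit({"A", "C"}, key))           # ~0.67
print(score_partial_credit({"A", "C", "B"}, key))      # ~0.33
```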
Peer reviewed
Guo, Hongwen; Rios, Joseph A.; Ling, Guangming; Wang, Zhen; Gu, Lin; Yang, Zhitong; Liu, Lydia O. – ETS Research Report Series, 2022
Different variants of the selected-response (SR) item type have been developed for various reasons (e.g., simulating realistic situations, examining critical-thinking and/or problem-solving skills). Generally, variants of the SR item format are more complex than traditional multiple-choice (MC) items, which may be more challenging to test…
Descriptors: Test Format, Test Wiseness, Test Items, Item Response Theory
Peer reviewed
Tam, Angela Choi Fung – Assessment & Evaluation in Higher Education, 2022
Students' perceptions of and learning practices for online timed take-home examinations, and the factors affecting those practices during COVID-19, have remained largely unexplored. Nine students from arts, business and science sub-degree programmes participated in this study. Semi-structured interviews and reflective journals were…
Descriptors: Foreign Countries, Two Year College Students, Student Attitudes, COVID-19
Peer reviewed
Steedle, Jeffrey T.; Cho, Young Woo; Wang, Shichao; Arthur, Ann M.; Li, Dongmei – Educational Measurement: Issues and Practice, 2022
As testing programs transition from paper to online testing, they must study mode comparability to support the exchangeability of scores from different testing modes. To that end, a series of three mode comparability studies was conducted during the 2019-2020 academic year with examinees randomly assigned to take the ACT college admissions exam on…
Descriptors: College Entrance Examinations, Computer Assisted Testing, Scores, Test Format
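Note: under random assignment, mode comparability is commonly summarized by comparing score distributions across modes, for example with a standardized mean difference and a two-sample test. The sketch below uses simulated data to show that generic check; it does not reproduce the analyses or results of the ACT studies.

```python
# Hypothetical mode-comparability check with randomly assigned groups:
# compare online vs. paper score distributions (simulated data only).
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
online = rng.normal(loc=20.8, scale=5.4, size=1500)  # simulated composite scores
paper = rng.normal(loc=21.0, scale=5.3, size=1500)

pooled_sd = np.sqrt((online.var(ddof=1) + paper.var(ddof=1)) / 2)
smd = (online.mean() - paper.mean()) / pooled_sd  # Cohen's d-style effect size

t_stat, p_value = ttest_ind(online, paper, equal_var=False)
print(f"standardized mean difference = {smd:.3f}, p = {p_value:.3f}")
```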
Andrea Cedeno – ProQuest LLC, 2022
The relationship between computer-based and paper-based testing may vary, and students with special needs may or may not perform better on computer-based tests than on paper-based tests. Over the past few decades, computers and technology have become increasingly common in society and in classrooms, and the use of technology has increased during the era of the…
Descriptors: Computer Assisted Testing, Test Format, Special Needs Students, Reading Comprehension
Edwin Ambrosio – ProQuest LLC, 2022
Assessments are some of the most common tools used to evaluate student learning. While exams have always been part of evaluating how well students learn and retain information, the most effective way to administer them has long been debated. However, remarkably few studies have compared online and paper testing, and even fewer have examined…
Descriptors: Computer Science Education, Computer Assisted Testing, Test Format, Performance
Peer reviewed
Smyth, Jolene D.; Israel, Glenn D.; Newberry, Milton G.; Hull, Richard G. – Field Methods, 2019
Considerable research has examined the effect of response option order in ordinal bipolar questions such as satisfaction questions. However, no research we know of has examined the effect of the order of presentation of concepts in the question stem or whether stem order moderates response option order. In this article, we experimentally test the…
Descriptors: Satisfaction, Responses, Test Items, Attitude Measures
Steven R. Hiner – ProQuest LLC, 2023
The purpose of this study was to determine whether there were statistically significant differences between scores on constructed-response and computer-scorable questions on an accelerated middle school mathematics placement test in a large urban school district in Ohio, and to ensure that all students have an opportunity to take the test. Five questions on a…
Descriptors: Scores, Middle Schools, Mathematics Tests, Placement Tests
Peer reviewed
Abdullah Al Fraidan; Meznah Saud Abdulaziz Alsubaie – Educational Process: International Journal, 2025
Background: This study examines the effect of test anxiety on the academic performance of postgraduate female students, focusing on their perceptions and experiences in open-book exams (OBE) and closed-book exams (CBE). Method: A qualitative case study design was employed using the Thinking Aloud Protocol (TAP) to collect data from five Saudi…
Descriptors: Test Anxiety, Vocabulary, Females, Books
Peer reviewed
Herzing, Jessica M. E. – International Journal of Social Research Methodology, 2020
This study addresses questionnaire design challenges that arise when questions involve a large number of response options. Traditionally, these long-list questions are asked in open-ended or closed-ended formats. However, alternative interface design options that combine both formats are emerging in computer-assisted surveys…
Descriptors: Foreign Countries, Questionnaires, Online Surveys, Test Format
Peer reviewed
Gurdil Ege, Hatice; Demir, Ergul – Eurasian Journal of Educational Research, 2020
Purpose: The present study aims to evaluate how reliabilities computed using the α, Stratified α, Angoff-Feldt, and Feldt-Raju estimators may differ when the sample size (500, 1000, and 2000) and the ratio of dichotomous to polytomous items in the scale (2:1, 1:1, 1:2) are varied. Research Methods: In this study, Cronbach's α,…
Descriptors: Test Format, Simulation, Test Reliability, Sample Size
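Note: the sketch below shows how coefficient α and stratified α can be computed from an item-score matrix, with dichotomous and polytomous items treated as separate strata. The data are simulated here for illustration only and do not reproduce the article's simulation design or its Angoff-Feldt and Feldt-Raju estimators.

```python
# Minimal sketch of coefficient alpha and stratified alpha from an item-score
# matrix (rows = examinees, columns = items). Made-up data for illustration.
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def stratified_alpha(strata):
    """strata: list of item-score matrices, one per stratum (e.g., dichotomous
    items in one stratum, polytomous items in another)."""
    total_var = np.hstack(strata).sum(axis=1).var(ddof=1)
    penalty = sum(s.sum(axis=1).var(ddof=1) * (1 - cronbach_alpha(s)) for s in strata)
    return 1 - penalty / total_var

rng = np.random.default_rng(1)
theta = rng.normal(size=500)
dich = (theta[:, None] + rng.normal(size=(500, 6)) > 0).astype(float)          # 6 dichotomous items
poly = np.clip(np.round(theta[:, None] + rng.normal(size=(500, 3))) + 2, 0, 4)  # 3 polytomous items
print(cronbach_alpha(np.hstack([dich, poly])))
print(stratified_alpha([dich, poly]))
```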
Peer reviewed
Hiller, Sara; Rumann, Stefan; Berthold, Kirsten; Roelle, Julian – Instructional Science: An International Journal of the Learning Sciences, 2020
In learning from examples, students are often first provided with basic instructional explanations of new principles and concepts and second with examples thereof. In this sequence, it is important that learners self-explain by generating links between the basic instructional explanations' content and the examples. Therefore, it is well…
Descriptors: Problem Solving, Test Format, Prompting, Learning Strategies