| Publication Date | Records |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 9 |
| Since 2022 (last 5 years) | 50 |
| Since 2017 (last 10 years) | 103 |
| Since 2007 (last 20 years) | 160 |
| Author | Records |
| --- | --- |
| Plake, Barbara S. | 7 |
| Huntley, Renee M. | 5 |
| Tollefson, Nona | 4 |
| Wainer, Howard | 4 |
| Baghaei, Purya | 3 |
| Bennett, Randy Elliot | 3 |
| Halpin, Glennelle | 3 |
| Katz, Irvin R. | 3 |
| Lunz, Mary E. | 3 |
| Allen, Nancy L. | 2 |
| Anderson, Paul S. | 2 |
| Audience | Records |
| --- | --- |
| Researchers | 8 |
| Policymakers | 1 |
| Practitioners | 1 |
| Teachers | 1 |
| Location | Records |
| --- | --- |
| Germany | 8 |
| Turkey | 8 |
| Australia | 5 |
| China | 4 |
| Indonesia | 4 |
| Iran | 4 |
| United Kingdom (England) | 4 |
| Canada | 3 |
| Japan | 3 |
| Netherlands | 3 |
| Taiwan | 3 |
| Laws, Policies, & Programs | Records |
| --- | --- |
| Pell Grant Program | 1 |
Harrison, Scott; Kroehne, Ulf; Goldhammer, Frank; Lüdtke, Oliver; Robitzsch, Alexander – Large-scale Assessments in Education, 2023
Background: Mode effects, the variations in item and scale properties attributed to the mode of test administration (paper vs. computer), have stimulated research on test equivalence and trend estimation in PISA. The PISA assessment framework provides the backbone for interpreting PISA test scores. However, an…
Descriptors: Scoring, Test Items, Difficulty Level, Foreign Countries
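The mode effects described in the entry above (item properties shifting between paper and computer administration) can be illustrated with a simple screen that compares per-item difficulty estimates across the two modes. This is a minimal sketch on assumed, simulated data; the logit-of-proportion-correct difficulty and the flagging threshold are illustrative choices, not the analysis used in the article.

```python
import numpy as np

def item_difficulty_logits(responses):
    """Convert per-item proportion-correct into a logit difficulty.

    responses: 2D array of 0/1 item scores, shape (examinees, items).
    Higher values indicate harder items.
    """
    p = responses.mean(axis=0).clip(0.01, 0.99)  # guard against infinite logits
    return -np.log(p / (1 - p))

# Illustrative simulated data: the same 20 items administered on paper and on
# computer to independent groups of 500 examinees each.
rng = np.random.default_rng(0)
paper = (rng.random((500, 20)) < 0.65).astype(int)
computer = (rng.random((500, 20)) < 0.60).astype(int)

# Per-item difficulty shift between modes; a consistent shift suggests a mode effect.
delta = item_difficulty_logits(computer) - item_difficulty_logits(paper)
flagged = np.where(np.abs(delta) > 0.3)[0]  # arbitrary threshold, for illustration only

print("Mean difficulty shift (logits):", round(float(delta.mean()), 3))
print("Items flagged for a possible mode effect:", flagged.tolist())
```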
Lisa A. Bonner; Jalisa H. Ferguson – Journal of College Science Teaching, 2025
Condensed courses are commonly used across academia as a way for students to catch up on or get ahead in their curriculum. Instructors may be hesitant to teach such courses due to concerns about limited knowledge retention and reduced academic rigor. We sought to determine whether there are differences in changes in student attitudes before and…
Descriptors: Organic Chemistry, Science Instruction, Barriers, Academic Standards
Shen, Jing; Wu, Jingwei – Journal of Speech, Language, and Hearing Research, 2022
Purpose: This study examined the performance difference between remote and in-laboratory test modalities on a speech recognition in noise task in older and younger adults. Method: Four groups of participants (younger remote, younger in-laboratory, older remote, and older in-laboratory) were tested on a speech recognition in noise protocol with…
Descriptors: Age Differences, Test Format, Computer Assisted Testing, Auditory Perception
Gruss, Richard; Clemons, Josh – Journal of Computer Assisted Learning, 2023
Background: The sudden growth in online instruction due to COVID-19 restrictions has given renewed urgency to questions about remote learning that have remained unresolved. Web-based assessment software provides instructors with an array of options for varying testing parameters, but the pedagogical impacts of some of these variations have yet to be…
Descriptors: Test Items, Test Format, Computer Assisted Testing, Mathematics Tests
Gustafsson, Martin; Barakat, Bilal Fouad – Comparative Education Review, 2023
International assessments inform education policy debates, yet little is known about their floor effects: To what extent do they fail to differentiate between the lowest performers, and what are the implications of this? TIMSS, SACMEQ, and LLECE data are analyzed to answer this question. In TIMSS, floor effects have been reduced through the…
Descriptors: Achievement Tests, Elementary Secondary Education, International Assessment, Foreign Countries
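One simple way to make the floor-effect question in the entry above concrete is to check how many examinees score at or below the level expected from guessing alone, since such scores say little about ability. The sketch below uses simulated data and a chance-score cutoff as illustrative assumptions; the analyses of TIMSS, SACMEQ, and LLECE in the article are more involved.

```python
import numpy as np

def floor_rate(raw_scores, n_items, n_options):
    """Share of examinees scoring at or below the score expected from guessing."""
    chance_score = n_items / n_options  # expected raw score under pure guessing
    return float(np.mean(raw_scores <= chance_score))

# Illustrative simulated raw scores for a 40-item, 4-option multiple-choice test
# taken by 2,000 examinees of heterogeneous proficiency.
rng = np.random.default_rng(1)
raw_scores = rng.binomial(n=40, p=rng.beta(a=2.0, b=3.0, size=2000))

rate = floor_rate(raw_scores, n_items=40, n_options=4)
print(f"Proportion at or below the chance score: {rate:.1%}")
# A large share at or below chance means the test cannot distinguish among the
# lowest performers, i.e., the floor effect discussed in the abstract above.
```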
Pengelley, James; Whipp, Peter R.; Rovis-Hermann, Nina – Educational Psychology Review, 2023
The aim of the present study is to reconcile previous findings (a) that testing mode has no effect on test outcomes or cognitive load (Comput Hum Behav 77:1-10, 2017) and (b) that younger learners' working memory processes are more sensitive to computer-based test formats (J Psychoeduc Assess 37(3):382-394, 2019). We addressed key methodological…
Descriptors: Scores, Cognitive Processes, Difficulty Level, Secondary School Students
van den Broek, Gesa S. E.; Gerritsen, Suzanne L.; Oomen, Iris T. J.; Velthoven, Eva; van Boxtel, Femke H. J.; Kester, Liesbeth; van Gog, Tamara – Journal of Educational Psychology, 2023
Multiple-choice questions (MCQs) are popular in vocabulary software because they can be scored automatically and are compatible with many input devices (e.g., touchscreens). Answering MCQs is beneficial for learning, especially when learners retrieve knowledge from memory to evaluate plausible answer alternatives. However, such retrieval may not…
Descriptors: Multiple Choice Tests, Vocabulary Development, Test Format, Cues
Cronin, Sean D. – ProQuest LLC, 2023
This convergent parallel mixed-methods study used qualitative and quantitative content analysis to identify the type of thinking required by the College and Career Readiness Assessment (CCRA+) by (a) determining the frequency and percentage of questions categorized as higher-level thinking within each cell of Hess'…
Descriptors: Cues, College Readiness, Career Readiness, Test Items
Mimi Ismail; Ahmed Al-Badri; Said Al-Senaidi – Journal of Education and e-Learning Research, 2025
This study aimed to identify differences in individuals' abilities, their standard errors, and the psychometric properties of the test across the two modes of administering it (electronic and paper). A descriptive approach was used to achieve the study's objectives. The study sample consisted of 74 male and female students at the…
Descriptors: Achievement Tests, Computer Assisted Testing, Psychometrics, Item Response Theory
Hryvko, Antonina V.; Zhuk, Yurii O. – Journal of Curriculum and Teaching, 2022
A distinctive feature of this study is its comprehensive approach to the reliability of language-testing results, which is affected by several functional and variable factors. Contradictory and ambiguous views among researchers on these issues underscore the relevance of the study. The article highlights the problem of equivalence…
Descriptors: Student Evaluation, Language Tests, Test Format, Test Items
Betts, Joe; Muntean, William; Kim, Doyoung; Kao, Shu-chuan – Educational and Psychological Measurement, 2022
The multiple response structure can underlie several different technology-enhanced item types. With the increased use of computer-based testing, multiple response items are becoming more common. This response type can be scored polytomously for partial credit. However, there are several possible methods for computing raw…
Descriptors: Scoring, Test Items, Test Format, Raw Scores
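The entry above notes that multiple-response items can be scored polytomously for partial credit and that several raw-score rules are possible. The sketch below contrasts two commonly discussed rules, all-or-nothing scoring versus per-selection partial credit with a penalty for incorrect selections; these rules are illustrative assumptions, not necessarily the methods compared in the article.

```python
def score_all_or_nothing(selected, key):
    """Dichotomous rule: 1 point only if the selected options exactly match the key."""
    return 1 if set(selected) == set(key) else 0

def score_partial_credit(selected, key, penalty=1):
    """Polytomous rule: +1 per correct selection, -penalty per incorrect one, floored at 0."""
    selected, key = set(selected), set(key)
    return max(len(selected & key) - penalty * len(selected - key), 0)

# Example: the key is options B, C, and E; the examinee marks B, C, and D.
key = ["B", "C", "E"]
response = ["B", "C", "D"]
print(score_all_or_nothing(response, key))  # 0 (not an exact match)
print(score_partial_credit(response, key))  # 2 correct - 1 incorrect = 1
```

How the raw rule is chosen matters because the two rules rank the same responses differently, which in turn changes the item's score distribution and any downstream IRT calibration.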
Guo, Hongwen; Rios, Joseph A.; Ling, Guangming; Wang, Zhen; Gu, Lin; Yang, Zhitong; Liu, Lydia O. – ETS Research Report Series, 2022
Different variants of the selected-response (SR) item type have been developed for various reasons (i.e., simulating realistic situations, examining critical-thinking and/or problem-solving skills). Generally, the variants of SR item format are more complex than the traditional multiple-choice (MC) items, which may be more challenging to test…
Descriptors: Test Format, Test Wiseness, Test Items, Item Response Theory
Tam, Angela Choi Fung – Assessment & Evaluation in Higher Education, 2022
Students' perceptions and learning practices regarding online timed take-home examinations, and the factors affecting those practices during COVID-19, have largely been unexplored. Nine students from arts, business and science sub-degree programmes participated in this study. Semi-structured interviews and reflective journals were…
Descriptors: Foreign Countries, Two Year College Students, Student Attitudes, COVID-19
Abdullah Al Fraidan; Meznah Saud Abdulaziz Alsubaie – Educational Process: International Journal, 2025
Background: This study examines the effect of test anxiety on the academic performance of postgraduate female students, focusing on their perceptions and experiences in open-book exams (OBE) and closed-book exams (CBE). Method: A qualitative case study design was employed using the Thinking Aloud Protocol (TAP) to collect data from five Saudi…
Descriptors: Test Anxiety, Vocabulary, Females, Books
Kárász, Judit T.; Széll, Krisztián; Takács, Szabolcs – Quality Assurance in Education: An International Perspective, 2023
Purpose: Based on the general formula, which depends on the length and difficulty of the test, the number of respondents and the number of ability levels, this study aims to provide a closed formula for adaptive tests of medium difficulty (probability of solution p = 1/2) to determine the accuracy of the parameters for each item and in…
Descriptors: Test Length, Probability, Comparative Analysis, Difficulty Level
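The "medium difficulty (p = 1/2)" condition in the entry above is the point at which a dichotomous item carries maximum Fisher information under a Rasch-type model, which is what makes closed-form precision results tractable there. The following is a minimal illustration of that reasoning, not the authors' closed formula, which is not reproduced here:

```latex
% Information of a dichotomous Rasch item answered correctly with probability p:
% I(p) = p(1 - p), which is maximized at p = 1/2, giving I = 1/4.
\[
  I(p) = p(1 - p), \qquad \max_{p} I(p) = I\!\left(\tfrac{1}{2}\right) = \tfrac{1}{4}.
\]
% For a test of L such items, all targeted at the examinee's level, the standard
% error of the ability estimate follows from the summed item information:
\[
  \operatorname{SE}\!\left(\hat{\theta}\right)
  = \frac{1}{\sqrt{\sum_{i=1}^{L} I_i}}
  = \frac{1}{\sqrt{L/4}}
  = \frac{2}{\sqrt{L}} .
\]
```

This is also why adaptive algorithms select items the examinee solves with probability near 1/2: precision grows fastest per administered item at that difficulty.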

