Showing all 10 results
Peer reviewed
Direct link
Gruss, Richard; Clemons, Josh – Journal of Computer Assisted Learning, 2023
Background: The sudden growth in online instruction due to COVID-19 restrictions has given renewed urgency to questions about remote learning that have remained unresolved. Web-based assessment software provides instructors with an array of options for varying testing parameters, but the pedagogical impact of some of these variations has yet to be…
Descriptors: Test Items, Test Format, Computer Assisted Testing, Mathematics Tests
Peer reviewed
PDF on ERIC Download full text
Esin Yilmaz Kogar; Sumeyra Soysal – International Journal of Assessment Tools in Education, 2023
This paper aims to evaluate different aspects of students' response times to mathematics test items and their test effort as an indicator of test motivation, with the help of several variables at the item and student levels. The data consist of 4th-grade Singaporean and Turkish students who participated in TIMSS 2019. Response time…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Mathematics Achievement
Peer reviewed
Direct link
Perkins, Beth A.; Satkus, Paulius; Finney, Sara J. – Journal of Psychoeducational Assessment, 2020
Few studies have examined the psychometric properties of the test-related items from the Achievement Emotions Questionnaire (AEQ). Using a sample of 955 university students, we examined the factor structure of 12 emotion items measuring test-related anger, boredom, enjoyment, and pride. Results indicated the four emotions were distinct, allowing…
Descriptors: Affective Measures, Questionnaires, Psychometrics, Test Items
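
As a rough illustration of the factor-analytic step described in this abstract, the sketch below fits a four-factor model to 12 emotion items in Python. It is a minimal sketch, not the authors' analysis: the factor_analyzer package, the simulated item data, and the column names are all assumptions made here for illustration.

    # Minimal sketch: exploratory 4-factor analysis of 12 test-emotion items.
    # The item data are simulated (3 items per emotion); the real AEQ items are
    # not reproduced here.
    import numpy as np
    import pandas as pd
    from factor_analyzer import FactorAnalyzer

    rng = np.random.default_rng(4)
    n = 500
    factors = rng.normal(0, 1, (n, 4))                # anger, boredom, enjoyment, pride
    loadmat = np.zeros((12, 4))
    for j in range(12):
        loadmat[j, j // 3] = 0.7                      # each item loads on one emotion
    items = factors @ loadmat.T + rng.normal(0, 0.5, (n, 12))
    items = pd.DataFrame(items, columns=[f"item{j+1:02d}" for j in range(12)])

    fa = FactorAnalyzer(n_factors=4, rotation="oblimin")   # oblique: factors may correlate
    fa.fit(items)
    print(pd.DataFrame(fa.loadings_, index=items.columns).round(2))   # which items load where
    print(fa.get_factor_variance())                   # variance explained per factor

A confirmatory model would be closer to how factor structure is usually tested in this literature; the exploratory fit above is only meant to show the shape of the data.
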
Peer reviewed
PDF on ERIC Download full text
Ilgun Dibek, Munevver – International Journal of Educational Methodology, 2021
Response times are an important source of information about how individuals perform during a test. The main purpose of this study is to show that survival models can be applied to educational data. Accordingly, data sets of items measuring the literacy, numeracy, and problem-solving skills of the countries participating…
Descriptors: Reaction Time, Test Items, Adults, Foreign Countries
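
To make the idea of survival analysis on response times concrete, here is a minimal sketch using the lifelines package on simulated data. The covariates, the 120-second censoring point, and the data itself are illustrative assumptions, not the data analysed in the study.

    # Minimal sketch: a Cox proportional hazards model treating time-to-response
    # as the survival time. All data below are simulated for illustration.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 300
    difficulty = rng.normal(0, 1, n)                  # standardized item difficulty
    ability = rng.normal(0, 1, n)                     # standardized examinee ability
    # Toy assumption: harder items take longer; more able examinees answer faster.
    response_time = rng.exponential(30 * np.exp(0.5 * difficulty - 0.3 * ability))
    answered = (response_time < 120).astype(int)      # 0 = censored at the 120 s limit
    response_time = np.minimum(response_time, 120)

    data = pd.DataFrame({"response_time": response_time, "answered": answered,
                         "difficulty": difficulty, "ability": ability})

    cph = CoxPHFitter()
    cph.fit(data, duration_col="response_time", event_col="answered")
    cph.print_summary()   # hazard ratio > 1 means faster responding as the covariate increases

The appeal of this framing is that censored response times (items never answered before the time limit) are handled directly instead of being discarded.
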
Peer reviewed
Direct link
Pastor, Dena A.; Ong, Thai Q.; Strickman, Scott N. – Educational Assessment, 2019
The trustworthiness of low-stakes assessment results largely depends on examinee effort, which can be measured by the amount of time examinees devote to items using solution behavior (SB) indices. Because SB indices are calculated for each item, they can be used to understand how examinee motivation changes across items within a test. Latent class…
Descriptors: Behavior Patterns, Test Items, Time, Response Style (Tests)
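
Solution behavior indices are usually computed by comparing each response time against an item-specific threshold; the toy sketch below shows the arithmetic. The thresholds, the data, and the derived response-time-effort score are illustrative, not those used in the study.

    # Minimal sketch: item-level solution behavior (SB) flags and a per-examinee
    # response-time effort score. Times and thresholds are made up.
    import pandas as pd

    # rows = examinees, columns = items; entries are response times in seconds
    rt = pd.DataFrame(
        [[25.0, 40.0, 12.1, 33.0],
         [ 3.0,  2.5,  1.8,  4.0],
         [30.0, 55.0, 28.0, 41.0]],
        columns=["item1", "item2", "item3", "item4"],
    )
    thresholds = pd.Series({"item1": 5.0, "item2": 5.0, "item3": 5.0, "item4": 5.0})

    sb = (rt >= thresholds).astype(int)   # 1 = enough time spent to count as solution behavior
    rte = sb.mean(axis=1)                 # share of items with solution behavior per examinee
    print(sb)
    print(rte)                            # the second examinee looks like a rapid guesser

Because the flags are item-level, they can be fed into a mixture model of the kind the abstract's latent class analysis refers to, describing how motivation shifts across items within the test.
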
Peer reviewed
PDF on ERIC Download full text
Zhou, Jiawen; Cao, Yi – ETS Research Report Series, 2020
In this study, we explored retest effects on test scores and response time for repeaters, examinees who retake an examination. We looked at two groups of repeaters: those who took the same form twice and those who took different forms on their two attempts for a certification and licensure test. Scores improved over the two test attempts, and…
Descriptors: Testing, Test Items, Computer Assisted Testing, Licensing Examinations (Professions)
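
A first-pass look at retest effects of the kind described above can be done with paired comparisons of scores and testing time across the two attempts. The sketch below uses simulated numbers purely for illustration and is not the study's analysis.

    # Minimal sketch: paired comparison of scores and total testing time across
    # two attempts by the same repeaters. All values are simulated.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 200
    score_1 = rng.normal(500, 50, n)
    score_2 = score_1 + rng.normal(15, 20, n)    # modest gain on the second attempt
    time_1 = rng.normal(180, 25, n)              # minutes spent on attempt 1
    time_2 = time_1 + rng.normal(-12, 15, n)     # assume repeaters finish somewhat faster

    print(stats.ttest_rel(score_2, score_1))     # did scores improve?
    print(stats.ttest_rel(time_2, time_1))       # did total response time drop?
    print("mean gain:", (score_2 - score_1).mean(),
          "mean time change:", (time_2 - time_1).mean())
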
Peer reviewed
Direct link
Noam, Gil G.; Allen, Patricia J.; Sonnert, Gerhard; Sadler, Philip M. – International Journal of Science Education, Part B: Communication and Public Engagement, 2020
Practitioners, researchers, and evaluators have felt a growing need for a common measure of science engagement that can be used across different out-of-school time (OST) science learning settings. We report on the development and validation of a novel 10-item self-report instrument designed to measure, communicate, and ultimately…
Descriptors: Leisure Time, Elementary School Students, Middle School Students, After School Programs
Peer reviewed
Direct link
Hess, Brian J.; Johnston, Mary M.; Lipner, Rebecca S. – International Journal of Testing, 2013
Current research on examination response time has focused on tests composed of traditional multiple-choice items. Consequently, the impact of other innovative or complex item formats on examinee response time is not understood. The present study used multilevel growth modeling to investigate examinee characteristics associated with response time…
Descriptors: Test Items, Test Format, Reaction Time, Individual Characteristics
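
Multilevel growth modeling of response time can be sketched as a random-intercept, random-slope model of log response time across item positions. The statsmodels code below is a minimal example on simulated data; the column names, the log-seconds scale, and the position predictor are assumptions for illustration, not the study's specification.

    # Minimal sketch: log response time as a function of item position, with a
    # random intercept and slope per examinee. Data are simulated.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n_examinees, n_items = 100, 20
    examinee = np.repeat(np.arange(n_examinees), n_items)
    position = np.tile(np.arange(n_items), n_examinees)
    person_speed = rng.normal(0, 0.3, n_examinees)[examinee]   # person-level random intercepts
    log_rt = 3.5 + person_speed - 0.01 * position + rng.normal(0, 0.2, n_examinees * n_items)

    long = pd.DataFrame({"examinee": examinee, "position": position, "log_rt": log_rt})

    model = smf.mixedlm("log_rt ~ position", long, groups=long["examinee"],
                        re_formula="~position")
    result = model.fit()
    print(result.summary())   # the fixed 'position' effect is the average change in log RT per item

Examinee characteristics of the kind the abstract mentions would enter such a model as person-level predictors, possibly interacted with position.
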
Peer reviewed
Direct link
DeMars, Christine E.; Wise, Steven L. – International Journal of Testing, 2010
This investigation examined whether different rates of rapid guessing between groups could lead to detectable levels of differential item functioning (DIF) in situations where the item parameters were the same for both groups. Two simulation studies were designed to explore this possibility. The groups in Study 1 were simulated to reflect…
Descriptors: Guessing (Tests), Test Bias, Motivation, Gender Differences
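
The mechanism these simulation studies probe can be reproduced in a few lines: generate responses for two groups under identical item parameters, let one group rapid-guess more often, and inspect a DIF statistic for a studied item. The sketch below does this with a 2PL model and a Mantel-Haenszel odds ratio; every parameter value is an illustrative assumption, not one used in the study.

    # Minimal sketch: unequal rapid-guessing rates can produce apparent DIF even
    # though item parameters are identical across groups. Values are illustrative.
    import numpy as np

    rng = np.random.default_rng(3)
    n_per_group, n_items = 2000, 20
    a = rng.uniform(0.8, 1.6, n_items)     # discriminations, shared by both groups
    b = rng.uniform(-1.5, 1.5, n_items)    # difficulties, shared by both groups

    def simulate(n, guess_rate):
        theta = rng.normal(0, 1, n)
        p = 1 / (1 + np.exp(-a * (theta[:, None] - b)))    # 2PL response probabilities
        resp = (rng.random((n, n_items)) < p).astype(int)
        guess = rng.random((n, n_items)) < guess_rate      # rapid-guessing flags
        resp[guess] = rng.random(guess.sum()) < 0.25       # random 4-option guesses
        return resp

    ref = simulate(n_per_group, guess_rate=0.02)   # reference group: little rapid guessing
    foc = simulate(n_per_group, guess_rate=0.15)   # focal group: frequent rapid guessing

    # Mantel-Haenszel odds ratio for one studied item, stratified by rest score
    item = 0
    rest_ref = ref.sum(axis=1) - ref[:, item]
    rest_foc = foc.sum(axis=1) - foc[:, item]
    num = den = 0.0
    for s in range(n_items):                       # possible rest scores: 0 .. n_items - 1
        r = ref[rest_ref == s, item]
        f = foc[rest_foc == s, item]
        total = len(r) + len(f)
        if total == 0:
            continue
        num += r.sum() * (len(f) - f.sum()) / total    # ref correct x foc incorrect
        den += f.sum() * (len(r) - r.sum()) / total    # foc correct x ref incorrect
    print("MH odds ratio:", round(num / den, 2))       # values away from 1 flag DIF for the item
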
Gallagher, Ann M. – College Entrance Examination Board, 1990
Performance of high-scoring males and females on the mathematics section of three forms of the College Board's Scholastic Aptitude Test (SAT-M) was examined to determine how item content, solution strategy, and speededness differentially affect performance. The mathematical and verbal sections of the SAT were also compared for similarities in the…
Descriptors: College Entrance Examinations, Gender Differences, Scores, Mathematics Tests