Publication Date

| Date Range | Results |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 2 |
| Since 2017 (last 10 years) | 7 |
| Since 2007 (last 20 years) | 9 |
Descriptor

| Descriptor | Results |
| --- | --- |
| Gender Differences | 10 |
| Test Items | 10 |
| Reaction Time | 5 |
| Computer Assisted Testing | 3 |
| Difficulty Level | 3 |
| Mathematics Tests | 3 |
| Scores | 3 |
| Time | 3 |
| Age Differences | 2 |
| Elementary School Students | 2 |
| Ethnicity | 2 |
Author

| Author | Results |
| --- | --- |
| Allen, Patricia J. | 1 |
| Cao, Yi | 1 |
| Clemons, Josh | 1 |
| DeMars, Christine E. | 1 |
| Finney, Sara J. | 1 |
| Gallagher, Ann M. | 1 |
| Gruss, Richard | 1 |
| Hess, Brian J. | 1 |
| Ilgun Dibek, Munevver | 1 |
| Johnston, Mary M. | 1 |
| Yilmaz Kogar, Esin | 1 |
Publication Type

| Publication Type | Results |
| --- | --- |
| Reports - Research | 10 |
| Journal Articles | 9 |
| Tests/Questionnaires | 1 |
Education Level

| Education Level | Results |
| --- | --- |
| Higher Education | 4 |
| Postsecondary Education | 4 |
| Elementary Education | 2 |
| Secondary Education | 2 |
| Grade 4 | 1 |
| High Schools | 1 |
| Intermediate Grades | 1 |
| Junior High Schools | 1 |
| Middle Schools | 1 |
Location

| Location | Results |
| --- | --- |
| Ecuador | 1 |
| Hungary | 1 |
| Kazakhstan | 1 |
| Mexico | 1 |
| Peru | 1 |
| Singapore | 1 |
| Turkey | 1 |
| United States | 1 |
Assessments and Surveys

| Assessment or Survey | Results |
| --- | --- |
| Program for the International… | 1 |
| SAT (College Admission Test) | 1 |
| Trends in International… | 1 |
Gruss, Richard; Clemons, Josh – Journal of Computer Assisted Learning, 2023
Background: The sudden growth in online instruction due to COVID-19 restrictions has given renewed urgency to questions about remote learning that have remained unresolved. Web-based assessment software provides instructors with an array of options for varying testing parameters, but the pedagogical impacts of some of these variations have yet to be…
Descriptors: Test Items, Test Format, Computer Assisted Testing, Mathematics Tests
Yilmaz Kogar, Esin; Soysal, Sumeyra – International Journal of Assessment Tools in Education, 2023
This paper aims to evaluate different aspects of students' response time to items on the mathematics test, and their test effort as an indicator of test motivation, with the help of several variables at the item and student levels. The data consist of responses from fourth-grade Singaporean and Turkish students participating in TIMSS 2019. Response time…
Descriptors: Foreign Countries, Achievement Tests, International Assessment, Mathematics Achievement
Perkins, Beth A.; Satkus, Paulius; Finney, Sara J. – Journal of Psychoeducational Assessment, 2020
Few studies have examined the psychometric properties of the test-related items from the Achievement Emotions Questionnaire (AEQ). Using a sample of 955 university students, we examined the factor structure of 12 emotion items measuring test-related anger, boredom, enjoyment, and pride. Results indicated the four emotions were distinct, allowing…
Descriptors: Affective Measures, Questionnaires, Psychometrics, Test Items
Ilgun Dibek, Munevver – International Journal of Educational Methodology, 2021
Response times are one of the important sources of information about individuals' performance during a test. The main purpose of this study is to show that survival models can be used with educational data. Accordingly, data sets of items measuring literacy, numeracy and problem-solving skills of the countries participating…
Descriptors: Reaction Time, Test Items, Adults, Foreign Countries
Pastor, Dena A.; Ong, Thai Q.; Strickman, Scott N. – Educational Assessment, 2019
The trustworthiness of low-stakes assessment results largely depends on examinee effort, which can be measured by the amount of time examinees devote to items using solution behavior (SB) indices. Because SB indices are calculated for each item, they can be used to understand how examinee motivation changes across items within a test. Latent class…
Descriptors: Behavior Patterns, Test Items, Time, Response Style (Tests)
Zhou, Jiawen; Cao, Yi – ETS Research Report Series, 2020
In this study, we explored retest effects on test scores and response time for repeaters, examinees who retake an examination. We looked at two groups of repeaters: those who took the same form twice and those who took different forms on their two attempts for a certification and licensure test. Scores improved over the two test attempts, and…
Descriptors: Testing, Test Items, Computer Assisted Testing, Licensing Examinations (Professions)
Noam, Gil G.; Allen, Patricia J.; Sonnert, Gerhard; Sadler, Philip M. – International Journal of Science Education, Part B: Communication and Public Engagement, 2020
There has been a growing need felt by practitioners, researchers, and evaluators to obtain a common measure of science engagement that can be used in different out-of-school time (OST) science learning settings. We report on the development and validation of a novel 10-item self-report instrument designed to measure, communicate, and ultimately…
Descriptors: Leisure Time, Elementary School Students, Middle School Students, After School Programs
Hess, Brian J.; Johnston, Mary M.; Lipner, Rebecca S. – International Journal of Testing, 2013
Current research on examination response time has focused on tests comprised of traditional multiple-choice items. Consequently, the impact of other innovative or complex item formats on examinee response time is not understood. The present study used multilevel growth modeling to investigate examinee characteristics associated with response time…
Descriptors: Test Items, Test Format, Reaction Time, Individual Characteristics
DeMars, Christine E.; Wise, Steven L. – International Journal of Testing, 2010
This investigation examined whether different rates of rapid guessing between groups could lead to detectable levels of differential item functioning (DIF) in situations where the item parameters were the same for both groups. Two simulation studies were designed to explore this possibility. The groups in Study 1 were simulated to reflect…
Descriptors: Guessing (Tests), Test Bias, Motivation, Gender Differences
Gallagher, Ann M. – College Entrance Examination Board, 1990
Performance of high-scoring males and females on the mathematics section of three forms of the College Board's Scholastic Aptitude Test (SAT-M) was examined to determine how item content, solution strategy, and speededness differentially affect performance. The mathematical and verbal sections of the SAT were also compared for similarities in the…
Descriptors: College Entrance Examinations, Gender Differences, Scores, Mathematics Tests

