Publication Date

| Date range | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 4 |
| Since 2007 (last 20 years) | 9 |
Descriptor

| Descriptor | Count |
| --- | --- |
| Gender Differences | 9 |
| Item Response Theory | 9 |
| Test Format | 9 |
| College Entrance Examinations | 4 |
| Foreign Countries | 4 |
| Multiple Choice Tests | 4 |
| Test Items | 4 |
| Computer Assisted Testing | 3 |
| Science Achievement | 3 |
| Science Tests | 3 |
| Secondary School Students | 3 |
Source

| Source | Count |
| --- | --- |
| ACT, Inc. | 1 |
| Applied Psychological… | 1 |
| British Journal of… | 1 |
| Educational and Psychological… | 1 |
| Grantee Submission | 1 |
| Journal of Vocational Behavior | 1 |
| Language Testing | 1 |
| Research in Science &… | 1 |
| Science Education… | 1 |
Author

| Author | Count |
| --- | --- |
| Hudson, Ross D. | 2 |
| Cho, YoungWoo | 1 |
| DeBoer, George E. | 1 |
| Dorans, Neil J. | 1 |
| Einarsdottir, Sif | 1 |
| Glas, Cees | 1 |
| Hamhuis, Eva | 1 |
| Hammond, Shelby | 1 |
| Hardcastle, Joseph | 1 |
| Herrmann-Abell, Cari F. | 1 |
| Lee, HyeSun | 1 |
Publication Type

| Publication Type | Count |
| --- | --- |
| Journal Articles | 7 |
| Reports - Research | 6 |
| Reports - Evaluative | 3 |
| Numerical/Quantitative Data | 1 |
| Speeches/Meeting Papers | 1 |
Education Level

| Education Level | Count |
| --- | --- |
| Higher Education | 3 |
| Elementary Education | 2 |
| Postsecondary Education | 2 |
| Secondary Education | 2 |
| Grade 4 | 1 |
Location

| Location | Count |
| --- | --- |
| Australia | 2 |
| Netherlands | 1 |
| South Korea | 1 |
Assessments and Surveys

| Assessment | Count |
| --- | --- |
| ACT Assessment | 1 |
| SAT (College Admission Test) | 1 |
| Strong Interest Inventory | 1 |
| Trends in International… | 1 |
Lee, HyeSun; Smith, Weldon Z. – Educational and Psychological Measurement, 2020
Based on the framework of testlet models, the current study suggests the Bayesian random block item response theory (BRB IRT) model to fit forced-choice formats where an item block is composed of three or more items. To account for local dependence among items within a block, the BRB IRT model incorporated a random block effect into the response…
Descriptors: Bayesian Statistics, Item Response Theory, Monte Carlo Methods, Test Format
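The core mechanism the abstract names, local dependence among items in a block captured by a random block effect, can be illustrated with a simulation. This is not the BRB IRT model from the study (which targets forced-choice blocks); it is a simplified dichotomous sketch with made-up parameters, showing that a shared per-block effect makes within-block responses correlate more strongly than across-block responses.

```python
import math
import random

random.seed(1)

N_PERSONS = 2000
N_BLOCKS = 4
ITEMS_PER_BLOCK = 3
SIGMA_BLOCK = 1.0  # SD of the shared within-block (testlet) effect; illustrative value

def p_correct(theta, gamma):
    # Rasch-style response probability with an added person-by-block effect gamma
    return 1 / (1 + math.exp(-(theta + gamma)))

responses = []
for _ in range(N_PERSONS):
    theta = random.gauss(0, 1)  # person ability
    row = []
    for _blk in range(N_BLOCKS):
        gamma = random.gauss(0, SIGMA_BLOCK)  # one draw, shared by the whole block
        for _item in range(ITEMS_PER_BLOCK):
            row.append(1 if random.random() < p_correct(theta, gamma) else 0)
    responses.append(row)

def corr(i, j):
    # Pearson correlation between item columns i and j
    xi = [r[i] for r in responses]
    xj = [r[j] for r in responses]
    n = len(xi)
    mi, mj = sum(xi) / n, sum(xj) / n
    cov = sum((a - mi) * (b - mj) for a, b in zip(xi, xj)) / n
    var_i = sum((a - mi) ** 2 for a in xi) / n
    var_j = sum((b - mj) ** 2 for b in xj) / n
    return cov / math.sqrt(var_i * var_j)

within_block = corr(0, 1)   # items 0 and 1 both sit in block 0
across_block = corr(0, 3)   # item 3 opens block 1
print(round(within_block, 2), round(across_block, 2))
```

A model that ignores the block effect treats both correlations as equal; modeling the random block effect, as testlet models do, absorbs the extra within-block dependence.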
Hamhuis, Eva; Glas, Cees; Meelissen, Martina – British Journal of Educational Technology, 2020
Over the last two decades, the educational use of digital devices, including digital assessments, has become a regular feature of teaching in primary education in the Netherlands. However, researchers have not reached a consensus about the so-called "mode effect," which refers to the possible impact of using computer-based tests (CBT)…
Descriptors: Handheld Devices, Elementary School Students, Grade 4, Foreign Countries
Steedle, Jeffrey; Pashley, Peter; Cho, YoungWoo – ACT, Inc., 2020
Three mode comparability studies were conducted on the following Saturday national ACT test dates: October 26, 2019, December 14, 2019, and February 8, 2020. The primary goal of these studies was to evaluate whether ACT scores exhibited mode effects between paper and online testing that would necessitate statistical adjustments to the online…
Descriptors: Test Format, Computer Assisted Testing, College Entrance Examinations, Scores
Hardcastle, Joseph; Herrmann-Abell, Cari F.; DeBoer, George E. – Grantee Submission, 2017
Can student performance on computer-based tests (CBT) and paper-and-pencil tests (PPT) be considered equivalent measures of student knowledge? States and school districts are grappling with this question, and although studies addressing this question are growing, additional research is needed. We report on the performance of students who took…
Descriptors: Academic Achievement, Computer Assisted Testing, Comparative Analysis, Student Evaluation
Hudson, Ross D.; Treagust, David F. – Research in Science & Technological Education, 2013
Background: This study developed from observations of apparent achievement differences between male and female students' performance in chemistry in a state university entrance examination. Male students outperformed female students, especially at higher score levels. Apart from the gender of the students, two other important factors that might…
Descriptors: Chemistry, College Entrance Examinations, State Universities, Gender Differences
Hudson, Ross D. – Science Education International, 2012
This research inquires into the effectiveness of the two predominant forms of questions--multiple-choice questions and short-answer questions--used in the State University Entrance Examination for Chemistry including the relationship between performance and gender. It examines not only the style of question but also the content type examined…
Descriptors: Chemistry, Science Achievement, Gender Differences, College Entrance Examinations
Pae, Tae-Il – Language Testing, 2012
This study tracked gender differential item functioning (DIF) on the English subtest of the Korean College Scholastic Aptitude Test (KCSAT) over a nine-year period across three data points, using both the Mantel-Haenszel (MH) and item response theory likelihood ratio (IRT-LR) procedures. Further, the study identified two factors (i.e. reading…
Descriptors: Aptitude Tests, Academic Aptitude, Language Tests, Test Items
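The Mantel-Haenszel procedure named in this entry can be sketched end to end on synthetic data. Nothing here comes from the KCSAT study; the group sizes, the 0.5-logit disadvantage, and the 0–10 matching score are invented to show how the stratified odds ratio and the ETS delta statistic are computed.

```python
import math
import random

random.seed(42)

# Synthetic data: 2000 examinees, a matching score (0-10), one studied item.
# The focal group is given a 0.5-logit disadvantage on the item to build in DIF.
examinees = []
for _ in range(2000):
    group = random.choice(["ref", "focal"])
    ability = random.gauss(0, 1)
    score = min(10, max(0, round(5 + 2 * ability)))  # matching criterion
    shift = 0.5 if group == "focal" else 0.0
    p = 1 / (1 + math.exp(-(ability - shift)))
    correct = 1 if random.random() < p else 0
    examinees.append((group, score, correct))

# Mantel-Haenszel common odds ratio, stratified by the matching score.
num = 0.0
den = 0.0
for k in range(11):
    stratum = [e for e in examinees if e[1] == k]
    a = sum(1 for g, _, x in stratum if g == "ref" and x == 1)    # ref correct
    b = sum(1 for g, _, x in stratum if g == "ref" and x == 0)    # ref incorrect
    c = sum(1 for g, _, x in stratum if g == "focal" and x == 1)  # focal correct
    d = sum(1 for g, _, x in stratum if g == "focal" and x == 0)  # focal incorrect
    n = a + b + c + d
    if n == 0:
        continue
    num += a * d / n
    den += b * c / n

alpha_mh = num / den
# ETS delta scale: negative values indicate the focal group is disadvantaged.
delta_mh = -2.35 * math.log(alpha_mh)
print(round(alpha_mh, 2), round(delta_mh, 2))
```

Because the focal group was simulated with a built-in disadvantage, the odds ratio comes out above 1 and the delta statistic negative; the IRT likelihood-ratio procedure the abstract also mentions tests the same hypothesis by comparing nested item-parameter models instead.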
Einarsdottir, Sif; Rounds, James – Journal of Vocational Behavior, 2009
Item response theory was used to address gender bias in interest measurement. Differential item functioning (DIF) technique, SIBTEST and DIMTEST for dimensionality, were applied to the items of the six General Occupational Theme (GOT) and 25 Basic Interest (BI) scales in the Strong Interest Inventory. A sample of 1860 women and 1105 men was used.…
Descriptors: Test Format, Females, Vocational Interests, Construct Validity
Dorans, Neil J.; Liu, Jinghua; Hammond, Shelby – Applied Psychological Measurement, 2008
This exploratory study was built on research spanning three decades. Petersen, Marco, and Stewart (1982) conducted a major empirical investigation of the efficacy of different equating methods. The studies reported in Dorans (1990) examined how different equating methods performed across samples selected in different ways. Recent population…
Descriptors: Test Format, Equated Scores, Sampling, Evaluation Methods
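As context for the equating-methods comparison this entry describes, the simplest of those methods, linear equating under a random-groups design, fits in a few lines. The score distributions below are synthetic and the design assumption (randomly equivalent groups) is mine, not the study's.

```python
import math
import random

random.seed(7)

# Two test forms taken by randomly equivalent groups (synthetic score samples).
form_x = [random.gauss(50, 10) for _ in range(1000)]
form_y = [random.gauss(55, 12) for _ in range(1000)]

def mean(v):
    return sum(v) / len(v)

def sd(v):
    m = mean(v)
    return math.sqrt(sum((a - m) ** 2 for a in v) / len(v))

mx, sx = mean(form_x), sd(form_x)
my, sy = mean(form_y), sd(form_y)

def linear_equate(x):
    # Places a Form X score on the Form Y scale by matching means and SDs.
    return my + (sy / sx) * (x - mx)

print(round(linear_equate(mx), 1))  # the Form X mean maps onto the Form Y mean
```

Equipercentile and IRT-based methods relax the linearity assumption, which is one axis along which equating studies like this one compare methods and sampling designs.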

