Showing all 11 results
Peer reviewed
PDF full text available on ERIC
Lee, Dukjae; Buzick, Heather; Sireci, Stephen G.; Lee, Mina; Laitusis, Cara – Practical Assessment, Research & Evaluation, 2021
Although there has been substantial research on the effects of test accommodations on students' performance, there has been far less research on students' use of embedded accommodations and other accessibility supports at the item and whole-test levels in operational testing programs. Data on embedded accessibility supports from digital logs…
Descriptors: Academic Accommodations (Disabilities), Testing Accommodations, Accessibility (for Disabled), Computer Assisted Testing
Peer reviewed
PDF full text available on ERIC
Araneda, Sergio; Lee, Dukjae; Lewis, Jennifer; Sireci, Stephen G.; Moon, Jung Aa; Lehman, Blair; Arslan, Burcu; Keehner, Madeleine – Education Sciences, 2022
Students exhibit many behaviors when responding to items on a computer-based test, but only some of these behaviors are relevant to estimating their proficiencies. In this study, we analyzed data from computer-based math achievement tests administered to elementary school students in grades 3 (ages 8-9) and 4 (ages 9-10). We investigated students'…
Descriptors: Student Behavior, Academic Achievement, Computer Assisted Testing, Mathematics Achievement
Peer reviewed
Direct link
Roohr, Katrina Crotts; Sireci, Stephen G. – Educational Assessment, 2017
Test accommodations for English learners (ELs) are intended to reduce the language barrier and level the playing field, allowing ELs to better demonstrate their true proficiencies. Computer-based accommodations for ELs show promising results for leveling that field while also providing us with additional data to more closely investigate the…
Descriptors: Testing Accommodations, English Language Learners, Second Language Learning, Computer Assisted Testing
Peer reviewed
Direct link
Crotts, Katrina; Sireci, Stephen G.; Zenisky, April – Journal of Applied Testing Technology, 2012
Validity evidence based on test content is important for educational tests to demonstrate the degree to which they fulfill their purposes. Most content validity studies involve subject matter experts (SMEs) who rate items that comprise a test form. In computerized adaptive testing, examinees take different sets of items and test "forms"…
Descriptors: Computer Assisted Testing, Adaptive Testing, Content Validity, Test Content
Luecht, Richard M.; Sireci, Stephen G. – College Board, 2011
Over the past four decades, there has been incremental growth in computer-based testing (CBT) as a viable alternative to paper-and-pencil testing. However, the transition to CBT is neither easy nor inexpensive. As Drasgow, Luecht, and Bennett (2006) noted, many design engineering, test development, operations/logistics, and psychometric changes…
Descriptors: College Entrance Examinations, Computer Assisted Testing, Educational Technology, Evaluation Methods
Peer reviewed
Huff, Kristen L.; Sireci, Stephen G. – Educational Measurement: Issues and Practice, 2001
Discusses the potential positive and negative effects computer-based testing could have on validity, reviews the literature on validation perspectives in computer-based testing, and suggests ways to evaluate the contributions of computer-based testing to more valid measurement practices. (SLD)
Descriptors: Adaptive Testing, Computer Assisted Testing, Elementary Secondary Education, Validity
Hambleton, Ronald K.; Sireci, Stephen G.; Swaminathan, H.; Xing, Dehui; Rizavi, Saba – 2003
The purposes of this research study were to develop and field test anchor-based judgmental methods for enabling test specialists to estimate item difficulty statistics. The study consisted of three related field tests. In each, researchers worked with six Law School Admission Test (LSAT) test specialists and one or more of the LSAT subtests. The…
Descriptors: Adaptive Testing, College Entrance Examinations, Computer Assisted Testing, Difficulty Level
Sireci, Stephen G.; Patelis, Thanos; Rizavi, Saba; Dillingham, Alan M.; Rodriguez, Georgette – 2000
Setting standards on educational tests is extremely challenging. The psychometric literature is replete with methods and guidelines for setting standards on educational tests; however, little attention has been paid to the process of setting standards on computerized adaptive tests (CATs). This lack of attention is unfortunate because CATs are…
Descriptors: Adaptive Testing, College Bound Students, Computer Assisted Testing, Higher Education
Sireci, Stephen G.; Foster, David F.; Robin, Frederic; Olsen, James – 1997
Evaluating the comparability of a test administered in different languages is a difficult, if not impossible, task. Comparisons are problematic because observed differences in test performance between groups who take different language versions of a test could be due to a difference in difficulty between the tests, to cultural differences in test…
Descriptors: Adaptive Testing, Adults, Certification, Comparative Analysis
Peer reviewed
Sireci, Stephen G.; Hambleton, Ronald K. – International Journal of Educational Research, 1997
Achievement testing in the next century is going to be very different. Computer technology is going to play a major role in test construction, test administration, scoring, and score reporting. New formats will become possible that incorporate visual and audio components and that permit adaptation of tests to individual ability levels. (SLD)
Descriptors: Achievement Tests, Adaptive Testing, Computer Assisted Testing, Criterion Referenced Tests
Peer reviewed
Sireci, Stephen G.; Harter, James; Yang, Yongwei; Bhola, Dennison – International Journal of Testing, 2003
Evaluated the structural equivalence and differential item functioning of an employee attitude survey from a large international corporation across three languages, eight cultures, and two mediums of administration. Results for 40,595 employees show the structure of survey data was consistent and items functioned similarly across all groups. (SLD)
Descriptors: Attitude Measures, Computer Assisted Testing, Cross Cultural Studies, Employees