Showing all 7 results
Peer reviewed
Download full text (PDF on ERIC)
Yulianto, Ahmad; Pudjitriherwanti, Anastasia; Kusumah, Chevy; Oktavia, Dies – International Journal of Language Testing, 2023
The increasing use of computer-based modes in language testing raises concerns about their similarities with and differences from the paper-based format. The present study aimed to delineate discrepancies between the TOEFL PBT and CBT. To that end, a quantitative method was employed to probe score equivalence, the performance of male-female…
Descriptors: Computer Assisted Testing, Test Format, Comparative Analysis, Scores
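The truncated abstract mentions probing score equivalence across the two test modes. A minimal sketch of one common way to run such a check, a paired-samples t-test plus a cross-mode correlation, is below; it uses simulated scores, not the study's data.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    pbt = rng.normal(500, 50, size=120)      # simulated paper-based scores
    cbt = pbt + rng.normal(0, 15, size=120)  # simulated computer-based scores

    t, p = stats.ttest_rel(pbt, cbt)         # paired t-test: do mode means differ?
    r, _ = stats.pearsonr(pbt, cbt)          # cross-mode score correlation
    print(f"paired t = {t:.2f} (p = {p:.3f}), r = {r:.2f}")

A non-significant mean difference together with a high cross-mode correlation is the usual evidence offered for treating the two formats as interchangeable.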
Peer reviewed
Direct link
Akhavan Masoumi, Ghazal; Sadeghi, Karim – Language Testing in Asia, 2020
This study aimed to examine the effect of test format on test performance by comparing Multiple Choice (MC) and Constructed Response (CR) vocabulary tests in an EFL setting. The paper also investigated the role of gender in MC and CR vocabulary measures. To this end, five 20-item stem-equivalent vocabulary tests (CR, and 3-, 4-, 5-, and…
Descriptors: Language Tests, Test Items, English (Second Language), Second Language Learning
Peer reviewed
Download full text (PDF on ERIC)
Polat, Murat – Novitas-ROYAL (Research on Youth and Language), 2020
Classroom practices, materials, and teaching methods in language classes have changed considerably in recent decades and continue to evolve; however, the techniques commonly used to test students' foreign language skills have changed little, despite the recent awareness of Bloom's taxonomy. Testing units at schools rely mostly on multiple choice…
Descriptors: Multiple Choice Tests, Test Format, Test Items, Difficulty Level
Peer reviewed
Download full text (PDF on ERIC)
Khoshsima, Hooshang; Hosseini, Monirosadat; Toroujeni, Seyyed Morteza Hashemi – English Language Teaching, 2017
The advent of technology has generated growing interest in using computers to convert conventional paper-and-pencil testing (henceforth PPT) into computer-based testing (henceforth CBT) in education over the last few decades. This steady spread of computers to reshape conventional tests into a computerized format has permeated the…
Descriptors: English (Second Language), Second Language Learning, Second Language Instruction, Correlation
Peer reviewed
Direct link
Pae, Hye K. – Educational Assessment, 2014
This study investigated the role of item formats in the performance of 206 nonnative speakers of English on expressive skills (i.e., speaking and writing). Test scores were drawn from the field test of the "Pearson Test of English Academic" for Chinese, French, Hebrew, and Korean native speakers. Four item formats, including…
Descriptors: Test Items, Test Format, Speech Skills, Writing Skills
Peer reviewed
Direct link
Pae, Tae-Il – Language Testing, 2012
This study tracked gender differential item functioning (DIF) on the English subtest of the Korean College Scholastic Aptitude Test (KCSAT) over a nine-year period across three data points, using both the Mantel-Haenszel (MH) and item response theory likelihood ratio (IRT-LR) procedures. Further, the study identified two factors (i.e. reading…
Descriptors: Aptitude Tests, Academic Aptitude, Language Tests, Test Items
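The Mantel-Haenszel procedure named in this abstract has a standard closed form: stratify examinees by a matching score, build a 2x2 group-by-response table per stratum, pool the stratum odds ratios, and map the result onto the ETS delta scale. A minimal illustrative sketch follows; the function and variable names are my own, and it uses simple total-score strata rather than the purified matching criterion a real DIF study would employ.

    import numpy as np

    def mantel_haenszel_dif(responses, group, item, strata):
        # responses: (n_examinees, n_items) 0/1 array
        # group:     0 = reference group, 1 = focal group
        # item:      column index of the studied item
        # strata:    matching-score stratum label per examinee
        num = den = 0.0
        for s in np.unique(strata):
            in_s = strata == s
            ref = in_s & (group == 0)
            foc = in_s & (group == 1)
            a = responses[ref, item].sum()   # reference correct
            b = ref.sum() - a                # reference incorrect
            c = responses[foc, item].sum()   # focal correct
            d = foc.sum() - c                # focal incorrect
            t = ref.sum() + foc.sum()
            if t > 0:
                num += a * d / t
                den += b * c / t
        if den == 0:                         # no usable stratum information
            return float("nan")
        alpha = num / den                    # MH common odds ratio
        return -2.35 * np.log(alpha)         # ETS delta scale

    # Example on simulated data, stratifying by banded total scores:
    rng = np.random.default_rng(1)
    X = (rng.random((500, 40)) < 0.6).astype(int)
    g = rng.integers(0, 2, size=500)
    print(mantel_haenszel_dif(X, g, item=0, strata=X.sum(axis=1) // 5))

Items with |delta| above roughly 1.5 (and a significant chi-square) fall into the ETS "C" category that DIF studies like this one typically flag; the IRT likelihood-ratio procedure the abstract pairs with MH instead compares nested item-parameter models.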
Peer reviewed
Direct link
Pae, Tae-Il – System: An International Journal of Educational Technology and Applied Linguistics, 2004
This paper examines the effect of gender on English reading comprehension for Korean EFL (English as a Foreign Language) learners. The gender effect was measured using a DIF (Differential Item Functioning) methodology. Specifically, gender DIF was investigated for a random sample of 14,000 Korean examinees (7,000 males and 7,000 females) who took…
Descriptors: Reading Comprehension, Test Format, Content Analysis, Gender Differences