Publication Date
In 2025: 1
Since 2024: 6
Since 2021 (last 5 years): 11
Since 2016 (last 10 years): 14
Since 2006 (last 20 years): 17
Source
Language Testing: 17
Author
Aryadoust, Vahid: 1
Bax, Stephen: 1
Bruce, Emma: 1
Coombe, Christine: 1
Coxhead, Averil: 1
Davidson, Peter: 1
Foo, Stacy: 1
Harding, Luke: 1
Hu, Ruolin: 1
Nishizawa, Hitoshi: 1
von Davier, Alina A.: 1
Publication Type
Journal Articles: 17
Reports - Research: 12
Reports - Evaluative: 4
Tests/Questionnaires: 3
Information Analyses: 1
Education Level
Higher Education: 10
Postsecondary Education: 10
Grade 12: 1
Secondary Education: 1
Location
United Kingdom: 3
Australia: 2
New Zealand: 2
Canada: 1
Europe: 1
Iran (Tehran): 1
Malaysia: 1
United Arab Emirates: 1
United Kingdom (England): 1
United Kingdom (London): 1
United States: 1
Assessments and Surveys
International English Language Testing System (IELTS): 17
Test of English as a Foreign Language (TOEFL): 10
ACT Assessment: 1
Edinburgh Handedness Inventory: 1
Test of English for…: 1
Emma Bruce; Karen Dunn; Tony Clark – Language Testing, 2025
Several high-stakes English proficiency tests, including but not limited to IELTS, PTE Academic, and TOEFL iBT, recommend a 2-year time limit on score validity. Although this timeframe provides a useful rule of thumb for the recency of testing, it can have far-reaching consequences. In response to stakeholder queries around IELTS validity…
Descriptors: High Stakes Tests, Language Tests, Test Validity, Scores
Ramsey L. Cardwell; Steven W. Nydick; J.R. Lockwood; Alina A. von Davier – Language Testing, 2024
Applicants must often demonstrate adequate English proficiency when applying to postsecondary institutions by taking an English language proficiency test, such as the TOEFL iBT, IELTS Academic, or Duolingo English Test (DET). Concordance tables aim to provide equivalent scores across multiple assessments, helping admissions officers to make fair…
Descriptors: Second Language Learning, English (Second Language), Language Tests, Language Proficiency
Thi My Hang Nguyen; Peter Gu; Averil Coxhead – Language Testing, 2024
Despite extensive research on assessing collocational knowledge, valid measures of academic collocations remain elusive. In the present study, we employ an argument-based approach to validate two Academic Collocation Tests (ACTs) that assess the ability to recognize and produce academic collocations (i.e., two-word units such as "key…
Descriptors: Foreign Countries, College Students, College Entrance Examinations, English (Second Language)
Michael D. Carey; Stefan Szocs – Language Testing, 2024
This controlled experimental study investigated the interaction of variables associated with rating the pronunciation component of high-stakes English-language speaking tests such as IELTS and TOEFL iBT. One hundred experienced raters, each either familiar or unfamiliar with Brazilian-accented English or Papua New Guinean Tok Pisin-accented…
Descriptors: Dialects, Pronunciation, Suprasegmentals, Familiarity
Pearson, William S. – Language Testing, 2023
Many candidates undertaking high-stakes English language proficiency tests for academic enrolment do not achieve the results they need for reasons including linguistic unreadiness, test unpreparedness, illness, an unfavourable configuration of tasks, or administrative and marking errors. Owing to the importance of meeting goals or out of a belief…
Descriptors: High Stakes Tests, English (Second Language), Language Proficiency, Language Tests
Hitoshi Nishizawa – Language Testing, 2024
Corpus-based studies have informed the domain definition inference for test developers. Yet corpus-based research on temporal fluency measures (e.g., speech rate) has been limited, especially in the context of academic lecture settings. This has made it difficult for test developers to sample representative fluency features to create authentic…
Descriptors: High Stakes Tests, Language Tests, Second Language Learning, Computer Assisted Testing
Isaacs, Talia; Hu, Ruolin; Trenkic, Danijela; Varga, Julia – Language Testing, 2023
The COVID-19 pandemic has changed the university admissions and proficiency testing landscape. One change has been the meteoric rise in the use of the fully automated Duolingo English Test (DET) for university entrance purposes, offering test-takers a cheaper, shorter, and more accessible alternative. This rapid response study is the first to investigate the…
Descriptors: Predictive Validity, Educational Technology, Handheld Devices, Language Tests
Ihlenfeldt, Samuel Dale; Rios, Joseph A. – Language Testing, 2023
For institutions where English is the primary language of instruction, English assessments for admissions such as the Test of English as a Foreign Language (TOEFL) and International English Language Testing System (IELTS) give admissions decision-makers a sense of a student's skills in academic English. Despite this explicit purpose, these exams…
Descriptors: Meta Analysis, Test Validity, College Admission, Second Language Learning
Isbell, Daniel R.; Kremmel, Benjamin – Language Testing, 2020
Administration of high-stakes language proficiency tests has been disrupted in many parts of the world as a result of the 2019 novel coronavirus pandemic. Institutions that rely on test scores have been forced to adapt, and in many cases this means using scores from a different test, or a new online version of an existing test, that can be taken…
Descriptors: Language Tests, High Stakes Tests, Language Proficiency, Second Language Learning
J. Dylan Burton – Language Testing, 2024
Nonverbal behavior can affect language proficiency scores in speaking tests, but there is little empirical information about the size or consistency of its effects, or about whether language proficiency is a moderating variable. In this study, 100 novice raters watched and scored 30 recordings of test takers taking an international, high-stakes…
Descriptors: Nonverbal Ability, Language Fluency, Second Language Learning, Language Proficiency
Aryadoust, Vahid; Foo, Stacy; Ng, Li Ying – Language Testing, 2022
The aim of this study was to investigate how test methods affect listening test takers' performance and cognitive load. Test methods were defined and operationalized as while-listening performance (WLP) and post-listening performance (PLP) formats. To achieve the goal of the study, we examined test takers' (N = 80) brain activity patterns…
Descriptors: Listening Comprehension Tests, Language Tests, Eye Movements, Brain Hemisphere Functions
Miao, Yongzhi – Language Testing, 2023
Scholars have argued for the inclusion of different spoken varieties of English in high-stakes listening tests to better represent the global use of English. Doing so, however, may introduce additional construct-irrelevant variance due to accent familiarity and the shared first language (L1) advantage, which could threaten test fairness. However,…
Descriptors: Pronunciation, Metalinguistics, Native Language, Intelligibility
Khabbazbashi, Nahal – Language Testing, 2017
This study explores the extent to which topic and background knowledge of topic affect spoken performance in a high-stakes speaking test. It is argued that evidence of a substantial influence may introduce construct-irrelevant variance and undermine test fairness. Data were collected from 81 non-native speakers of English who performed on 10…
Descriptors: Speech Tests, High Stakes Tests, English (Second Language), Language Proficiency
Römer, Ute – Language Testing, 2017
This paper aims to connect recent corpus research on phraseology with current language testing practice. It discusses how corpora and corpus-analytic techniques can illuminate central aspects of speech and help in conceptualizing the notion of lexicogrammar in second language speaking assessment. The description of speech and some of its core…
Descriptors: Language Tests, Grammar, English (Second Language), Second Language Learning
Coombe, Christine; Davidson, Peter – Language Testing, 2014
The Common Educational Proficiency Assessment (CEPA) is a large-scale, high-stakes, English language proficiency/placement test administered in the United Arab Emirates to Emirati nationals in their final year of secondary education or Grade 12. The purpose of the CEPA is to place students into English classes at the appropriate government…
Descriptors: Language Tests, High Stakes Tests, English (Second Language), Second Language Learning