Showing all 7 results
Peer reviewed
Monteiro, Kátia R.; Crossley, Scott A.; Kyle, Kristopher – Applied Linguistics, 2020
Lexical items that are encountered more frequently and in varying contexts have important effects on second language (L2) development because frequent and contextually diverse words are learned faster and become more entrenched in a learner's lexicon (Ellis 2002a, b). Despite evidence that L2 learners are generally exposed to non-native input,…
Descriptors: English (Second Language), Language Tests, Second Language Learning, Benchmarking
Peer reviewed
Campbell, Heather; Espin, Christine A.; McMaster, Kristen – Reading and Writing: An Interdisciplinary Journal, 2013
The purpose of this study was to examine the validity and reliability of Curriculum-Based Measures in writing for English learners. Participants were 36 high school English learners with moderate to high levels of English language proficiency. Predictor variables were type of writing prompt (picture, narrative, and expository), time (3, 5, and 7…
Descriptors: Curriculum Based Assessment, Writing Tests, Test Validity, Test Reliability
Peer reviewed
Kim, Young-Suk; Al Otaiba, Stephanie; Wanzek, Jeanne; Gatlin, Brandy – Journal of Educational Psychology, 2015
We had 3 aims in the present study: (a) to examine the dimensionality of various evaluative approaches to scoring writing samples (e.g., quality, productivity, and curriculum-based measurement [CBM] writing scoring), (b) to investigate unique language and cognitive predictors of the identified dimensions, and (c) to examine gender gap in the…
Descriptors: Writing (Composition), Gender Differences, Curriculum Based Assessment, Scoring
Kim, Young-Suk; Al Otaiba, Stephanie; Wanzek, Jeanne; Gatlin, Brandy – Grantee Submission, 2015
We had 3 aims in the present study: (a) to examine the dimensionality of various evaluative approaches to scoring writing samples (e.g., quality, productivity, and curriculum-based measurement [CBM] writing scoring), (b) to investigate unique language and cognitive predictors of the identified dimensions, and (c) to examine gender gap in the…
Descriptors: Writing (Composition), Gender Differences, Curriculum Based Assessment, Scoring
Haberman, Shelby J. – Educational Testing Service, 2011
Alternative approaches are discussed for use of e-rater[R] to score the TOEFL iBT[R] Writing test. These approaches involve alternate criteria. In the 1st approach, the predicted variable is the expected rater score of the examinee's 2 essays. In the 2nd approach, the predicted variable is the expected rater score of 2 essay responses by the…
Descriptors: Writing Tests, Scoring, Essays, Language Tests
Peer reviewed
Katz, Irvin R.; Elliot, Norbert; Attali, Yigal; Scharf, Davida; Powers, Donald; Huey, Heather; Joshi, Kamal; Briller, Vladimir – ETS Research Report Series, 2008
This study presents an investigation of information literacy as defined by the ETS iSkills™ assessment and by the New Jersey Institute of Technology (NJIT) Information Literacy Scale (ILS). As two related but distinct measures, both iSkills and the ILS were used with undergraduate students at NJIT during the spring 2006 semester. Undergraduate…
Descriptors: Information Literacy, Information Skills, Skill Analysis, Case Studies
Peer reviewed
Kobrin, Jennifer L.; Deng, Hui; Shaw, Emily J. – Journal of Applied Testing Technology, 2007
This study was designed to address two frequent criticisms of the SAT essay--that essay length is the best predictor of scores, and that there is an advantage in using more "sophisticated" examples as opposed to personal experience. The study was based on 2,820 essays from the first three administrations of the new SAT. Each essay was…
Descriptors: Testing Programs, Computer Assisted Testing, Construct Validity, Writing Skills