Showing 1 to 15 of 22 results
Peer reviewed
Direct link
Yue Huang; Joshua Wilson – Journal of Computer Assisted Learning, 2025
Background: Automated writing evaluation (AWE) systems, used as formative assessment tools in writing classrooms, are promising for enhancing instruction and improving student performance. Although meta-analytic evidence supports AWE's effectiveness in various contexts, research on its effectiveness in the U.S. K-12 setting has lagged behind its…
Descriptors: Writing Evaluation, Writing Skills, Writing Tests, Writing Instruction
Yi Gui – ProQuest LLC, 2024
This study explores using transfer learning in machine learning for natural language processing (NLP) to create generic automated essay scoring (AES) models, providing instant online scoring for statewide writing assessments in K-12 education. The goal is to develop an instant online scorer that is generalizable to any prompt, addressing the…
Descriptors: Writing Tests, Natural Language Processing, Writing Evaluation, Scoring
Peer reviewed
Direct link
Potter, Andrew; Wilson, Joshua – Educational Technology Research and Development, 2021
Automated Writing Evaluation (AWE) provides automatic writing feedback and scoring to support student writing and revising. The purpose of the present study was to analyze a statewide implementation of an AWE software (n = 114,582) in grades 4-11. The goals of the study were to evaluate: (1) to what extent AWE features were used; (2) if equity and…
Descriptors: Computer Assisted Testing, Writing Evaluation, Feedback (Response), Scoring
Soohye Yeom – ProQuest LLC, 2023
With the wide introduction of English-medium instruction (EMI) to higher education institutions throughout East Asian countries, many East Asian universities are using English proficiency tests that were not originally designed for this context to make admissions and placement decisions. To support the use of these tests in this new EMI context,…
Descriptors: English (Second Language), Language Tests, Second Language Learning, Writing Tests
Sterett H. Mercer; Joanna E. Cannon – Grantee Submission, 2022
We evaluated the validity of an automated approach to learning progress assessment (aLPA) for English written expression. Participants (n = 105) were students in Grades 2-12 who had parent-identified learning difficulties and received academic tutoring through a community-based organization. Participants completed narrative writing samples in the…
Descriptors: Elementary School Students, Secondary School Students, Learning Problems, Learning Disabilities
College Board, 2023
Over the past several years, content experts, psychometricians, and researchers have been hard at work developing, refining, and studying the digital SAT. The work is grounded in foundational best practices and advances in measurement and assessment design, with fairness for students informing all of the work done. This paper shares learnings from…
Descriptors: College Entrance Examinations, Psychometrics, Computer Assisted Testing, Best Practices
Peer reviewed
PDF on ERIC
Choi, Ikkyu; Hao, Jiangang; Deane, Paul; Zhang, Mo – ETS Research Report Series, 2021
"Biometrics" are physical or behavioral human characteristics that can be used to identify a person. It is widely known that keystroke or typing dynamics for short, fixed texts (e.g., passwords) could serve as a behavioral biometric. In this study, we investigate whether keystroke data from essay responses can lead to a reliable…
Descriptors: Accuracy, High Stakes Tests, Writing Tests, Benchmarking
Yue Huang – ProQuest LLC, 2023
Automated writing evaluation (AWE) is a cutting-edge technology-based intervention designed to help teachers meet their challenges in writing classrooms and improve students' writing proficiency. The fast development of AWE systems, along with the encouragement of technology use in the U.S. K-12 education system by the Common Core State Standards…
Descriptors: Computer Assisted Testing, Writing Tests, Automation, Writing Evaluation
Peer reviewed
Direct link
Tate, Tamara P.; Warschauer, Mark – Technology, Knowledge and Learning, 2019
The quality of students' writing skills continues to concern educators. Because writing is essential to success in both college and career, poor writing can have lifelong consequences. Writing is now primarily done digitally, but students receive limited explicit instruction in digital writing. This lack of instruction means that students fail to…
Descriptors: Writing Tests, Computer Assisted Testing, Writing Skills, Writing Processes
Peer reviewed
PDF on ERIC
National Assessment of Educational Progress (NAEP), 2018
The National Assessment of Educational Progress (NAEP) is the largest nationally representative and continuing assessment of what the nation's students know and can do in various subjects such as mathematics, reading, science, and writing, as well as civics, geography, technology and engineering literacy, and U.S. history. The results of NAEP are…
Descriptors: School Districts, National Competency Tests, Computer Assisted Testing, Educational Assessment
Allen, Laura K.; Likens, Aaron D.; McNamara, Danielle S. – Grantee Submission, 2018
The assessment of writing proficiency generally includes analyses of the specific linguistic and rhetorical features contained in the singular essays produced by students. However, researchers have recently proposed that an individual's ability to flexibly adapt the linguistic properties of their writing might more closely capture writing skill.…
Descriptors: Writing Evaluation, Writing Tests, Computer Assisted Testing, Writing Skills
Peer reviewed
Direct link
Laurie, Robert; Bridglall, Beatrice L.; Arseneault, Patrick – SAGE Open, 2015
The effect of using a computer or paper and pencil on student writing scores on a provincial standardized writing assessment was studied. A sample of 302 francophone students wrote a short essay using a computer equipped with Microsoft Word with all of its correction functions enabled. One week later, the same students wrote a second short essay…
Descriptors: Writing Evaluation, Writing Tests, Computer Assisted Testing, Writing Achievement
Peer reviewed
PDF on ERIC
Evanini, Keelan; Heilman, Michael; Wang, Xinhao; Blanchard, Daniel – ETS Research Report Series, 2015
This report describes the initial automated scoring results that were obtained using the constructed responses from the Writing and Speaking sections of the pilot forms of the "TOEFL Junior"® Comprehensive test administered in late 2011. For all of the items except one (the edit item in the Writing section), existing automated scoring…
Descriptors: Computer Assisted Testing, Automation, Language Tests, Second Language Learning
Peer reviewed
Direct link
Ling, Guangming – International Journal of Testing, 2016
To investigate possible iPad-related mode effects, we tested 403 8th graders in Indiana, Maryland, and New Jersey under three mode conditions through random assignment: a desktop computer, an iPad alone, and an iPad with an external keyboard. All students had used an iPad or computer for six months or longer. The 2-hour test included reading, math,…
Descriptors: Educational Testing, Computer Assisted Testing, Handheld Devices, Computers
Harris, Connie – ProQuest LLC, 2013
Despite the efforts of a number of national organizations focused on improving writing literacy, student writing skills have shown little improvement over the past decade and have not kept pace with the growing demands of the workplace. Based on constructivist learning theory and the belief that students become better writers through continuous…
Descriptors: Middle School Students, Grade 8, Writing Instruction, Writing Evaluation