Publication Date
| Period | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 5 |
| Since 2007 (last 20 years) | 21 |
Descriptor
| Descriptor | Count |
| --- | --- |
| High School Students | 21 |
| Intelligent Tutoring Systems | 21 |
| Natural Language Processing | 21 |
| Essays | 10 |
| Feedback (Response) | 7 |
| Reading Comprehension | 7 |
| Writing Evaluation | 7 |
| Reading Tests | 6 |
| Writing Skills | 6 |
| Educational Technology | 5 |
| Pretests Posttests | 5 |
Source
| Source | Count |
| --- | --- |
| Grantee Submission | 14 |
| International Educational… | 3 |
| Journal of Educational… | 2 |
| Journal of Educational Data… | 1 |
| Society for Research on… | 1 |
Author
| Author | Count |
| --- | --- |
| McNamara, Danielle S. | 14 |
| Allen, Laura K. | 8 |
| Snow, Erica L. | 7 |
| Katz, Sandra | 5 |
| Crossley, Scott A. | 4 |
| Jordan, Pamela | 3 |
| Varner, Laura K. | 3 |
| Albacete, Patricia | 2 |
| Albacete, Patricia L. | 2 |
| Crossley, Scott | 2 |
| Likens, Aaron D. | 2 |
Publication Type
| Publication Type | Count |
| --- | --- |
| Reports - Research | 18 |
| Speeches/Meeting Papers | 11 |
| Journal Articles | 4 |
| Tests/Questionnaires | 2 |
| Collected Works - Proceedings | 1 |
| Reports - Descriptive | 1 |
| Reports - Evaluative | 1 |
Education Level
| Education Level | Count |
| --- | --- |
| High Schools | 20 |
| Secondary Education | 17 |
| Higher Education | 3 |
| Postsecondary Education | 3 |
| Elementary Education | 2 |
| Grade 8 | 2 |
| Junior High Schools | 2 |
| Middle Schools | 2 |
| Grade 4 | 1 |
| Grade 5 | 1 |
| Grade 6 | 1 |
Audience
| Audience | Count |
| --- | --- |
| Researchers | 1 |
| Teachers | 1 |
Assessments and Surveys
| Assessment | Count |
| --- | --- |
| Gates MacGinitie Reading Tests | 7 |
| Writing Apprehension Test | 1 |
Crossley, Scott; Kyle, Kristopher; Davenport, Jodi; McNamara, Danielle S. – International Educational Data Mining Society, 2016
This study introduces the Constructed Response Analysis Tool (CRAT), a freely available tool to automatically assess student responses in online tutoring systems. The study tests CRAT on a dataset of chemistry responses collected in the ChemVLab+. The findings indicate that CRAT can differentiate and classify student responses based on semantic…
Descriptors: Intelligent Tutoring Systems, Chemistry, Natural Language Processing, High School Students
Jordan, Pamela; Albacete, Patricia; Katz, Sandra – Grantee Submission, 2016
We explore the effectiveness of a simple algorithm for adaptively deciding whether to further decompose a step in a line of reasoning during tutorial dialogue. We compare two versions of a tutorial dialogue system, Rimac: one that always decomposes a step to its simplest sub-steps and one that adaptively decides to decompose a step based on a…
Descriptors: Algorithms, Decision Making, Intelligent Tutoring Systems, Scaffolding (Teaching Technique)
Allen, Laura K.; Likens, Aaron D.; McNamara, Danielle S. – Grantee Submission, 2018
The assessment of writing proficiency generally includes analyses of the specific linguistic and rhetorical features contained in the singular essays produced by students. However, researchers have recently proposed that an individual's ability to flexibly adapt the linguistic properties of their writing might more closely capture writing skill.…
Descriptors: Writing Evaluation, Writing Tests, Computer Assisted Testing, Writing Skills
Katz, Sandra; Albacete, Patricia; Jordan, Pamela – Grantee Submission, 2016
This poster reports on a study that compared three types of summaries at the end of natural-language tutorial dialogues and a no-dialogue control, to determine which type of summary, if any, best predicted learning gains. Although we found no significant differences between conditions, analyses of gender differences indicate that female students…
Descriptors: Natural Language Processing, Intelligent Tutoring Systems, Reflection, Dialogs (Language)
Allen, Laura K.; Likens, Aaron D.; McNamara, Danielle S. – Grantee Submission, 2018
The assessment of argumentative writing generally includes analyses of the specific linguistic and rhetorical features contained in the individual essays produced by students. However, researchers have recently proposed that an individual's ability to flexibly adapt the linguistic properties of their writing may more accurately capture their…
Descriptors: Writing (Composition), Persuasive Discourse, Essays, Language Usage
Michalenko, Joshua J.; Lan, Andrew S.; Waters, Andrew E.; Grimaldi, Philip J.; Baraniuk, Richard G. – International Educational Data Mining Society, 2017
An important, yet largely unstudied problem in student data analysis is to detect "misconceptions" from students' responses to "open-response" questions. Misconception detection enables instructors to deliver more targeted feedback on the misconceptions exhibited by many students in their class, thus improving the quality of…
Descriptors: Data Analysis, Misconceptions, Student Attitudes, Feedback (Response)
Crossley, Scott; Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2015
This study investigates a new approach to automatically assessing essay quality that combines traditional approaches based on assessing textual features with new approaches that measure student attributes such as demographic information, standardized test scores, and survey results. The results demonstrate that combining both text features and…
Descriptors: Automation, Scoring, Essays, Evaluation Methods
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2015
This study builds upon previous work aimed at developing a student model of reading comprehension ability within the intelligent tutoring system, iSTART. Currently, the system evaluates students' self-explanation performance using a local, sentence-level algorithm and does not adapt content based on reading ability. The current study leverages…
Descriptors: Reading Comprehension, Reading Skills, Natural Language Processing, Intelligent Tutoring Systems
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2014
In the current study, we utilize natural language processing techniques to examine relations between the linguistic properties of students' self-explanations and their reading comprehension skills. Linguistic features of students' aggregated self-explanations were analyzed using the Linguistic Inquiry and Word Count (LIWC) software. Results…
Descriptors: Natural Language Processing, Reading Comprehension, Linguistics, Predictor Variables
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2016
A commonly held belief among educators, researchers, and students is that high-quality texts are easier to read than low-quality texts, as they contain more engaging narrative and story-like elements. Interestingly, these assumptions have typically failed to be supported by the literature on writing. Previous research suggests that higher quality…
Descriptors: Role, Writing (Composition), Natural Language Processing, Hypothesis Testing
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Journal of Educational Psychology, 2016
A commonly held belief among educators, researchers, and students is that high-quality texts are easier to read than low-quality texts, as they contain more engaging narrative and story-like elements. Interestingly, these assumptions have typically failed to be supported by the literature on writing. Previous research suggests that higher quality…
Descriptors: Role, Writing (Composition), Natural Language Processing, Hypothesis Testing
Crossley, Scott A.; McNamara, Danielle S. – Grantee Submission, 2014
This study explores correlations between human ratings of essay quality and component scores based on similar natural language processing indices and weighted through a principal component analysis. The results demonstrate that such component scores show small to large effects with human ratings and thus may be suitable to providing both summative…
Descriptors: Essays, Computer Assisted Testing, Writing Evaluation, Scores
Varner, Laura K.; Jackson, G. Tanner; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2013
This study expands upon an existing model of students' reading comprehension ability within an intelligent tutoring system. The current system evaluates students' natural language input using a local student model. We examine the potential to expand this model by assessing the linguistic features of self-explanations aggregated across entire…
Descriptors: Reading Comprehension, Intelligent Tutoring Systems, Natural Language Processing, Reading Ability
Crossley, Scott A.; Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Journal of Educational Data Mining, 2016
This study investigates a novel approach to automatically assessing essay quality that combines natural language processing approaches that assess text features with approaches that assess individual differences in writers such as demographic information, standardized test scores, and survey results. The results demonstrate that combining text…
Descriptors: Essays, Scoring, Writing Evaluation, Natural Language Processing
Jacovina, Matthew E.; McNamara, Danielle S. – Grantee Submission, 2017
In this chapter, we describe several intelligent tutoring systems (ITSs) designed to support student literacy through reading comprehension and writing instruction and practice. Although adaptive instruction can be a powerful tool in the literacy domain, developing these technologies poses significant challenges. For example, evaluating the…
Descriptors: Intelligent Tutoring Systems, Literacy Education, Educational Technology, Technology Uses in Education
