ERIC Number: ED629937
Record Type: Non-Journal
Publication Date: 2022
Pages: 13
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
Assessing Readability by Filling Cloze Items with Transformers
Grantee Submission, Paper presented at the International Conference on Artificial Intelligence in Education (23rd, Durham, UK, Jul 27-31, 2022)
Cloze items are a foundational approach to assessing readability. However, they require human data collection, which makes them impractical for automated metrics. The present study revisits the idea of assessing readability with cloze items and compares human cloze scores and readability judgments with predictions made by T5, a popular transformer-based deep learning architecture, on three corpora. Across all corpora, T5 predictions correlated significantly with human cloze scores and readability judgments, and in predictive models they could be used interchangeably with average word length, a common readability predictor. For two of the corpora, combining T5 and Flesch reading ease predictors improved model fit for both human cloze scores and readability judgments.
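The sketch below illustrates the core operation the abstract describes: having T5 fill a cloze blank so the model's prediction can stand in for a human cloze response. It is a minimal illustration, not the authors' pipeline, and assumes the Hugging Face transformers library (with sentencepiece installed) and the public t5-base checkpoint; the blank is marked with T5's <extra_id_0> sentinel token.

```python
# Minimal sketch (assumed setup, not the study's actual code): fill one cloze
# blank with T5 via Hugging Face transformers and the "t5-base" checkpoint.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# A cloze item: one word removed and replaced by T5's span sentinel.
text = "The quick brown <extra_id_0> jumps over the lazy dog."
inputs = tokenizer(text, return_tensors="pt")

# T5 generates a filled span, typically framed by sentinel tokens
# ("<extra_id_0> fox <extra_id_1>"); skipping special tokens leaves the fill.
outputs = model.generate(**inputs, max_new_tokens=5)
filled = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(filled)  # e.g. "fox" -- the model's prediction for the blank
```

A readability metric along these lines would score many such blanks per passage and compare the model's fills (or their probabilities) against human cloze responses, which is the kind of comparison the study reports.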
Descriptors: Readability, Cloze Procedure, Scores, Prediction, Models, Readability Formulas, Correlation, Outcome Measures
Publication Type: Speeches/Meeting Papers; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: National Science Foundation (NSF); Institute of Education Sciences (ED)
Authoring Institution: N/A
Identifiers - Assessments and Surveys: Flesch Reading Ease Formula
IES Funded: Yes
Grant or Contract Numbers: 1918751; 1934745; R305A190448
Author Affiliations: N/A