Monteiro, Kátia; Crossley, Scott; Botarleanu, Robert-Mihai; Dascalu, Mihai – Language Testing, 2023
Lexical frequency benchmarks have been extensively used to investigate second language (L2) lexical sophistication, especially in language assessment studies. However, indices based on semantic co-occurrence, which may be a better representation of the experience language users have with lexical items, have not been sufficiently tested as…
Descriptors: Second Language Learning, Second Languages, Native Language, Semantics
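The frequency-benchmark approach this entry contrasts with co-occurrence indices can be sketched in a few lines: score a text by the average (log) corpus frequency of its words, so rarer vocabulary yields a lower score and signals higher lexical sophistication. The tiny frequency table below is a made-up stand-in for a real reference corpus, not data from the study.

```python
# Toy frequency-based lexical sophistication index: average the log
# corpus frequency of each word in a text. Lower mean = rarer words.
from math import log

# Hypothetical per-million word frequencies (illustrative only).
FREQ = {"the": 50000, "cat": 120, "sat": 80, "feline": 4, "reposed": 1}

def mean_log_frequency(text, freq=FREQ, floor=0.5):
    """Mean log frequency; unknown words fall back to a small floor."""
    words = text.lower().split()
    return sum(log(freq.get(w, floor)) for w in words) / len(words)

simple = "the cat sat"
sophisticated = "the feline reposed"
print(mean_log_frequency(simple))        # higher score: frequent words
print(mean_log_frequency(sophisticated)) # lower score: rarer vocabulary
```

With a real frequency list this is a standard baseline index; the entry's point is that co-occurrence-based measures may capture a learner's lexical experience better than raw frequency alone.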
Unger, Layla; Yim, Hyungwook; Savic, Olivera; Dennis, Simon; Sloutsky, Vladimir M. – Developmental Science, 2023
Recent years have seen a flourishing of Natural Language Processing models that can mimic many aspects of human language fluency. These models harness a simple, decades-old idea: It is possible to learn a lot about word meanings just from exposure to language, because words similar in meaning are used in language in similar ways. The successes of…
Descriptors: Natural Language Processing, Language Usage, Vocabulary Development, Linguistic Input
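The "decades-old idea" this entry describes — that words similar in meaning are used in similar ways — can be illustrated with a minimal co-occurrence model: count each word's context words within a small window, then compare words by cosine similarity of those count vectors. The toy corpus and window size below are assumptions for demonstration, not the models or data from the study.

```python
# Minimal distributional-semantics sketch: words used in similar
# contexts end up with similar co-occurrence vectors.
from collections import Counter
from math import sqrt

corpus = [
    "the cat chased the mouse",
    "the dog chased the mouse",
    "the cat ate the fish",
    "the dog ate the bone",
]

def cooccurrence_vectors(sentences, window=2):
    """Map each word to a Counter of words seen within +/- window."""
    vectors = {}
    for sentence in sentences:
        tokens = sentence.split()
        for i, word in enumerate(tokens):
            ctx = vectors.setdefault(word, Counter())
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    ctx[tokens[j]] += 1
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[w] * v[w] for w in set(u) & set(v))
    norm = sqrt(sum(c * c for c in u.values())) * sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

vecs = cooccurrence_vectors(corpus)
# "cat" and "dog" occur in near-identical contexts, so they score high,
# even though the two words never appear in the same sentence.
print(cosine(vecs["cat"], vecs["dog"]))
print(cosine(vecs["cat"], vecs["mouse"]))
```

Exposure to language alone is enough for this similarity to emerge, which is the core observation behind the NLP models the entry discusses.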
Jiang, Hang; Frank, Michael C.; Kulkarni, Vivek; Fourtassi, Abdellah – Cognitive Science, 2022
The linguistic input children receive across early childhood plays a crucial role in shaping their knowledge about the world. To study this input, researchers have begun applying distributional semantic models to large corpora of child-directed speech, extracting various patterns of word use/co-occurrence. Previous work using these models has not…
Descriptors: Caregivers, Caregiver Child Relationship, Linguistic Input, Semantics
Li Nguyen; Oliver Mayeux; Zheng Yuan – International Journal of Multilingualism, 2024
Multilingualism presents both a challenge and an opportunity for Natural Language Processing, with code-switching representing a particularly interesting problem for computational models trained on monolingual datasets. In this paper, we explore how code-switched data affects the task of Machine Translation, a task which only recently has started…
Descriptors: Code Switching (Language), Vietnamese, English (Second Language), Second Language Learning
Nicula, Bogdan; Dascalu, Mihai; Newton, Natalie N.; Orcutt, Ellen; McNamara, Danielle S. – Grantee Submission, 2021
Learning to paraphrase supports both writing ability and reading comprehension, particularly for less skilled learners. As such, educational tools that integrate automated evaluations of paraphrases can be used to provide timely feedback to enhance learner paraphrasing skills more efficiently and effectively. Paraphrase identification is a popular…
Descriptors: Computational Linguistics, Feedback (Response), Classification, Learning Processes
Nicula, Bogdan; Dascalu, Mihai; Newton, Natalie; Orcutt, Ellen; McNamara, Danielle S. – Grantee Submission, 2021
The ability to automatically assess the quality of paraphrases can be very useful for facilitating literacy skills and providing timely feedback to learners. Our aim is twofold: a) to automatically evaluate the quality of paraphrases across four dimensions: lexical similarity, syntactic similarity, semantic similarity and paraphrase quality, and…
Descriptors: Phrase Structure, Networks, Semantics, Feedback (Response)
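The multi-dimensional scoring described in this entry can be sketched with simple proxies: Jaccard token overlap as a lexical-similarity measure and word-sequence matching as a crude structural one. The cited work uses trained models across four dimensions; the two metrics and example sentences below are illustrative assumptions only.

```python
# Sketch of scoring a candidate paraphrase along separate dimensions.
from difflib import SequenceMatcher

def lexical_similarity(a, b):
    """Jaccard overlap of the two token sets."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def sequence_similarity(a, b):
    """Matching-run ratio over word sequences: a rough structure proxy."""
    return SequenceMatcher(None, a.lower().split(), b.lower().split()).ratio()

source = "The quick brown fox jumps over the lazy dog"
paraphrase = "A fast brown fox leaps over a lazy dog"
copy = "The quick brown fox jumps over the lazy dog"

for label, cand in [("verbatim copy", copy), ("paraphrase", paraphrase)]:
    lex = lexical_similarity(source, cand)
    seq = sequence_similarity(source, cand)
    print(f"{label}: lexical={lex:.2f} sequence={seq:.2f}")
```

A verbatim copy maxes out lexical overlap, while a genuine paraphrase scores lower on it yet preserves meaning — which is why quality assessment needs more than a single similarity score.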

