Showing 1 to 15 of 20 results
Peer reviewed
Tal Ness; Valerie J. Langlois; Albert E. Kim; Jared M. Novick – Perspectives on Psychological Science, 2025
Understanding language requires readers and listeners to cull meaning from fast-unfolding messages that often contain conflicting cues pointing to incompatible ways of interpreting the input (e.g., "The cat was chased by the mouse"). This article reviews mounting evidence from multiple methods demonstrating that cognitive control plays…
Descriptors: Cognitive Ability, Language Processing, Psycholinguistics, Cues
Qihui Xu – ProQuest LLC, 2022
How early do children produce multiword utterances? Do children's early utterances reflect abstract syntactic knowledge or are they the result of data-driven learning? We examine this issue through corpus analysis, computational modeling, and adult simulation experiments. Chapter 1 investigates when children start producing multiword utterances;…
Descriptors: Language Acquisition, Speech Communication, Computational Linguistics, Syntax
Peer reviewed
Valentini, Alessandra; Serratrice, Ludovica – Cognitive Science, 2021
Strong correlations between vocabulary and grammar are well attested in language development in monolingual and bilingual children. What is less clear is whether there is any directionality in the relationship between the two constructs, whether it is predictive over time, and the extent to which it is affected by language input. In the present…
Descriptors: Bilingualism, Correlation, English (Second Language), Second Language Learning
Peer reviewed
Mahowald, Kyle; Kachergis, George; Frank, Michael C. – First Language, 2020
Ambridge calls for exemplar-based accounts of language acquisition. Do modern neural networks such as transformers or word2vec -- which have been extremely successful in modern natural language processing (NLP) applications -- count? Although these models often have ample parametric complexity to store exemplars from their training data, they also…
Descriptors: Models, Language Processing, Computational Linguistics, Language Acquisition
Peer reviewed
Chandler, Steve – First Language, 2020
Ambridge reviews and augments an impressive body of research demonstrating both the advantages and the necessity of an exemplar-based model of knowledge of one's language. He cites three computational models that have been applied successfully to issues of phonology and morphology. Focusing on Ambridge's discussion of sentence-level constructions,…
Descriptors: Models, Figurative Language, Language Processing, Language Acquisition
Jennifer Hu – ProQuest LLC, 2023
Language is one of the hallmarks of intelligence, demanding explanation in a theory of human cognition. However, language presents unique practical challenges for quantitative empirical research, making many linguistic theories difficult to test at naturalistic scales. Artificial neural network language models (LMs) provide a new tool for studying…
Descriptors: Linguistic Theory, Computational Linguistics, Models, Language Research
Nicula, Bogdan; Dascalu, Mihai; Newton, Natalie N.; Orcutt, Ellen; McNamara, Danielle S. – Grantee Submission, 2021
Learning to paraphrase supports both writing ability and reading comprehension, particularly for less skilled learners. As such, educational tools that integrate automated evaluations of paraphrases can be used to provide timely feedback to enhance learner paraphrasing skills more efficiently and effectively. Paraphrase identification is a popular…
Descriptors: Computational Linguistics, Feedback (Response), Classification, Learning Processes
Nicula, Bogdan; Dascalu, Mihai; Newton, Natalie; Orcutt, Ellen; McNamara, Danielle S. – Grantee Submission, 2021
The ability to automatically assess the quality of paraphrases can be very useful for facilitating literacy skills and providing timely feedback to learners. Our aim is twofold: a) to automatically evaluate the quality of paraphrases across four dimensions: lexical similarity, syntactic similarity, semantic similarity and paraphrase quality, and…
Descriptors: Phrase Structure, Networks, Semantics, Feedback (Response)
Peer reviewed
Westergaard, Marit – Second Language Research, 2021
In this article, I argue that first language (L1), second language (L2) and third language (L3) acquisition are fundamentally the same process, based on learning by parsing. Both child and adult learners are sensitive to fine linguistic distinctions, and language development takes place in small steps. While the bulk of the article focuses on…
Descriptors: Multilingualism, Linguistic Input, Second Language Learning, Native Language
Peer reviewed
Hicks, Glyn; Domínguez, Laura – Second Language Research, 2020
This article proposes a formal model of the human language faculty that accommodates the possibility of 'attrition' (modification or loss) of morphosyntactic properties in a first language. Modeling L1 grammatical attrition entails a quite fundamental paradox: if the structure of the language faculty in principle allows for attrition of…
Descriptors: Grammar, Native Language, Language Skill Attrition, Models
Peer reviewed
Bader, Markus; Meng, Michael – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2018
Most current models of sentence comprehension assume that the human parsing mechanism (HPM) algorithmically computes detailed syntactic representations as basis for extracting sentence meaning. These models share the assumption that the representations computed by the HPM accurately reflect the linguistic input. This assumption has been challenged…
Descriptors: Sentences, Misconceptions, Comprehension, Models
Lifeng Jin – ProQuest LLC, 2020
Syntactic structures are unobserved theoretical constructs which are useful in explaining a wide range of linguistic and psychological phenomena. Language acquisition studies how such latent structures are acquired by human learners through many hypothesized learning mechanisms and apparatuses, which can be genetically endowed or of general…
Descriptors: Syntax, Computational Linguistics, Learning Processes, Models
Peer reviewed
Ouyang, Long; Boroditsky, Lera; Frank, Michael C. – Cognitive Science, 2017
Computational models have shown that purely statistical knowledge about words' linguistic contexts is sufficient to learn many properties of words, including syntactic and semantic category. For example, models can infer that "postman" and "mailman" are semantically similar because they have quantitatively similar patterns of…
Descriptors: Semiotics, Computational Linguistics, Syntax, Semantics
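The distributional idea in the Ouyang, Boroditsky, and Frank abstract above — that "postman" and "mailman" come out semantically similar because their patterns of co-occurrence with other words are quantitatively alike — can be illustrated with a toy sketch. All counts and context words below are invented for demonstration; they are not taken from the article.

```python
import math

# Invented co-occurrence counts: how often each target word appears
# near each context word in some hypothetical corpus.
contexts = ["deliver", "letter", "bark", "run"]
cooccur = {
    "postman": [9, 8, 1, 2],
    "mailman": [8, 9, 1, 3],
    "dog":     [1, 1, 9, 7],
}

def cosine(u, v):
    """Cosine similarity between two count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(a * a for a in v))
    return dot / (norm_u * norm_v)

# "postman" and "mailman" have similar context profiles, so their
# cosine similarity is high; "postman" vs. "dog" is low.
print(cosine(cooccur["postman"], cooccur["mailman"]))  # high
print(cosine(cooccur["postman"], cooccur["dog"]))      # low
```

This is the simplest form of the statistical knowledge the abstract describes; models such as word2vec learn comparable context-based representations at scale rather than from raw count vectors.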
Peer reviewed
Janciauskas, Marius; Chang, Franklin – Cognitive Science, 2018
Language learning requires linguistic input, but several studies have found that knowledge of second language (L2) rules does not seem to improve with more language exposure (e.g., Johnson & Newport, 1989). One reason for this is that previous studies did not factor out variation due to the different rules tested. To examine this issue, we…
Descriptors: Linguistic Input, Second Language Learning, Age Differences, Syntax
Peer reviewed
Rissman, Lilia; Goldin-Meadow, Susan – Language Learning and Development, 2017
Across a diverse range of languages, children proceed through similar stages in their production of causal language: their initial verbs lack internal causal structure, followed by a period during which they produce causative overgeneralizations, indicating knowledge of a productive causative rule. We asked in this study whether a child not…
Descriptors: Verbs, Language Acquisition, Linguistic Input, Child Language