Showing all 7 results
Peer reviewed
Clifton Pye – First Language, 2024
The Mayan language Mam uses complex predicates to express events. Complex predicates map multiple semantic elements onto a single word, and consequently have a blend of lexical and phrasal features. The chameleon-like nature of complex predicates provides a window on children's ability to express phrasal combinations at the one-word stage of…
Descriptors: Intonation, Suprasegmentals, American Indian Languages, Vowels
Peer reviewed
Shang Jiang; Anna Siyanova-Chanturia – First Language, 2024
A growing body of recent studies suggests that children, like adults, exhibit a processing advantage for formulaic language (e.g. "save energy") over novel language (e.g. "sell energy"), as well as sensitivity to phrase frequencies. Most of these studies are based on formulaic sequences in their canonical form. In…
Descriptors: Phrase Structure, Language Processing, Language Acquisition, Child Language
Peer reviewed
Jiao Du; Xiaowei He; Haopeng Yu – First Language, 2025
We used an elicited production task to explore the production of short and long passives in 15 Mandarin-speaking preschool children with Developmental Language Disorder (DLD; aged 4;2-5;11) in comparison with 15 Typically Developing Age-matched (TDA) children (aged 4;3-5;8) and 15 Typically Developing Younger (TDY) children (aged 3;2-4;3). This…
Descriptors: Mandarin Chinese, Form Classes (Languages), Child Language, Language Impairments
Peer reviewed
Hartshorne, Joshua K. – First Language, 2020
Ambridge argues that the existence of exemplar models for individual phenomena (words, inflection rules, etc.) suggests the feasibility of a unified, exemplars-everywhere model that eschews abstraction. The argument would be strengthened by a description of such a model. However, none is provided. I show that any attempt to do so would immediately…
Descriptors: Models, Language Acquisition, Language Processing, Bayesian Statistics
Morton, Ian; Schuele, C. Melanie – First Language, 2021
Preschoolers' earliest productions of sentential complement sentences have matrix clauses that are limited in form. Diessel proposed that matrix clauses in these early productions are propositionally empty fixed phrases that lack semantic and syntactic integration with the clausal complement. By 4 years of age, however, preschoolers produce…
Descriptors: Phrase Structure, Preschool Children, Semantics, Syntax
Peer reviewed
Schuler, Kathryn D.; Kodner, Jordan; Caplan, Spencer – First Language, 2020
In 'Against Stored Abstractions,' Ambridge uses neural and computational evidence to make his case against abstract representations. He argues that storing only exemplars is more parsimonious -- why bother with abstraction when exemplar models with on-the-fly calculation can do everything abstracting models can and more -- and implies that his…
Descriptors: Language Processing, Language Acquisition, Computational Linguistics, Linguistic Theory
Peer reviewed
Knabe, Melina L.; Vlach, Haley A. – First Language, 2020
Ambridge argues that there is widespread agreement among child language researchers that learners store linguistic abstractions. In this commentary the authors first argue that this assumption is incorrect; anti-representationalist/exemplar views are pervasive in theories of child language. Next, the authors outline what has been learned from this…
Descriptors: Child Language, Children, Language Acquisition, Models