ERIC Number: EJ1484284
Record Type: Journal
Publication Date: 2025-Sep
Pages: 38
Abstractor: As Provided
ISBN: N/A
ISSN: 0364-0213
EISSN: 1551-6709
Available Date: 2025-09-15
Do Humans Use Push-Down Stacks When Learning or Producing Center-Embedded Sequences?
Stephen Ferrigno (1,2); Samuel J. Cheyette (3); Susan Carey (2)
Cognitive Science, v49 n9 e70112 2025
Complex sequences are ubiquitous in human mental life, structuring representations within many different cognitive domains: natural language, music, mathematics, and logic, to name a few. However, the representational and computational machinery used to learn abstract grammars and process complex sequences is unknown. Here, we used an artificial grammar learning task to study how adults abstract center-embedded and cross-serial grammars that generalize beyond the level of embedding of the training sequences. We tested untrained generalizations to longer sequence lengths and used error patterns, item-to-item response times, and a Bayesian mixture model to test two possible memory architectures that might underlie the sequence representations of each grammar: stacks and queues. We find that adults learned both grammars, that the cross-serial grammar was easier to learn and produce than the matched center-embedded grammar, and that item-to-item touch times during sequence generation differed systematically between the two types of sequences. Contrary to widely held assumptions, we find no evidence that a stack architecture is used to generate center-embedded sequences in an indexed AⁿBⁿ artificial grammar. Instead, the data and modeling converged on the conclusion that both center-embedded and cross-serial sequences are generated using a queue memory architecture: participants stored items in a first-in-first-out memory and then accessed them via an iterative search over the stored list to generate the matched base pairs of center-embedded or cross-serial sequences.
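The stack-versus-queue contrast the abstract describes can be sketched in code. The following is an illustrative sketch, not the authors' model: a stack (last-in-first-out) naturally yields the mirrored B-order of center-embedded sequences (A1 A2 A3 B3 B2 B1), a queue (first-in-first-out) naturally yields the cross-serial order (A1 A2 A3 B1 B2 B3), and the paper's conclusion, that center-embedded output also comes from a FIFO store accessed by iterating over the stored list, is sketched in the third function. All function names are hypothetical.

```python
from collections import deque

def center_embedded_with_stack(pairs):
    """Mirror order A1..An Bn..B1: a stack (LIFO) pops B's in reverse."""
    stack, out = [], []
    for a, b in pairs:
        out.append(a)       # emit each A as it arrives
        stack.append(b)     # push its matched B
    while stack:
        out.append(stack.pop())  # last-in, first-out -> reversed B's
    return out

def cross_serial_with_queue(pairs):
    """Cross-serial order A1..An B1..Bn: a queue (FIFO) preserves B order."""
    q, out = deque(), []
    for a, b in pairs:
        out.append(a)
        q.append(b)         # enqueue B's in presentation order
    while q:
        out.append(q.popleft())  # first-in, first-out -> same order
    return out

def center_embedded_with_queue(pairs):
    """The paper's proposed mechanism, sketched: B's are stored FIFO, and
    the mirror order is produced by an iterative search over the stored
    list from its end, rather than by a push-down stack."""
    q, out = deque(), []
    for a, b in pairs:
        out.append(a)
        q.append(b)
    for b in reversed(q):   # iterate over the stored list back to front
        out.append(b)
    return out

pairs = [("A1", "B1"), ("A2", "B2"), ("A3", "B3")]
print(center_embedded_with_stack(pairs))  # ['A1', 'A2', 'A3', 'B3', 'B2', 'B1']
print(cross_serial_with_queue(pairs))     # ['A1', 'A2', 'A3', 'B1', 'B2', 'B3']
print(center_embedded_with_queue(pairs))  # ['A1', 'A2', 'A3', 'B3', 'B2', 'B1']
```

Note that the first and third functions produce identical output; the study's behavioral signatures (error patterns and item-to-item times) are what distinguish which storage-and-access scheme people actually use.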
Descriptors: Sequential Learning, Cognitive Processes, Knowledge Representation, Training, Generalization, Error Patterns, Reaction Time, Bayesian Statistics, Memory, Learning Processes, Grammar, Artificial Intelligence
Wiley. Available from: John Wiley & Sons, Inc. 111 River Street, Hoboken, NJ 07030. Tel: 800-835-6770; e-mail: cs-journals@wiley.com; Web site: https://www.wiley.com/en-us
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Authoring Institution: N/A
Grant or Contract Numbers: F32HD101208
Author Affiliations: (1) Department of Psychology, University of Wisconsin-Madison; (2) Department of Psychology, Harvard University; (3) Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology

Peer reviewed
