Showing all 7 results
Peer reviewed
Laura Ordonez Magro; Leonardo Pinto Arata; Joël Fagot; Jonathan Grainger; Arnaud Rey – Cognitive Science, 2025
Statistical learning allows us to implicitly create memory traces of recurring sequential patterns appearing in our environment. Here, we study the dynamics of how these sequential memory traces develop in a species of nonhuman primates (i.e., Guinea baboons, "Papio papio") that, unlike humans, cannot use language and verbal recoding…
Descriptors: Memory, Sequential Learning, Animals, Repetition
Peer reviewed
Maurício D. Martins; Zoe Bergmann; Elena Leonova; Roberta Bianco; Daniela Sammler; Arno Villringer – Cognitive Science, 2025
Recursive hierarchical embedding allows humans to generate multiple hierarchical levels using simple rules. We can acquire recursion from exposure to linguistic and visual examples, but only develop the ability to understand "multiple-level" structures like "[[second] red] ball" after mastering "same-level"…
Descriptors: Psychomotor Skills, Adults, Adult Learning, Learning Processes
Peer reviewed
Samuel Schmid; Douglas Saddy; Julie Franck – Cognitive Science, 2023
In this article, we explore the extraction of recursive nested structure in the processing of binary sequences. Our aim was to determine whether humans learn the higher-order regularities of a highly simplified input where only sequential-order information marks the hierarchical structure. To this end, we implemented a sequence generated by the…
Descriptors: Learning Processes, Sequential Learning, Grammar, Language Processing
Peer reviewed
Samantha N. Emerson; Christopher M. Conway – Cognitive Science, 2023
There are two main approaches to how statistical patterns are extracted from sequences: The transitional probability approach proposes that statistical learning occurs through the computation of probabilities between items in a sequence. The chunking approach, including models such as PARSER and TRACX, proposes that units are extracted as chunks.…
Descriptors: Statistics Education, Learning Processes, Learning Theories, Pattern Recognition
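The transitional-probability approach described in the abstract above can be illustrated with a minimal sketch. This is a hypothetical, simplified example of estimating P(next item | current item) from adjacent pairs in a syllable stream; it is not the authors' implementation, nor the PARSER or TRACX chunking models the abstract contrasts it with.

```python
from collections import Counter, defaultdict

def transitional_probabilities(sequence):
    """Estimate P(next | current) from counts of adjacent pairs."""
    pair_counts = Counter(zip(sequence, sequence[1:]))
    first_counts = Counter(sequence[:-1])
    probs = defaultdict(dict)
    for (a, b), n in pair_counts.items():
        probs[a][b] = n / first_counts[a]
    return dict(probs)

# Hypothetical syllable stream containing a recurring "word" (tu-pi-ro):
# within-word transitions are more predictable than across-word ones.
stream = ["tu", "pi", "ro", "go", "la", "bu",
          "tu", "pi", "ro", "bi", "da", "ku"]
p = transitional_probabilities(stream)
# "tu" is always followed by "pi", so P(pi | tu) = 1.0,
# while "ro" is followed by "go" or "bi", each with probability 0.5.
```

On this view, a learner segments the stream at low-probability transitions (word boundaries) and keeps high-probability runs together.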
Peer reviewed
Stephen Ferrigno; Samuel J. Cheyette; Susan Carey – Cognitive Science, 2025
Complex sequences are ubiquitous in human mental life, structuring representations within many different cognitive domains--natural language, music, mathematics, and logic, to name a few. However, the representational and computational machinery used to learn abstract grammars and process complex sequences is unknown. Here, we used an artificial…
Descriptors: Sequential Learning, Cognitive Processes, Knowledge Representation, Training
Peer reviewed
Fabian Tomaschek; Michael Ramscar; Jessie S. Nixon – Cognitive Science, 2024
Sequence learning is fundamental to a wide range of cognitive functions. Explaining how sequences--and the relations between the elements they comprise--are learned is a fundamental challenge to cognitive science. However, although hundreds of articles addressing this question are published each year, the actual learning mechanisms involved in the…
Descriptors: Sequential Learning, Learning Processes, Serial Learning, Executive Function
Peer reviewed
Hongjing Lu; Randall R. Rojas; Tom Beckers; Alan L. Yuille – Cognitive Science, 2016
Two key research issues in the field of causal learning are how people acquire causal knowledge when observing data that are presented sequentially, and the level of abstraction at which learning takes place. Does sequential causal learning solely involve the acquisition of specific cause-effect links, or do learners also acquire knowledge about…
Descriptors: Learning Processes, Causal Models, Sequential Learning, Abstract Reasoning