Showing all 4 results
Peer reviewed
Susanne Dyck; Christian Klaes – npj Science of Learning, 2025
New information that is compatible with pre-existing knowledge can be learned faster. Such a schema memory effect has been reported in declarative memory and in explicit motor sequence learning (MSL). Here, we investigated whether sequences of key presses that were compatible with previously trained ones could be learned faster in an implicit MSL task.…
Descriptors: Learning Processes, Psychomotor Skills, Sequential Learning, Memory
Peer reviewed
Benjamin M. Rottman; Yiwen Zhang – Cognitive Research: Principles and Implications, 2025
Being able to notice that a cause-effect relation is getting stronger or weaker is important for adapting to one's environment and deciding how to use the cause in the future. We conducted an experiment in which participants learned about a cause-effect relation that either got stronger or weaker over time. The experiment was conducted with a…
Descriptors: Causal Models, Memory, Learning Processes, Time
Peer reviewed
Laura Ordonez Magro; Leonardo Pinto Arata; Joël Fagot; Jonathan Grainger; Arnaud Rey – Cognitive Science, 2025
Statistical learning allows us to implicitly create memory traces of recurring sequential patterns appearing in our environment. Here, we study the dynamics of how these sequential memory traces develop in a species of nonhuman primates (i.e., Guinea baboons, "Papio papio") that, unlike humans, cannot use language and verbal recoding…
Descriptors: Memory, Sequential Learning, Animals, Repetition
Peer reviewed
Stephen Ferrigno; Samuel J. Cheyette; Susan Carey – Cognitive Science, 2025
Complex sequences are ubiquitous in human mental life, structuring representations within many different cognitive domains: natural language, music, mathematics, and logic, to name a few. However, the representational and computational machinery used to learn abstract grammars and process complex sequences is unknown. Here, we used an artificial…
Descriptors: Sequential Learning, Cognitive Processes, Knowledge Representation, Training