Publication Date
| Date range | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 1 |
| Since 2017 (last 10 years) | 3 |
| Since 2007 (last 20 years) | 5 |
Descriptor
| Term | Count |
| --- | --- |
| Associative Learning | 5 |
| Comparative Analysis | 5 |
| Networks | 5 |
| Models | 4 |
| Semantics | 4 |
| Memory | 3 |
| Cognitive Processes | 2 |
| Computational Linguistics | 2 |
| Prediction | 2 |
| Task Analysis | 2 |
| Accuracy | 1 |
Author
| Name | Count |
| --- | --- |
| Gruenenfelder, Thomas M. | 2 |
| Jones, Michael N. | 2 |
| Recchia, Gabriel | 2 |
| Balota, David A. | 1 |
| Hamrick, Phillip | 1 |
| Kumar, Abhilasha A. | 1 |
| Rubin, Tim | 1 |
| Savic, Olivera | 1 |
| Sloutsky, Vladimir M. | 1 |
| Steyvers, Mark | 1 |
| Unger, Layla | 1 |
Publication Type
| Type | Count |
| --- | --- |
| Reports - Research | 5 |
| Journal Articles | 4 |
Education Level
| Level | Count |
| --- | --- |
| Higher Education | 2 |
| Postsecondary Education | 2 |
Location
| Place | Count |
| --- | --- |
| Missouri (Saint Louis) | 1 |
Kumar, Abhilasha A.; Balota, David A.; Steyvers, Mark – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2020
We examined 3 different network models of representing semantic knowledge (5,018-word directed and undirected step distance networks, and an association-correlation network) to predict lexical priming effects. In Experiment 1, participants made semantic relatedness judgments for word pairs with varying path lengths. Response latencies for…
Descriptors: Semantics, Networks, Correlation, Semitic Languages
Savic, Olivera; Unger, Layla; Sloutsky, Vladimir M. – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2022
Human word learning is remarkable: We not only learn thousands of words but also form organized semantic networks in which words are interconnected according to meaningful links, such as those between "apple," "juicy," and "pear." These links play key roles in our abilities to use language. How do words become…
Descriptors: Semantics, Vocabulary Development, Language Usage, Eye Movements
Jones, Michael N.; Gruenenfelder, Thomas M.; Recchia, Gabriel – Grantee Submission, 2017
Recent semantic space models learn vector representations for word meanings by observing statistical redundancies across a text corpus. A word's meaning is represented as a point in a high-dimensional semantic space, and semantic similarity between words is quantified by a function of their spatial proximity (typically the cosine of the angle…
Descriptors: Semantics, Computational Linguistics, Spatial Ability, Proximity
Gruenenfelder, Thomas M.; Recchia, Gabriel; Rubin, Tim; Jones, Michael N. – Cognitive Science, 2016
We compared the ability of three different contextual models of lexical semantic memory (BEAGLE, Latent Semantic Analysis, and the Topic model) and of a simple associative model (POC) to predict the properties of semantic networks derived from word association norms. None of the semantic models were able to accurately predict all of the network…
Descriptors: Memory, Semantics, Associative Learning, Networks
Hamrick, Phillip – Language Learning, 2014
Humans are remarkably sensitive to the statistical structure of language. However, different mechanisms have been proposed to account for such statistical sensitivities. The present study compared adult learning of syntax and the ability of two models of statistical learning to simulate human performance: Simple Recurrent Networks, which learn by…
Descriptors: Second Language Learning, Role, Syntax, Computational Linguistics