| Publication Date | Results |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 5 |
| Since 2007 (last 20 years) | 8 |
| Descriptor | Results |
| --- | --- |
| Models | 9 |
| Semantics | 9 |
| Memory | 6 |
| Cognitive Processes | 3 |
| Comparative Analysis | 3 |
| Computational Linguistics | 3 |
| Language Processing | 3 |
| Psycholinguistics | 3 |
| Associative Learning | 2 |
| Information Retrieval | 2 |
| Learning Processes | 2 |
| Source | Results |
| --- | --- |
| Grantee Submission | 4 |
| Cognitive Science | 2 |
| Cognitive Psychology | 1 |
| Journal of Memory and Language | 1 |
| Psychological Review | 1 |
| Author | Results |
| --- | --- |
| Jones, Michael N. | 9 |
| Johns, Brendan T. | 4 |
| Mewhort, Douglas J. K. | 3 |
| Gruenenfelder, Thomas M. | 2 |
| Recchia, Gabriel | 2 |
| Dye, Melody | 1 |
| Hills, Thomas T. | 1 |
| Kintsch, Walter | 1 |
| Mewhort, D. J. K. | 1 |
| Rubin, Tim | 1 |
| Todd, Peter M. | 1 |
| Publication Type | Results |
| --- | --- |
| Reports - Research | 8 |
| Journal Articles | 6 |
| Information Analyses | 1 |
| Reports - Evaluative | 1 |
| Education Level | Results |
| --- | --- |
| Higher Education | 1 |
| Location | Results |
| --- | --- |
| Indiana | 1 |
| Assessments and Surveys | Results |
| --- | --- |
| Test of English as a Foreign… | 1 |
Johns, Brendan T.; Mewhort, Douglas J. K.; Jones, Michael N. – Cognitive Science, 2019
Distributional models of semantics learn word meanings from contextual co-occurrence patterns across a large sample of natural language. Early models, such as LSA and HAL (Landauer & Dumais, 1997; Lund & Burgess, 1996), counted co-occurrence events; later models, such as BEAGLE (Jones & Mewhort, 2007), replaced counting co-occurrences…
Descriptors: Semantics, Learning Processes, Models, Prediction
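The abstract above contrasts count-based models (LSA, HAL) with later prediction-based ones. A minimal sketch of the count-based idea, using an invented toy corpus and an assumed window size rather than anything from the paper, might look like:

```python
# Count-based co-occurrence sketch (HAL/LSA-style); corpus and window size are toy assumptions.
from collections import defaultdict

corpus = [
    "the dog chased the cat",
    "the cat chased the mouse",
    "the dog barked at the mailman",
]

window = 2  # symmetric co-occurrence window (assumed)
counts = defaultdict(lambda: defaultdict(int))

for sentence in corpus:
    tokens = sentence.split()
    for i, target in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                counts[target][tokens[j]] += 1  # one co-occurrence event counted

# A word's row of counts is its raw distributional vector.
print(dict(counts["dog"]))
```

Count models then typically apply dimensional reduction to such a matrix, whereas prediction models learn vectors by predicting words from their contexts.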
Johns, Brendan T.; Jones, Michael N.; Mewhort, D. J. K. – Grantee Submission, 2019
To account for natural variability in cognitive processing, it is standard practice to optimize a model's parameters by fitting it to behavioral data. Although most language-related theories acknowledge a large role for experience in language processing, variability reflecting that knowledge is usually ignored when evaluating a model's fit to…
Descriptors: Language Processing, Models, Information Sources, Linguistics
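As a rough illustration of the standard practice the abstract describes, fitting a model's free parameters to behavioral data, here is a minimal least-squares fit of a toy two-parameter model to invented response-time data (the model, data, and parameter names are assumptions, not the authors'):

```python
# Toy illustration of parameter fitting to behavioral data; the linear "model"
# and the response-time numbers are invented, not the authors' model or data.
import numpy as np
from scipy.optimize import minimize

log_freq = np.array([1.0, 2.0, 3.0, 4.0, 5.0])               # predictor (assumed)
observed_rt = np.array([720.0, 680.0, 650.0, 630.0, 615.0])  # mean RTs in ms (invented)

def predict_rt(params):
    intercept, slope = params
    return intercept + slope * log_freq   # toy model of a word-frequency effect

def sse(params):
    return np.sum((observed_rt - predict_rt(params)) ** 2)  # fit criterion

fit = minimize(sse, x0=[700.0, -20.0])    # optimize the two free parameters
print("best-fitting parameters:", fit.x)
```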
Jones, Michael N. – Grantee Submission, 2018
Abstraction is a core principle of Distributional Semantic Models (DSMs) that learn semantic representations for words by applying dimensional reduction to statistical redundancies in language. Although the posited learning mechanisms vary widely, virtually all DSMs are prototype models in that they create a single abstract representation of a…
Descriptors: Abstract Reasoning, Semantics, Memory, Learning Processes
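The abstraction step referred to above can be illustrated with an LSA-style sketch: dimensional reduction of a word-by-context count matrix, leaving a single reduced ("prototype") vector per word. The toy matrix and the retained dimensionality are assumptions:

```python
# LSA-style abstraction sketch: SVD-based dimensional reduction of a
# word-by-context count matrix; the counts and retained rank are assumptions.
import numpy as np

# Rows = words, columns = contexts (e.g., documents); values are invented counts.
counts = np.array([
    [2.0, 0.0, 1.0, 0.0],
    [1.0, 0.0, 2.0, 0.0],
    [0.0, 3.0, 0.0, 1.0],
    [0.0, 1.0, 0.0, 2.0],
])

U, S, Vt = np.linalg.svd(counts, full_matrices=False)
k = 2                                  # retained dimensions (assumed)
word_vectors = U[:, :k] * S[:k]        # one abstract "prototype" vector per word

print(word_vectors)
```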
Jones, Michael N.; Gruenenfelder, Thomas M.; Recchia, Gabriel – Grantee Submission, 2017
Recent semantic space models learn vector representations for word meanings by observing statistical redundancies across a text corpus. A word's meaning is represented as a point in a high-dimensional semantic space, and semantic similarity between words is quantified by a function of their spatial proximity (typically the cosine of the angle…
Descriptors: Semantics, Computational Linguistics, Spatial Ability, Proximity
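The similarity measure mentioned in the abstract, the cosine of the angle between two word vectors, can be sketched directly; the vectors below are invented for illustration:

```python
# Cosine similarity between word vectors; the three vectors are invented.
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

dog = np.array([0.8, 0.1, 0.3])
cat = np.array([0.7, 0.2, 0.4])
car = np.array([0.1, 0.9, 0.2])

print(cosine(dog, cat))  # nearby points in the space -> higher similarity
print(cosine(dog, car))  # more distant points -> lower similarity
```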
Gruenenfelder, Thomas M.; Recchia, Gabriel; Rubin, Tim; Jones, Michael N. – Cognitive Science, 2016
We compared the ability of three different contextual models of lexical semantic memory (BEAGLE, Latent Semantic Analysis, and the Topic model) and of a simple associative model (POC) to predict the properties of semantic networks derived from word association norms. None of the semantic models were able to accurately predict all of the network…
Descriptors: Memory, Semantics, Associative Learning, Networks
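One plausible way a model-derived semantic network of the kind compared in this study could be constructed and summarized is sketched below; the vectors, the similarity threshold, and the use of networkx are illustrative assumptions, not the authors' procedure:

```python
# Sketch: build a semantic network by linking word pairs whose model-derived
# similarity exceeds a threshold, then inspect network statistics of the kind
# compared against association-norm networks. Vectors and threshold are invented.
import numpy as np
import networkx as nx

vecs = {
    "dog": np.array([0.9, 0.1, 0.2]),
    "cat": np.array([0.8, 0.2, 0.3]),
    "car": np.array([0.1, 0.9, 0.2]),
    "bus": np.array([0.2, 0.8, 0.1]),
    "tree": np.array([0.3, 0.3, 0.9]),
}

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

G = nx.Graph()
G.add_nodes_from(vecs)
words = list(vecs)
threshold = 0.9                      # similarity cutoff for an edge (assumed)
for i, a in enumerate(words):
    for b in words[i + 1:]:
        if cosine(vecs[a], vecs[b]) > threshold:
            G.add_edge(a, b)

print(dict(G.degree()))              # degree distribution
print(nx.average_clustering(G))      # clustering coefficient
```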
Jones, Michael N.; Dye, Melody; Johns, Brendan T. – Grantee Submission, 2017
Classic accounts of lexical organization posit that humans are sensitive to environmental frequency, suggesting a mechanism for word learning based on repetition. However, a recent spate of evidence has revealed that it is not simply frequency but the diversity and distinctiveness of contexts in which a word occurs that drives lexical…
Descriptors: Word Frequency, Vocabulary Development, Context Effect, Semantics
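The frequency-versus-contextual-diversity distinction drawn above can be made concrete: raw frequency counts every repetition, while contextual diversity counts only the distinct contexts (here, documents) in which a word appears. The toy documents are invented:

```python
# Raw frequency vs. contextual diversity; the toy documents are invented.
from collections import Counter

documents = [
    "the dog chased the ball",
    "the dog slept all day",
    "a stray dog dog dog barked",            # many repetitions, one context
    "the committee reviewed the report",
]

frequency = Counter()   # total occurrences across the corpus
diversity = Counter()   # number of distinct documents containing the word

for doc in documents:
    tokens = doc.split()
    frequency.update(tokens)
    diversity.update(set(tokens))            # each document counted at most once

print(frequency["dog"], diversity["dog"])    # 5 occurrences, 3 contexts
```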
Johns, Brendan T.; Jones, Michael N.; Mewhort, Douglas J. K. – Cognitive Psychology, 2012
We describe a computational model to explain a variety of results in both standard and false recognition. A key attribute of the model is that it uses plausible semantic representations for words, built through exposure to a linguistic corpus. A study list is encoded in the model as a gist trace, similar to the proposal of fuzzy trace theory…
Descriptors: Recognition (Psychology), Models, Semantics, Epistemology
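A rough sketch of the gist-trace idea described above: encode the study list as a composite of the studied words' semantic vectors and score a probe's familiarity by its similarity to that composite. The random vectors and decision rule below are illustrative assumptions, not the published model:

```python
# Gist-trace sketch: the study list is encoded as the sum of the studied words'
# vectors, and a probe's familiarity is its cosine to that composite. Random
# vectors stand in for learned semantic representations (an assumption); with
# corpus-derived vectors, a related lure would tend to score above an unrelated word.
import numpy as np

rng = np.random.default_rng(0)
lexicon = {w: rng.normal(size=50) for w in
           ["bed", "rest", "awake", "tired", "dream", "sleep", "chair"]}

study_list = ["bed", "rest", "awake", "tired", "dream"]
gist = np.sum([lexicon[w] for w in study_list], axis=0)   # composite gist trace

def familiarity(word):
    v = lexicon[word]
    return float(np.dot(v, gist) / (np.linalg.norm(v) * np.linalg.norm(gist)))

for probe in ["bed", "sleep", "chair"]:    # studied item, related lure, unrelated item
    print(probe, round(familiarity(probe), 3))
```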
Hills, Thomas T.; Jones, Michael N.; Todd, Peter M. – Psychological Review, 2012
Do humans search in memory using dynamic local-to-global search strategies similar to those that animals use to forage between patches in space? If so, do their dynamic memory search policies correspond to optimal foraging strategies seen for spatial foraging? Results from a number of fields suggest these possibilities, including the shared…
Descriptors: Evidence, Semantics, Memory, Search Strategies
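One way the local-to-global search policy in question could be sketched: retrieve by a local similarity cue until the current "patch" becomes unproductive, then jump to a new patch via a global cue. The vectors, frequencies, and switch rule are illustrative assumptions, not the authors' model:

```python
# Patch-style memory search sketch: use a local similarity cue until the best
# local match falls below a switch threshold, then jump to a new "patch" via a
# global frequency cue. Vectors, frequencies, and the switch rule are invented.
import numpy as np

rng = np.random.default_rng(2)
animals = ["dog", "cat", "wolf", "lion", "tiger", "shark", "whale", "trout"]
vectors = {w: rng.normal(size=20) for w in animals}    # stand-in semantic vectors
frequency = {w: rng.uniform(1, 10) for w in animals}   # stand-in global cue

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def fluency_run(start, n_items, switch_threshold=0.1):
    produced = [start]
    remaining = [w for w in animals if w != start]
    while remaining and len(produced) < n_items:
        last = produced[-1]
        best = max(remaining, key=lambda w: cosine(vectors[w], vectors[last]))
        if cosine(vectors[best], vectors[last]) >= switch_threshold:
            nxt = best                                 # exploit the local patch
        else:
            nxt = max(remaining, key=frequency.get)    # switch patches globally
        produced.append(nxt)
        remaining.remove(nxt)
    return produced

print(fluency_run("dog", 6))
```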
Jones, Michael N.; Kintsch, Walter; Mewhort, Douglas J. K. – Journal of Memory and Language, 2006
A broad range of priming data has been used to explore the structure of semantic memory and to test between models of word representation. In this paper, we examine the computational mechanisms required to learn distributed semantic representations for words directly from unsupervised experience with language. To best account for the variety of…
Descriptors: Long Term Memory, Semantics, Dictionaries, Photography
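A sketch in the spirit of the learning mechanism discussed here (BEAGLE-like accumulation of random environment vectors): each word receives a fixed random vector, and its memory vector is the running sum of the vectors of the words it co-occurs with. The toy corpus, the dimensionality, and the omission of BEAGLE's order (convolution) component are simplifications:

```python
# Random-vector accumulation sketch (context component only): each word has a
# fixed random environment vector; its memory vector is the sum of the
# environment vectors of the words it co-occurs with in each sentence.
import numpy as np

corpus = [
    "the doctor treated the patient",
    "the nurse helped the doctor",
    "the pilot flew the plane",
]

rng = np.random.default_rng(1)
dim = 64
vocab = sorted({w for s in corpus for w in s.split()})
environment = {w: rng.normal(scale=1 / np.sqrt(dim), size=dim) for w in vocab}
memory = {w: np.zeros(dim) for w in vocab}

for sentence in corpus:
    tokens = sentence.split()
    for i, target in enumerate(tokens):
        for j, context_word in enumerate(tokens):
            if j != i:
                memory[target] += environment[context_word]  # accumulate context

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(memory["doctor"], memory["nurse"]))  # overlapping contexts -> tends higher
print(cosine(memory["doctor"], memory["plane"]))  # little shared context -> tends lower
```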

