Quinto-Pozos, David; Renee Joyce, Taylor; Sarkar, Abhra; DiLeo, Michael; Hou, Lynn – Language Learning, 2023
The comprehension of signed language requires linguistic and visual-spatial processing, such as perspective-taking for correctly interpreting the layout of a spatial scene. However, little is known about how adult second-language (L2) learners process visual-spatial constructions in a signed language that they are studying, including which angles…
Descriptors: Second Language Learning, Spatial Ability, Visual Perception, Perspective Taking
Lieberman, Amy M.; Fitch, Allison; Borovsky, Arielle – Developmental Science, 2022
Word learning in young children requires coordinated attention between language input and the referent object. Current accounts of word learning are based on spoken language, where the association between language and objects occurs through simultaneous and multimodal perception. In contrast, deaf children acquiring American Sign Language (ASL)…
Descriptors: Deafness, Cognitive Mapping, Cues, American Sign Language
Secora, Kristen; Emmorey, Karen – Journal of Deaf Studies and Deaf Education, 2020
As spatial languages, sign languages rely on spatial cognitive processes that are not involved for spoken languages. Interlocutors have different visual perspectives of the signer's hands requiring a mental transformation for successful communication about spatial scenes. It is unknown whether visual-spatial perspective-taking (VSPT) or mental…
Descriptors: American Sign Language, Deafness, Hearing Impairments, Adults
Terhune-Cotter, Brennan P.; Conway, Christopher M.; Dye, Matthew W. G. – Journal of Deaf Studies and Deaf Education, 2021
The auditory scaffolding hypothesis states that early experience with sound underpins the development of domain-general sequence processing abilities, supported by studies observing impaired sequence processing in deaf or hard-of-hearing (DHH) children. To test this hypothesis, we administered a sequence processing task to 77 DHH children who use…
Descriptors: Deafness, Hearing Impairments, Children, Preadolescents
Vercellotti, Mary Lou – Sign Language Studies, 2022
Experience with a visual-spatial language may influence certain cognitive processes (Keehner and Gathercole 2007). Spatial ability is an important cognitive skill (Linn and Petersen 1985). Some research has found that deaf signers outperform hearing nonsigners on certain spatial tasks (e.g., Emmorey, Kosslyn, and Bellugi 1993) and that hearing…
Descriptors: American Sign Language, Second Language Learning, Second Language Instruction, Spatial Ability
MacDonald, Kyle; LaMarr, Todd; Corina, David; Marchman, Virginia A.; Fernald, Anne – Developmental Science, 2018
When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children's developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and…
Descriptors: Synchronous Communication, Comprehension, Toddlers, American Sign Language
Brookshire, Geoffrey; Lu, Jenny; Nusbaum, Howard; Goldin-Meadow, Susan; Casasanto, Daniel – Grantee Submission, 2017
Despite immense variability across languages, people can learn to understand any human language, spoken or signed. What neural mechanisms allow people to comprehend language across sensory modalities? When people listen to speech, electrophysiological oscillations in auditory cortex entrain to slow (<8 Hz) fluctuations in the acoustic envelope.…
Descriptors: Brain Hemisphere Functions, Visual Perception, Auditory Perception, Learning Modalities
Quinto-Pozos, David; Singleton, Jenny L.; Hauser, Peter C. – Journal of Deaf Studies and Deaf Education, 2017
This article describes the case of a deaf native signer of American Sign Language (ASL) with a specific language impairment (SLI). School records documented normal cognitive development but atypical language development. Data include school records; interviews with the child, his mother, and school professionals; ASL and English evaluations; and a…
Descriptors: Case Studies, Language Impairments, Deafness, American Sign Language
Palmer, Stephanie Baker; Fais, Laurel; Golinkoff, Roberta Michnick; Werker, Janet F. – Child Development, 2012
Over their 1st year of life, infants' "universal" perception of the sounds of language narrows to encompass only those contrasts made in their native language (J. F. Werker & R. C. Tees, 1984). This research tested 40 infants in an eyetracking paradigm and showed that this pattern also holds for infants exposed to seen language--American Sign…
Descriptors: Infants, Language Acquisition, Perceptual Development, Auditory Perception
Ludlow, Amanda Katherine; Heaton, Pamela; Deruelle, Christine – Journal of Cognition and Development, 2013
This study aimed to explore the recognition of emotional and non-emotional biological movements in children with severe and profound deafness. Twenty-four deaf children, together with 24 control children matched on mental age and 24 control children matched on chronological age, were asked to identify a person's actions, subjective states,…
Descriptors: Emotional Response, Motion, Deafness, Severe Disabilities
Emmorey, Karen; Bosworth, Rain; Kraljic, Tanya – Journal of Memory and Language, 2009
The perceptual loop theory of self-monitoring posits that auditory speech output is parsed by the comprehension system. For sign language, however, visual input from one's own signing is distinct from visual input received from another's signing. Two experiments investigated the role of visual feedback in the production of American Sign Language…
Descriptors: Feedback (Response), Deafness, American Sign Language, Theories
Emmorey, Karen; Gertsberg, Nelly; Korpics, Franco; Wright, Charles E. – Applied Psycholinguistics, 2009
Speakers monitor their speech output by listening to their own voice. However, signers do not look directly at their hands and cannot see their own face. We investigated the importance of a visual perceptual loop for sign language monitoring by examining whether changes in visual input alter sign production. Deaf signers produced American Sign…
Descriptors: Deafness, Vision, American Sign Language, Feedback (Response)
Emmorey, Karen; McCullough, Stephen; Brentari, Diane – Language and Cognitive Processes, 2003
Two experiments examined whether Deaf signers or hearing nonsigners exhibit categorical perception (CP) for hand configuration or for place of articulation in American Sign Language. Findings that signers and nonsigners performed similarly suggests that these categories in American Sign Language have a perceptual as well as a linguistic basis.…
Descriptors: American Sign Language, Classification, Cognitive Processes, Deafness
Samar, Vincent J.; Parasnis, Ila – Brain and Cognition, 2007
Studies have reported a right visual field (RVF) advantage for coherent motion detection by deaf and hearing signers but not non-signers. Yet two studies [Bosworth R. G., & Dobkins, K. R. (2002). Visual field asymmetries for motion processing in deaf and hearing signers. "Brain and Cognition," 49, 170-181; Samar, V. J., & Parasnis, I. (2005).…
Descriptors: Sign Language, Deafness, Intelligence Quotient, Motion
Poizner, Howard; And Others – Language Sciences, 1989
Investigates the psychological representation of visual-gestural languages from a cross-linguistic perspective. The perception of signers of American and Chinese Sign Languages is analyzed. (27 references) (Author/VWL)
Descriptors: American Sign Language, Chinese, Comparative Analysis, Deafness
