Showing 1 to 15 of 22 results
Peer reviewed
Quinto-Pozos, David; Renee Joyce, Taylor; Sarkar, Abhra; DiLeo, Michael; Hou, Lynn – Language Learning, 2023
The comprehension of signed language requires linguistic and visual-spatial processing, such as perspective-taking for correctly interpreting the layout of a spatial scene. However, little is known about how adult second-language (L2) learners process visual-spatial constructions in a signed language that they are studying, including which angles…
Descriptors: Second Language Learning, Spatial Ability, Visual Perception, Perspective Taking
Peer reviewed
Lieberman, Amy M.; Fitch, Allison; Borovsky, Arielle – Developmental Science, 2022
Word learning in young children requires coordinated attention between language input and the referent object. Current accounts of word learning are based on spoken language, where the association between language and objects occurs through simultaneous and multimodal perception. In contrast, deaf children acquiring American Sign Language (ASL)…
Descriptors: Deafness, Cognitive Mapping, Cues, American Sign Language
Peer reviewed
Secora, Kristen; Emmorey, Karen – Journal of Deaf Studies and Deaf Education, 2020
As spatial languages, sign languages rely on spatial cognitive processes that are not involved for spoken languages. Interlocutors have different visual perspectives of the signer's hands requiring a mental transformation for successful communication about spatial scenes. It is unknown whether visual-spatial perspective-taking (VSPT) or mental…
Descriptors: American Sign Language, Deafness, Hearing Impairments, Adults
Peer reviewed
Terhune-Cotter, Brennan P.; Conway, Christopher M.; Dye, Matthew W. G. – Journal of Deaf Studies and Deaf Education, 2021
The auditory scaffolding hypothesis states that early experience with sound underpins the development of domain-general sequence processing abilities, supported by studies observing impaired sequence processing in deaf or hard-of-hearing (DHH) children. To test this hypothesis, we administered a sequence processing task to 77 DHH children who use…
Descriptors: Deafness, Hearing Impairments, Children, Preadolescents
Peer reviewed
Vercellotti, Mary Lou – Sign Language Studies, 2022
Experience with a visual-spatial language may influence certain cognitive processes (Keehner and Gathercole 2007). Spatial ability is an important cognitive skill (Linn and Petersen 1985). Some research has found that deaf signers outperform hearing nonsigners on certain spatial tasks (e.g., Emmorey, Kosslyn, and Bellugi 1993) and that hearing…
Descriptors: American Sign Language, Second Language Learning, Second Language Instruction, Spatial Ability
Peer reviewed
MacDonald, Kyle; LaMarr, Todd; Corina, David; Marchman, Virginia A.; Fernald, Anne – Developmental Science, 2018
When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children's developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and…
Descriptors: Synchronous Communication, Comprehension, Toddlers, American Sign Language
Peer reviewed
PDF on ERIC
Brookshire, Geoffrey; Lu, Jenny; Nusbaum, Howard; Goldin-Meadow, Susan; Casasanto, Daniel – Grantee Submission, 2017
Despite immense variability across languages, people can learn to understand any human language, spoken or signed. What neural mechanisms allow people to comprehend language across sensory modalities? When people listen to speech, electrophysiological oscillations in auditory cortex entrain to slow (<8 Hz) fluctuations in the acoustic envelope.…
Descriptors: Brain Hemisphere Functions, Visual Perception, Auditory Perception, Learning Modalities
Peer reviewed
Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I. – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2016
In this reply to Salverda (2016), we address a critique of the claims made in our recent study of real-time processing of American Sign Language (ASL) signs using a novel visual world eye-tracking paradigm (Lieberman, Borovsky, Hatrak, & Mayberry, 2015). Salverda asserts that our data do not support our conclusion that native signers and…
Descriptors: American Sign Language, Eye Movements, Phonology, Visual Perception
Peer reviewed
Quinto-Pozos, David; Singleton, Jenny L.; Hauser, Peter C. – Journal of Deaf Studies and Deaf Education, 2017
This article describes the case of a deaf native signer of American Sign Language (ASL) with a specific language impairment (SLI). School records documented normal cognitive development but atypical language development. Data include school records; interviews with the child, his mother, and school professionals; ASL and English evaluations; and a…
Descriptors: Case Studies, Language Impairments, Deafness, American Sign Language
Peer reviewed
Palmer, Stephanie Baker; Fais, Laurel; Golinkoff, Roberta Michnick; Werker, Janet F. – Child Development, 2012
Over their 1st year of life, infants' "universal" perception of the sounds of language narrows to encompass only those contrasts made in their native language (J. F. Werker & R. C. Tees, 1984). This research tested 40 infants in an eyetracking paradigm and showed that this pattern also holds for infants exposed to seen language--American Sign…
Descriptors: Infants, Language Acquisition, Perceptual Development, Auditory Perception
Peer reviewed
Ludlow, Amanda Katherine; Heaton, Pamela; Deruelle, Christine – Journal of Cognition and Development, 2013
This study aimed to explore the recognition of emotional and non-emotional biological movements in children with severe and profound deafness. Twenty-four deaf children, together with 24 control children matched on mental age and 24 control children matched on chronological age, were asked to identify a person's actions, subjective states,…
Descriptors: Emotional Response, Motion, Deafness, Severe Disabilities
Peer reviewed
Emmorey, Karen; Korpics, Franco; Petronio, Karen – Journal of Deaf Studies and Deaf Education, 2009
The role of visual feedback during the production of American Sign Language was investigated by comparing the size of signing space during conversations and narrative monologues for normally sighted signers, signers with tunnel vision due to Usher syndrome, and functionally blind signers. The interlocutor for all groups was a normally sighted deaf…
Descriptors: Deafness, American Sign Language, Feedback (Response), Visual Perception
Peer reviewed
Emmorey, Karen; Bosworth, Rain; Kraljic, Tanya – Journal of Memory and Language, 2009
The perceptual loop theory of self-monitoring posits that auditory speech output is parsed by the comprehension system. For sign language, however, visual input from one's own signing is distinct from visual input received from another's signing. Two experiments investigated the role of visual feedback in the production of American Sign Language…
Descriptors: Feedback (Response), Deafness, American Sign Language, Theories
Peer reviewed
Emmorey, Karen; Gertsberg, Nelly; Korpics, Franco; Wright, Charles E. – Applied Psycholinguistics, 2009
Speakers monitor their speech output by listening to their own voice. However, signers do not look directly at their hands and cannot see their own face. We investigated the importance of a visual perceptual loop for sign language monitoring by examining whether changes in visual input alter sign production. Deaf signers produced American Sign…
Descriptors: Deafness, Vision, American Sign Language, Feedback (Response)
Peer reviewed
Samar, Vincent J.; Parasnis, Ila – Brain and Cognition, 2007
Studies have reported a right visual field (RVF) advantage for coherent motion detection by deaf and hearing signers but not non-signers. Yet two studies [Bosworth R. G., & Dobkins, K. R. (2002). Visual field asymmetries for motion processing in deaf and hearing signers. "Brain and Cognition," 49, 170-181; Samar, V. J., & Parasnis, I. (2005).…
Descriptors: Sign Language, Deafness, Intelligence Quotient, Motion
Pages: 1 | 2