Showing 1 to 15 of 18 results
Peer reviewed
Lieberman, Amy M.; Fitch, Allison; Borovsky, Arielle – Developmental Science, 2022
Word learning in young children requires coordinated attention between language input and the referent object. Current accounts of word learning are based on spoken language, where the association between language and objects occurs through simultaneous and multimodal perception. In contrast, deaf children acquiring American Sign Language (ASL)…
Descriptors: Deafness, Cognitive Mapping, Cues, American Sign Language
Peer reviewed
Secora, Kristen; Emmorey, Karen – Journal of Deaf Studies and Deaf Education, 2020
As spatial languages, sign languages rely on spatial cognitive processes that are not involved in spoken languages. Interlocutors have different visual perspectives of the signer's hands, requiring a mental transformation for successful communication about spatial scenes. It is unknown whether visual-spatial perspective-taking (VSPT) or mental…
Descriptors: American Sign Language, Deafness, Hearing Impairments, Adults
Peer reviewed
Terhune-Cotter, Brennan P.; Conway, Christopher M.; Dye, Matthew W. G. – Journal of Deaf Studies and Deaf Education, 2021
The auditory scaffolding hypothesis states that early experience with sound underpins the development of domain-general sequence processing abilities, a claim supported by studies observing impaired sequence processing in deaf or hard-of-hearing (DHH) children. To test this hypothesis, we administered a sequence processing task to 77 DHH children who use…
Descriptors: Deafness, Hearing Impairments, Children, Preadolescents
Peer reviewed
Vercellotti, Mary Lou – Sign Language Studies, 2022
Experience with a visual-spatial language may influence certain cognitive processes (Keehner and Gathercole 2007). Spatial ability is an important cognitive skill (Linn and Petersen 1985). Some research has found that deaf signers outperform hearing nonsigners on certain spatial tasks (e.g., Emmorey, Kosslyn, and Bellugi 1993) and that hearing…
Descriptors: American Sign Language, Second Language Learning, Second Language Instruction, Spatial Ability
Peer reviewed
MacDonald, Kyle; LaMarr, Todd; Corina, David; Marchman, Virginia A.; Fernald, Anne – Developmental Science, 2018
When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children's developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and…
Descriptors: Synchronous Communication, Comprehension, Toddlers, American Sign Language
Peer reviewed
Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I. – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2016
In this reply to Salverda (2016), we address a critique of the claims made in our recent study of real-time processing of American Sign Language (ASL) signs using a novel visual world eye-tracking paradigm (Lieberman, Borovsky, Hatrak, & Mayberry, 2015). Salverda asserts that our data do not support our conclusion that native signers and…
Descriptors: American Sign Language, Eye Movements, Phonology, Visual Perception
Peer reviewed
Quinto-Pozos, David; Singleton, Jenny L.; Hauser, Peter C. – Journal of Deaf Studies and Deaf Education, 2017
This article describes the case of a deaf native signer of American Sign Language (ASL) with a specific language impairment (SLI). School records documented normal cognitive development but atypical language development. Data include school records; interviews with the child, his mother, and school professionals; ASL and English evaluations; and a…
Descriptors: Case Studies, Language Impairments, Deafness, American Sign Language
Peer reviewed
Ludlow, Amanda Katherine; Heaton, Pamela; Deruelle, Christine – Journal of Cognition and Development, 2013
This study aimed to explore the recognition of emotional and non-emotional biological movements in children with severe and profound deafness. Twenty-four deaf children, together with 24 control children matched on mental age and 24 control children matched on chronological age, were asked to identify a person's actions, subjective states,…
Descriptors: Emotional Response, Motion, Deafness, Severe Disabilities
Peer reviewed
Emmorey, Karen; Korpics, Franco; Petronio, Karen – Journal of Deaf Studies and Deaf Education, 2009
The role of visual feedback during the production of American Sign Language was investigated by comparing the size of signing space during conversations and narrative monologues for normally sighted signers, signers with tunnel vision due to Usher syndrome, and functionally blind signers. The interlocutor for all groups was a normally sighted deaf…
Descriptors: Deafness, American Sign Language, Feedback (Response), Visual Perception
Peer reviewed
Emmorey, Karen; Bosworth, Rain; Kraljic, Tanya – Journal of Memory and Language, 2009
The perceptual loop theory of self-monitoring posits that auditory speech output is parsed by the comprehension system. For sign language, however, visual input from one's own signing is distinct from visual input received from another's signing. Two experiments investigated the role of visual feedback in the production of American Sign Language…
Descriptors: Feedback (Response), Deafness, American Sign Language, Theories
Peer reviewed
Emmorey, Karen; Gertsberg, Nelly; Korpics, Franco; Wright, Charles E. – Applied Psycholinguistics, 2009
Speakers monitor their speech output by listening to their own voice. However, signers do not look directly at their hands and cannot see their own face. We investigated the importance of a visual perceptual loop for sign language monitoring by examining whether changes in visual input alter sign production. Deaf signers produced American Sign…
Descriptors: Deafness, Vision, American Sign Language, Feedback (Response)
Peer reviewed
Emmorey, Karen; McCullough, Stephen; Brentari, Diane – Language and Cognitive Processes, 2003
Two experiments examined whether Deaf signers or hearing nonsigners exhibit categorical perception (CP) for hand configuration or for place of articulation in American Sign Language. The finding that signers and nonsigners performed similarly suggests that these categories in American Sign Language have a perceptual as well as a linguistic basis.…
Descriptors: American Sign Language, Classification, Cognitive Processes, Deafness
Peer reviewed
Samar, Vincent J.; Parasnis, Ila – Brain and Cognition, 2007
Studies have reported a right visual field (RVF) advantage for coherent motion detection by deaf and hearing signers but not non-signers. Yet two studies [Bosworth, R. G., & Dobkins, K. R. (2002). Visual field asymmetries for motion processing in deaf and hearing signers. "Brain and Cognition," 49, 170-181; Samar, V. J., & Parasnis, I. (2005).…
Descriptors: Sign Language, Deafness, Intelligence Quotient, Motion
Peer reviewed
Poizner, Howard; And Others – Language Sciences, 1989
Investigates the psychological representation of visual-gestural languages from a cross-linguistic perspective. The perception of signers of American and Chinese Sign Languages is analyzed. (27 references) (Author/VWL)
Descriptors: American Sign Language, Chinese, Comparative Analysis, Deafness
Peer reviewed
Fischer, Susan D.; Delhorne, Lorraine A.; Reed, Charlotte M. – Journal of Speech, Language, and Hearing Research, 1999
Videotaped productions of isolated American Sign Language signs or sentences were presented at speeds of two to six times normal. Results indicated a breakdown in intelligibility at around 2.5 to 3 times the normal rate. Results are similar to those found for auditory reception of time-compressed speech, suggesting a modality-independent limit to…
Descriptors: American Sign Language, Auditory Perception, Deafness, Language Processing