Showing all 13 results
Peer reviewed
van Laarhoven, Thijs; Keetels, Mirjam; Schakel, Lemmy; Vroomen, Jean – Developmental Science, 2018
Individuals with developmental dyslexia (DD) may experience other speech-related processing deficits in addition to reading problems. Here, we examined the influence of visual articulatory information (lip-read speech) at various levels of background noise on auditory word recognition in children and adults with DD. We found that children with a…
Descriptors: Dyslexia, Language Processing, Speech Communication, Sensory Integration
Peer reviewed
Wang, Jianrong; Zhu, Yumeng; Chen, Yu; Mamat, Abdilbar; Yu, Mei; Zhang, Ju; Dang, Jianwu – Journal of Speech, Language, and Hearing Research, 2020
Purpose: The primary purpose of this study was to explore the audiovisual speech perception strategies adopted by normal-hearing and deaf people in processing familiar and unfamiliar languages. Our primary hypothesis was that they would adopt different perception strategies due to different sensory experiences at an early age, limitations…
Descriptors: Eye Movements, Visual Perception, Auditory Perception, Deafness
Peer reviewed
Woll, Bencie – Deafness and Education International, 2012
Although speechreading has always served an important role in the communication of deaf people, educational interest in speechreading has decreased in recent decades. This paper reviews speechreading in terms of speech processing, neural activity and literacy, and suggests that it has an important role in intervention programmes for all deaf…
Descriptors: Deafness, Assistive Technology, Brain, Lipreading
Peer reviewed
Hessler, Dorte; Jonkers, Roel; Bastiaanse, Roelien – Clinical Linguistics & Phonetics, 2010
Individuals with aphasia have more problems detecting small differences between speech sounds than larger ones. This paper reports how phonemic processing is impaired and how this is influenced by speechreading. A non-word discrimination task was carried out with "audiovisual", "auditory only" and "visual only" stimulus display. Subjects had to…
Descriptors: Articulation (Speech), Phonetics, Aphasia, Task Analysis
Peer reviewed
Capek, Cheryl M.; Woll, Bencie; MacSweeney, Mairead; Waters, Dafydd; McGuire, Philip K.; David, Anthony S.; Brammer, Michael J.; Campbell, Ruth – Brain and Language, 2010
Studies of spoken and signed language processing reliably show involvement of the posterior superior temporal cortex. This region is also reliably activated by observation of meaningless oral and manual actions. In this study we directly compared the extent to which activation in posterior superior temporal cortex is modulated by linguistic…
Descriptors: Sign Language, Deafness, Language Processing, Language Enrichment
Peer reviewed
Davies, Rebecca; Kidd, Evan; Lander, Karen – International Journal of Language & Communication Disorders, 2009
Background: Previous research has found that infants can match phonetic information in the lips and voice from as young as ten weeks old. There is evidence that access to visual speech is necessary for normal speech development. Although we have an understanding of this early sensitivity, very little research has investigated older…
Descriptors: Feedback (Response), Research Needs, Phonology, Preschool Children
Peer reviewed
Dodd, Barbara; McIntosh, Beth; Erdener, Dogu; Burnham, Denis – Clinical Linguistics & Phonetics, 2008
An example of the auditory-visual illusion in speech perception, first described by McGurk and MacDonald, is the perception of [ta] when listeners hear [pa] in synchrony with the lip movements for [ka]. One account of the illusion is that lip-read and heard speech are combined in an articulatory code since people who mispronounce words respond…
Descriptors: Articulation (Speech), Phonology, Auditory Perception, Speech Impairments
Peer reviewed
Smith, Elizabeth G.; Bennetto, Loisa – Journal of Child Psychology and Psychiatry, 2007
Background: During speech perception, the ability to integrate auditory and visual information causes speech to sound louder and be more intelligible, and leads to quicker processing. This integration is important in early language development, and also continues to affect speech comprehension throughout the lifespan. Previous research shows that…
Descriptors: Autism, Adolescents, Auditory Perception, Lipreading
Peer reviewed
Tye-Murray, Nancy; And Others – Journal of Speech and Hearing Disorders, 1990
Five groups of subjects (n=54, ages 15-43) were assigned a repair strategy to use after misperceiving a sentence through lipreading. Strategies included asking the talker to repeat a sentence, rephrasing it, simplifying it, saying an important keyword, and speaking two sentences. Compared to controls, subjects demonstrated significantly greater…
Descriptors: Adolescents, Adults, Comprehension, Hearing Impairments
Peer reviewed
Mohammed, Tara; Campbell, Ruth; MacSweeney, Mairead; Barry, Fiona; Coleman, Michael – Clinical Linguistics & Phonetics, 2006
Reading and speechreading are both visual skills based on speech and language processing. Here we explore individual differences in speechreading in profoundly prelingually deaf adults, hearing adults with a history of dyslexia, and hearing adults with no history of a literacy disorder. Speechreading skill distinguished the three groups: the deaf…
Descriptors: Language Skills, Language Processing, Reading Ability, Lipreading
Peer reviewed
Brancazio, Lawrence – Journal of Experimental Psychology: Human Perception and Performance, 2004
Phoneme identification with audiovisually discrepant stimuli is influenced by information in the visual signal (the McGurk effect). Additionally, lexical status affects identification of auditorily presented phonemes. The present study tested for lexical influences on the McGurk effect. Participants identified phonemes in audiovisually discrepant…
Descriptors: Stimuli, Phonemes, Identification, Auditory Perception
Peer reviewed
Davis, Chris; Kim, Jeesun – Cognition, 2006
The study examined whether people can extract speech-related information from the talker's upper face, presented using either normally textured videos (Experiments 1 and 3) or videos showing only the outline of the head (Experiments 2 and 4). Experiments 1 and 2 used within- and cross-modal matching tasks. In the within-modal task,…
Descriptors: Language Processing, Auditory Perception, Inner Speech (Subvocal), Motion
Peer reviewed
Studdert-Kennedy, Michael – Language and Speech, 1980
Reviews research on prosody and segmental perception, segmentation and invariance, categorical perception of speech and nonspeech, feature detectors, scaling speech sounds to an auditory-articulatory space, acoustic-phonetic dependencies within the syllable, higher order (nonphonetic) factors in the comprehension of fluent speech, and cerebral…
Descriptors: Acoustic Phonetics, Adults, Auditory Perception, Children