| Publication Date | Results |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 3 |
| Since 2007 (last 20 years) | 12 |

| Descriptor | Results |
| --- | --- |
| Acoustics | 13 |
| Cues | 13 |
| Visual Perception | 13 |
| Auditory Perception | 8 |
| Visual Stimuli | 5 |
| Cognitive Processes | 4 |
| Foreign Countries | 4 |
| Speech | 4 |
| Accuracy | 3 |
| Assistive Technology | 3 |
| Statistical Analysis | 3 |

| Publication Type | Results |
| --- | --- |
| Journal Articles | 13 |
| Reports - Research | 11 |
| Reports - Descriptive | 1 |
| Reports - Evaluative | 1 |

| Education Level | Results |
| --- | --- |
| Higher Education | 2 |

| Audience | Results |
| --- | --- |
| Teachers | 1 |

Shinohara, Yasuaki – Journal of Speech, Language, and Hearing Research, 2021
Purpose: This study tested the hypothesis that audiovisual training benefits children more than it does adults and that it improves Japanese-speaking children's English /r/-/l/ perception to a native-like level. Method: Ten sessions of audiovisual English /r/-/l/ identification training were conducted for Japanese-speaking adults and children.…
Descriptors: Japanese, English (Second Language), Second Language Learning, Training

Peters, Benjamin; Rahm, Benjamin; Czoschke, Stefan; Barnes, Catherine; Kaiser, Jochen; Bledowski, Christoph – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2018
Working memory (WM) enables rapid access to a limited number of items that are no longer physically present. WM studies usually involve the encoding and retention of multiple items while probing only a single item. Hence, little is known about how well multiple items can be reported from WM. Here we asked participants to successively report…
Descriptors: Short Term Memory, Visual Perception, Recall (Psychology), Cues

Dorman, Michael F.; Liss, Julie; Wang, Shuai; Berisha, Visar; Ludwig, Cimarron; Natale, Sarah Cook – Journal of Speech, Language, and Hearing Research, 2016
Purpose: Five experiments probed auditory-visual (AV) understanding of sentences by users of cochlear implants (CIs). Method: Sentence material was presented in auditory (A), visual (V), and AV test conditions to listeners with normal hearing and CI users. Results: (a) Most CI users report that most of the time, they have access to both A and V…
Descriptors: Sentences, Assistive Technology, Syllables, Phonemes

Knowland, Victoria C. P.; Evans, Sam; Snell, Caroline; Rosen, Stuart – Journal of Speech, Language, and Hearing Research, 2016
Purpose: The purpose of the study was to assess the ability of children with developmental language learning impairments (LLIs) to use visual speech cues from the talking face. Method: In this cross-sectional study, 41 typically developing children (mean age: 8 years 0 months, range: 4 years 5 months to 11 years 10 months) and 27 children with…
Descriptors: Children, Language Impairments, Visual Perception, Speech

Davenport, Carrie A.; Alber-Morgan, Sheila R. – TEACHING Exceptional Children, 2016
It is imperative that teachers have the knowledge and resources to support children who are deaf and use a cochlear implant in general education classrooms. The recommendations presented in this article provide teachers with the information necessary to promote a child's academic progress, communication needs, and social development. In order to…
Descriptors: Preschool Children, Deafness, Hearing Impairments, Assistive Technology

Mitchell, Helen F. – Music Education Research, 2018
The music industry is built on a system of expert evaluation focused on sound, but the foundations are challenged by recent research, which suggests that sight trumps sound. This presents a challenge to music educators, who train the next generation of expert performers and listeners. The aim of this study is to investigate students' perceptions…
Descriptors: Music Education, Experiential Learning, Evaluation Criteria, Music Activities

Megnin-Viggars, Odette; Goswami, Usha – Brain and Language, 2013
Visual speech inputs can enhance auditory speech information, particularly in noisy or degraded conditions. The natural statistics of audiovisual speech highlight the temporal correspondence between visual and auditory prosody, with lip, jaw, cheek and head movements conveying information about the speech envelope. Low-frequency spatial and…
Descriptors: Phonology, Cues, Visual Perception, Speech

Keetels, Mirjam; Vroomen, Jean – Journal of Experimental Psychology: Human Perception and Performance, 2011
The authors examined the effects of a task-irrelevant sound on visual processing. Participants were presented with revolving clocks at or around central fixation and reported the hand position of a target clock at the time an exogenous cue (one clock turning red) or an endogenous cue (a line pointing toward one of the clocks) was presented. A…
Descriptors: Cues, Visual Perception, Cognitive Processes, Acoustics

Most, Tova; Aviner, Chen – Journal of Deaf Studies and Deaf Education, 2009
This study evaluated the benefits of cochlear implants (CIs) for the emotion perception of participants differing in their age of implantation, in comparison to hearing aid users and adolescents with normal hearing (NH). Emotion perception was examined by having the participants identify happiness, anger, surprise, sadness, fear, and disgust.…
Descriptors: Cues, Hearing Impairments, Visual Perception, Assistive Technology

Zupan, Barbra; Neumann, Dawn; Babbage, Duncan R.; Willer, Barry – Journal of Communication Disorders, 2009
Persons with traumatic brain injury (TBI) often have difficulty recognizing emotion in others. This is likely due to difficulties in interpreting non-verbal cues of affect. Although deficits in interpreting facial cues of affect are being widely explored, interpretation of vocal cues of affect has received much less attention. Accurate…
Descriptors: Cues, Nonverbal Communication, Injuries, Identification

Buchwald, Adam B.; Winters, Stephen J.; Pisoni, David B. – Language and Cognitive Processes, 2009
Visual speech perception has become a topic of considerable interest to speech researchers. Previous research has demonstrated that perceivers neurally encode and use speech information from the visual modality, and this information has been found to facilitate spoken word recognition in tasks such as lexical decision (Kim, Davis, & Krins,…
Descriptors: Auditory Perception, Word Recognition, Cognitive Processes, Cues

Krahmer, Emiel; Swerts, Marc – Journal of Memory and Language, 2007
Speakers employ acoustic cues (pitch accents) to indicate that a word is important, but may also use visual cues (beat gestures, head nods, eyebrow movements) for this purpose. Even though these acoustic and visual cues are related, the exact nature of this relationship is far from well understood. We investigate whether producing a visual beat…
Descriptors: Cues, Visual Perception, Auditory Perception, Acoustics

Schwartz, Jean-Luc; Berthommier, Frederic; Savariaux, Christophe – Cognition, 2004
Lip reading is the ability to partially understand speech by looking at the speaker's lips. It improves the intelligibility of speech in noise when audio-visual perception is compared with audio-only perception. A recent set of experiments showed that seeing the speaker's lips also enhances "sensitivity" to acoustic information,…
Descriptors: Hearing (Physiology), Lipreading, Auditory Perception, Visual Perception
