It has traditionally been assumed that cochlear implant users de facto perform atypically in audiovisual tasks. However, a recent study that combined an auditory task with visual distractors suggests that only those cochlear implant users who are not proficient at recognizing speech sounds might show abnormal audiovisual interactions. The present study aims to reinforce this notion by investigating the audiovisual segregation abilities of cochlear implant users in a visual task with auditory distractors. Speechreading was assessed in two groups of cochlear implant users (proficient and non-proficient at sound recognition), as well as in normal controls. A visual speech recognition task (i.e., speechreading) was administered either in silence or in combination with three types of auditory distractors: (i) noise, (ii) reversed speech, and (iii) non-altered speech. Cochlear implant users proficient at speech recognition performed like normal controls in all conditions, whereas non-proficient users showed significantly different audiovisual segregation patterns in both speech conditions. These results confirm that normal-like audiovisual segregation is possible in highly skilled cochlear implant users and, consequently, that proficient and non-proficient users cannot be lumped into a single group. This distinction must be taken into account in further studies of audiovisual interactions in cochlear implant users.