| old_uid | 9075 |
|---|
| title | How auditory information influences binocular rivalry: Revisiting the McGurk effect |
|---|
| start_date | 2010/09/21 |
|---|
| schedule | 14h |
|---|
| online | no |
|---|
| summary | Although multistable perception has long been studied, in recent years paradigms involving ambiguous visual stimuli have been used to assess phenomenal awareness. For such stimuli, although the sensory input is invariant, a perceptual decision brings to awareness only one of the possible interpretations, and the observer experiences inevitable oscillations between these percepts. Only a few studies have tackled bistability for multimodal percepts. Hupé et al. [1] used spatially localized audio-visual beeps and failed to find synchronized auditory and visual perceptual decisions. Van Ee et al. [2] investigated how sensory congruency could influence volitional control in binocular rivalry using looming motion stimuli that could be seen, heard, or sensed on the hand. They found that attention to the additional modality (sound or touch) must be engaged to promote visual dominance and that, ultimately, only temporal congruency is required. Nonetheless, we believe that in both studies such low-level stimuli may have reduced the possibility of finding passive interactions and limited the congruency to temporal features, which motivated the use of higher-level audio-visual verbal processing in our experiments. In a clever dynamic version of the classical face/vase illusion, Munhall et al. [3] found no interaction between the suppressed percept (the talking faces) and sound. Based on known differences between ambiguous figures and rivalry, we wondered whether rivaling talking faces could increase sensory integration with the suppressed percept. Accordingly, we developed a series of experiments relying on the McGurk effect [4] – known to involve robust audio-visual integration – to investigate multimodal rivalry. After validating the bistability properties of rivaling videos (lips pronouncing /aba/ vs. /aga/), we added the sound /aba/ in order to assess the influence of audio-visual integration on the perceptual decision process. The results show, on the one hand, possible multimodal integration with the suppressed visual stimulus and, on the other, the influence of consistent auditory stimulation on the dominance mechanism of visual rivalry. These findings suggest that at higher-level processing stages, auditory cues do interact with the perceptual decision involved in binocular rivalry. This research project may eventually shed light on the locus where multimodal oscillations occur, feeding the long-standing debate over whether perceptual decisions take place at early processing stages or can be top-down driven. References: 1. Hupé, J., Joffo, L. & Pressnitzer, D. Bistability for audiovisual stimuli: Perceptual decision is modality specific. J. Vis. 8, 1-15 (2008). 2. van Ee, R., van Boxtel, J.J.A., Parker, A.L. & Alais, D. Multisensory congruency as a mechanism for attentional control over perceptual selection. J. Neurosci. 29, 11641-11649 (2009). 3. Munhall, K., ten Hove, M., Brammer, M. & Paré, M. Audiovisual integration of speech in a bistable illusion. Current Biology 19, 735-739 (2009). 4. McGurk, H. & MacDonald, J. Hearing lips and seeing voices. Nature 264, 746-748 (1976). |
|---|
| responsibles | Fabre-Thorpe |
|---|