| old_uid | 6390 |
|---|
| title | Linking speech production and speech perception: Self-other effects on lip reading |
|---|
| start_date | 2009/03/05 |
|---|
| schedule | 10h30 |
|---|
| online | no |
|---|
| summary | Abstract: The central problem in speech research is the dichotomy between the invariance of the speech code and the surface variability of the phonetic realization of phonemes. Although several theories have been put forward to explain this apparent dilemma (such as the Quantal Theory of Ken Stevens, the Motor Theory of Speech Perception of the Haskins Labs, and the Adaptive Variability Principle of Bjorn Lindblom), very little progress has been made in understanding the cognitive and psychophysical mechanisms linking the speech perception and production systems. To address these issues, we investigated the links between speech production and visual lip reading using the self/other effect paradigm. Previous studies on the perception of human movement have shown that subjects are much better at predicting the outcomes of movements when they see their own movements, e.g. predicting where a dart will land on a target from the initial portion of the throwing movement (Knoblich and Flach, 2001, Psychol. Sci. 12, 467). We extended this idea to lip reading by recording lip movements with 8 Optotrak markers and running a perception task on a point-light display. German-speaking subjects participated in the experiment and pronounced the nonsense words /aba/, /awa/, /ava/. Since information about lip closure is not present in the stimuli, consonant identification cannot be perfect. Nevertheless, subjects were significantly above chance level (1/3) overall. More interestingly, subjects were globally better at identifying the phonemes when they saw their own productions than the productions of others. This is a surprising finding because people rarely see their own lips while talking; it supports the idea that either an internal articulatory simulation enhances the perception of speech, or that intrinsic knowledge of the kinematics of their own lip movements improves the identification of the visual outcomes of those movements. |
|---|
| responsibles | Burle, Blanc, Roll |
|---|