| old_uid | 5503 |
|---|---|
| title | Auditory-Visual Perception and Production of Phones and Tones |
| start_date | 2008/10/31 |
| schedule | 11h-11h45 |
| online | no |
| location_info | amphi G2 |
| summary | There is evidence, mostly from phones (consonants and vowels), that speech perception is an auditory-visual phenomenon. Here the visual concomitants of lexical tone are considered. In tone languages, fundamental frequency variations signal lexical meaning. In a word identification experiment with auditory-visual words differing only in tone, Cantonese perceivers performed above chance in a Visual Only condition. A subsequent study showed enhanced word-pair discrimination in noise in Auditory-Visual over Auditory Only conditions for Cantonese speakers, speakers of Thai (also a tone language), and even non-tone-language Australian speakers. The source of this perceptual information was sought in an OPTOTRAK production study of a Cantonese speaker. Functional Data Analysis and Principal Component (PC) extraction suggest that the PCs most salient for distinguishing tones involve rigid motion of the head rather than non-rigid face motion. Results of a final perception study, using OPTOTRAK output in which rigid or non-rigid motion could be presented independently for words differing in tone or in phone, suggest that non-rigid motion is most useful for phone discrimination, whereas rigid motion is most useful for tone discrimination. |
| responsibles | Grimault |