| field | value |
|---|---|
| old_uid | 15932 |
| title | Sensorimotor Integration |
| start_date | 2018/05/29 |
| schedule | 17h-19h |
| online | no |
| location_info | Rousselot room |
| details | Theme: Recent Developments in Articulatory Phonology |
| summary | Classic AP and task dynamics hypothesized that the goals of gestures (phonological units) were all defined with respect to constriction formation within the vocal tract, rather than to its auditory (and visual) consequences, and the production of gestures was modeled without the use of auditory feedback. Over the last 15 years, much has been learned about the role of sensory feedback (both actual and generated by an internal “forward model”) in speech production, through behavioral and neural experiments using altered feedback. These findings, which will be reviewed in this lecture, serve to sharpen the concept of the gesture as a coordinated unit of articulatory action that achieves a goal, where the goal can be defined in a variety of possible reference frames (e.g., auditory, acoustic, aerodynamic, geometric). Variation of the goal as a function of gesture type, and possibly as a function of language, will be presented, as will a new version of TaDA that can adapt to altered auditory feedback, as speakers do. More generally, recent studies using electrocorticography have revealed cortical maps of phonetic units in motor and sensory areas during both speech perception and production that show substantial topological differences. While this appears to leave auditory and motor representations untethered to one another, a new representation will be introduced, the Spatiotemporal Modulation Signature, which shows closely correlated temporal patterns of articulatory and auditory/acoustic change over the course of an utterance and can serve as a basis for sensorimotor integration. |
| responsibles | Isel |