Motion Sound Music Interaction

old_uid: 13143
title: Motion Sound Music Interaction
start_date: 2013/12/05
schedule: 15h15
online: no
location_info: Ampère
summary: I will present an overview of research performed by the Real-Time Musical Interactions team at IRCAM in Paris. We have developed various methods and systems for interaction between gesture, motion, and digital media. This research has been shaped by sustained collaborations with musicians/composers and dancers/choreographers. For example, studying performer movement allowed us to formalize key concepts of continuous gesture control, gesture co-articulation, and movement qualities. These concepts guided the design of several real-time gesture analysis systems based on machine learning techniques, such as the gesture follower, which enables gesture synchronization with sound synthesis. Concrete applications will be presented in music performance and pedagogy (using the Modular Musical Objects from the ANR Interlude project) and in musical games (Urban Musical Game). Finally, I will present recent research on sensorimotor learning with auditory feedback, which opens novel perspectives for the design of musical interfaces and for medical applications such as rehabilitation (ANR project Legos).
responsibles: Lœvenbruck, Welby