Pragmatic and iconic functions of facial expressions in multimodal communication

title: Pragmatic and iconic functions of facial expressions in multimodal communication
start_date: 2023/02/24
schedule: 16h
online: no
location_info: Room 404 & Teams
summary: Facial expressions convey a wide variety of nuanced social messages. However, their sheer number, together with their combinatorial and temporal structure, makes them difficult to study. Traditionally, they have been studied in a theory- and hypothesis-driven manner, which constrains our understanding and leads to Western-centric biases. Here we present an alternative, more agnostic, data-driven psychophysics approach that overcomes these shortcomings. Our generative platform of dynamic 3D faces allows sampling from the full parameter space, including face shape, complexion, and face/eye movements. We can thus precisely model dynamic facial expressions of many social messages for any given culture. Analyses of these facial expression models further enable precise characterisation of which face movements are shared or differ cross-culturally, and of what social information they convey, including broad information (e.g., valence, arousal) and specific messages (e.g., happy, confused). Beyond affective signals, facial expressions in humans form part of a complex multimodal language system in which vocal and visual data streams (speech plus facial expressions and gesture) are integrated in real time to form complex multimodal messages. In the language sciences, however, facial expressions remain understudied and have mostly been examined qualitatively from production data that is ecologically rich but makes it hard to model how specific configurations of facial expressions modulate the perception of multimodal signals. We address this gap by combining our data-driven approach with speech and synchronised lip movements to test how facial expressions combine with speech. Using a psychophysical reverse correlation approach, we have, for instance, identified distinct face movements marking confidence and doubt when answering questions.
Additionally, we present evidence from a behavioural task showing that specific expressions of the eye and brow region can change the quantity perceived from spoken vague quantifiers such as ‘many’, suggesting that facial expressions can carry part of the meaning of multimodal messages.
responsibles: Esposito