Signal integration in shape perception through active touch

old_uid: 51
title: Signal integration in shape perception through active touch
start_date: 2005/10/10
schedule: 11h-13h
online: no
location_info: room 3037
details: Invited by Florian Waszak (Equipe Perception Visuelle)
summary: Everyday perception is based on multiple sensory signals, many of which provide information about the same physical property in the world. For instance, we can both see and feel the shape of an object that we hold in our hands. The integration of such signals into a unitary percept has been well described by the Maximum-Likelihood-Estimate (MLE) model, albeit for passive perception. We explored signal integration during active touch using a recent distinction between position and force signals for shape (Robles-de-la-Torre & Hayward, 2001). When sliding a finger across a bumpy surface, the finger follows the surface geometry (position cue). At the same time, the finger is exposed to forces related to the slope of the surface (force cue). We systematically disentangled force and position signals to the curvature of 3D arches and found that observers use both force and position signals, integrating them into a single percept by weighted averaging. Further studies, in which we manipulated the material properties of the arches, indicate that the weights of the signals systematically co-vary with reliability. These results confirm predictions of the MLE model and thus extend it to situations involving active touch.
responsibles: Cohen
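The weighted-averaging rule from the MLE model mentioned in the summary can be sketched as follows: each cue's weight is proportional to its reliability (the inverse of its variance), and the combined estimate is more reliable than either cue alone. This is a minimal illustrative sketch, not the authors' analysis code; the cue values and variances below are hypothetical.

```python
# MLE cue integration by reliability-weighted averaging (sketch).
# Each weight is the cue's inverse variance, normalized over all cues.

def mle_integrate(estimates, variances):
    """Combine single-cue estimates into one percept by reliability weighting."""
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    combined = sum(w * s for w, s in zip(weights, estimates))
    # The combined estimate's variance is lower than any single cue's:
    combined_variance = 1.0 / total
    return combined, weights, combined_variance

# Hypothetical curvature estimates (arbitrary units):
# position cue = 1.2 (variance 0.04), force cue = 0.8 (variance 0.16)
curvature, weights, variance = mle_integrate([1.2, 0.8], [0.04, 0.16])
# The more reliable position cue receives weight 0.8, the force cue 0.2.
```

Manipulating material properties, as in the further studies described above, would correspond here to changing a cue's variance and observing the weights shift accordingly.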