Material perception from motion and cross-modal integration

old_uid: 14762
title: Material perception from motion and cross-modal integration
start_date: 2014/12/04
schedule: 11h15
online: no
location_info: salle Jules Ferry
summary: Traditional vision science takes a relatively simple view that basic visual attributes such as color, shape, depth, and motion are analyzed separately in early visual cortical areas, with their outputs later combined into object representations. Recent advances in image processing technology, however, have convinced us that merely combining these basic visual attributes is far from sufficient to represent real-world visual scenes. Material perception is the remaining hard problem of sensory science that we should tackle to bridge the gap between our laboratories and the real world. Most research on human visual recognition focuses on solid objects, whose identity is defined primarily by shape. In daily life, however, we often encounter materials that have no specific form, including liquids whose shape changes dynamically over time. We recently found that human observers can recognize liquids and their viscosities solely from image motion flow information, or from dynamic deformation information. This is the first topic of my talk. In the second half, I will turn to cross-modal integration of material information; for instance, I will discuss how the brain integrates visual appearance and impact sound when inferring what an object is made from.
responsibles: <not specified>
speakers: