Intentional vision

old_uid: 10109
title: Intentional vision
start_date: 2011/07/04
schedule: 11h
online: no
location_info: 2nd floor, Salle Sabatier B
summary: The brain has evolved to generate action in the real world, and sensory modalities are optimized to support this objective. For action and perception to support survival, behavior must satisfy specific needs and the goals derived from them. This raises the fundamental question of how goals and perception are integrated. I will investigate this question from the perspective of a neuromimetic, robot-based cognitive architecture called Distributed Adaptive Control. It assumes that the integration of perception, action, and motivation is structured at three distinct levels: reactive, adaptive, and contextual. I will focus in particular on the adaptive layer, where the state space of the environment is constructed, and will show how key anatomical features of the visual cortex can give rise to a visual processing system that provides rapid, non-hierarchical classification and processing of complex stimuli, such as faces, while allowing for state-dependent modulation of the processing stream to support segmentation using a Temporal Population Code. I will discuss experiments with a rodent-based mobile platform and the humanoid robot iCub. Subsequently, I will present an integrated framework that shows how acquired cognitive structures can be combined with feed-forward visual processing, giving rise to a self-contained, real-world intentional vision (and action) system, and will present psychophysical experiments that validate key predictions of this framework.
responsibles: Joelson