Perceptual Interfaces and Vision-Based Interaction

old_uid: 277
title: Perceptual Interfaces and Vision-Based Interaction
start_date: 2005/11/30
schedule: 11h-12h30
online: no
summary: Although computer technology has improved exponentially for decades, there has apparently been no commensurate progress in human-computer interaction: we still interact through the same devices and styles we used two decades ago. The GUI-based style of interaction has made computers simpler and easier to use, especially for office productivity applications where computers serve as tools to accomplish well-defined tasks. However, as computers become more ubiquitous, take on a wide range of form factors, and are used in new ways, these standard interaction techniques will no longer adequately support the needs of users. To accommodate a wider range of scenarios, tasks, users, and preferences, we need to move toward interfaces that are natural, intuitive, adaptive, and unobtrusive. In recent years, perceptual and multimodal interfaces have emerged as an increasingly important research direction addressing the need for new interaction models and technologies. The general focus of this area is to integrate multiple perceptual modalities (such as computer vision, speech and sound processing, and haptic I/O) in order to provide interactions that are not feasible with standard interfaces. In this talk, I will describe this emerging area of research, focusing especially on computer vision as an input modality, and then discuss several relevant projects in the Four Eyes Lab at the University of California, Santa Barbara.
responsibles: Paroubek, Turner
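
To give a concrete flavor of computer vision as an input modality, here is a minimal sketch, not the speaker's system, that detects a face with OpenCV's stock Haar cascade and maps its center to a normalized pointer position. The webcam index, the cascade file, and the face-center-to-pointer mapping are all illustrative assumptions.

```python
import cv2

# Stock frontal-face Haar cascade shipped with OpenCV (path via cv2.data).
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # assumed: default webcam at index 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Illustrative mapping: the normalized face center could drive a
        # cursor or trigger interface events in a perceptual UI.
        cx = (x + w / 2) / frame.shape[1]
        cy = (y + h / 2) / frame.shape[0]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        print(f"pointer at ({cx:.2f}, {cy:.2f})")
    cv2.imshow("vision input", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break
cap.release()
cv2.destroyAllWindows()
```

In a real perceptual interface the detection step would typically be replaced by robust tracking (of hands, gaze, or body pose) and fused with other modalities such as speech, but the sketch shows the basic loop: capture, perceive, map to an interaction event.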