| old_uid | 2497 |
|---|---|
| title | Advanced Controls and Displays Group |
| start_date | 2007/03/22 |
| schedule | 15h |
| online | no |
| details | Unusual schedule |
| summary | This discussion will review basic ideas about auditory-haptic interaction; the results of studies on auditory-tactile asynchrony; and NASA applications to astronaut tool manipulation during extra-vehicular activities. Realistic simulation and perceived "immersion" within a multimodal display for telecommunication and entertainment can be enhanced, or completely degraded, as a function of inter-modal timing asynchronies between the multimodal rendering systems (system-system asynchrony). Tolerable asynchronies have been extensively examined for auditory-visual stimuli, but less so for auditory-haptic cues. Data are presented from two experiments in which auditory stimuli are varied in time of arrival (lead-lag) relative to a tactile pulse. Results indicate variability between participants in overall thresholds. Generally, thresholds are lower when auditory stimuli lead haptic stimuli than in the opposite condition. Applications of the work include substituting auditory for haptic cues in the conduct of complex activities in high-stress human interfaces. |
| responsibles | Paroubek, Turner |