13th International Conference on Automotive User Interfaces and Interactive Vehicular Applications | 2021

Advancing In-vehicle Gesture Interactions with Adaptive Hand-Recognition and Auditory Displays

 
 
 
 

Abstract


Competition for drivers' visual attention has increased with the integration of touch-based interfaces in vehicles, raising crash risk. To mitigate this visual distraction, we designed an in-vehicle gesture-based menu system with different auditory feedback types and hand-recognition systems. We are conducting a driving-simulator experiment in which participants perform a secondary task of selecting a menu item. Three auditory feedback types are tested in addition to a no-audio baseline: auditory icons, earcons, and spearcons. For each type of auditory display, two hand-recognition systems are tested: fixed and adaptive. We expect to reduce drivers' secondary-task workload while minimizing off-road glances for safety. Our experiment would contribute to the existing literature on multimodal signal processing, providing evidence for Multiple Resource Theory, and would offer practical design guidelines for auditory feedback in gesture-based in-vehicle interactions.
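The contrast between the fixed and adaptive hand-recognition conditions can be illustrated with a minimal sketch. This is hypothetical: the abstract does not specify the recognition algorithm, so the confidence threshold, the exponential-moving-average update, and the `margin` parameter below are all assumptions made purely for illustration.

```python
# Hypothetical sketch only -- not the authors' implementation.
# A "fixed" recognizer accepts a gesture when its confidence clears
# a static threshold; an "adaptive" recognizer shifts its threshold
# toward the user's recent gesture confidences.

FIXED_THRESHOLD = 0.7  # assumed value for illustration

def fixed_accept(confidence: float) -> bool:
    """Fixed system: static acceptance threshold."""
    return confidence >= FIXED_THRESHOLD

class AdaptiveRecognizer:
    """Adaptive system: the threshold tracks an exponential moving
    average of recent gesture confidences (illustrative only)."""

    def __init__(self, start: float = 0.7, alpha: float = 0.2,
                 margin: float = 0.1):
        self.threshold = start
        self.alpha = alpha    # smoothing factor (assumed)
        self.margin = margin  # tolerance below the running average

    def accept(self, confidence: float) -> bool:
        ok = confidence >= self.threshold - self.margin
        # Update the running estimate of this user's typical confidence,
        # so a consistently low- or high-confidence user shifts the bar.
        self.threshold = ((1 - self.alpha) * self.threshold
                          + self.alpha * confidence)
        return ok
```

Under this sketch, a user whose gestures are consistently recognized with low confidence (e.g., due to hand size or gesturing style) would gradually lower the adaptive threshold instead of being repeatedly rejected, which is one plausible way an adaptive system could reduce secondary-task workload.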

DOI 10.1145/3473682.3481870
Language English
