Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems | 2021

Demonstrating TapID for Rapid Touch Interaction on Surfaces in Virtual Reality for Productivity Scenarios


Abstract


We demonstrate a novel approach that brings rapid touch interaction on surfaces to Virtual Reality, a capability that remains challenging for current camera-based VR headsets, which support free-hand mid-air interaction or input through physical hand-held controllers. In our approach, our wrist-worn prototype TapID complements the optical hand tracking of VR headsets with inertial sensing to detect touch events on surfaces, establishing the same interaction modality found on today’s phones and tablets. Each TapID band integrates a pair of inertial sensors in a flexible strap; from their signals, TapID reliably detects surface touch events and identifies the finger used for the touch. These detected events are then fused with the optically tracked hand poses to trigger input in VR. Our demonstration comprises a series of VR applications, including UI control in word processors, web browsers, and document editors. Together, these applications showcase the benefits of rapid tapping, typing, and surface gestures in Virtual Reality.
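The abstract describes fusing wrist-worn inertial tap detection with the headset's optical hand tracking. The sketch below is an illustrative Python example only, not the authors' implementation: it assumes a hypothetical tap detector based on a peak in the high-pass-filtered acceleration, a placeholder finger classifier, and a dictionary of tracked fingertip positions, and shows how a detected tap could be combined with the most recent hand pose to emit a touch event.

```python
# Illustrative sketch only: fuses hypothetical IMU tap events with tracked
# fingertip positions to emit VR touch input. All names, thresholds, and the
# classifier are assumptions, not TapID's actual pipeline.
from dataclasses import dataclass
from typing import Optional
import numpy as np

TAP_THRESHOLD = 3.0  # assumed peak magnitude separating a tap from sensor noise


@dataclass
class TouchEvent:
    finger: str           # e.g. "index" or "middle", identified from the IMU signal
    position: np.ndarray  # 3D fingertip position from the headset's hand tracker


def detect_tap(accel_window: np.ndarray) -> bool:
    """Return True if the acceleration window contains a tap-like impulse."""
    high_pass = accel_window - accel_window.mean(axis=0)  # crude DC removal
    return float(np.linalg.norm(high_pass, axis=1).max()) > TAP_THRESHOLD


def classify_finger(accel_window: np.ndarray) -> str:
    """Placeholder for a trained classifier mapping the tap signature to a finger."""
    # A real system would use a learned model; here we return a fixed label.
    return "index"


def fuse(accel_window: np.ndarray,
         fingertip_positions: dict[str, np.ndarray]) -> Optional[TouchEvent]:
    """Combine an IMU-detected tap with the optically tracked fingertip pose."""
    if not detect_tap(accel_window):
        return None
    finger = classify_finger(accel_window)
    return TouchEvent(finger=finger, position=fingertip_positions[finger])


if __name__ == "__main__":
    # Synthetic example: 50 samples of 3-axis acceleration with an injected spike.
    window = np.random.normal(0.0, 0.1, size=(50, 3))
    window[25] += np.array([0.0, 0.0, 5.0])  # simulated tap impulse
    tracked = {"index": np.array([0.12, 0.80, 0.30]),
               "middle": np.array([0.10, 0.80, 0.30])}
    print(fuse(window, tracked))
```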

DOI 10.1145/3411763.3451553
Language English
Journal Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems
