Journal on Multimodal User Interfaces | 2019

Interactive gaze and finger controlled HUD for cars

Abstract


Modern infotainment systems in automobiles facilitate driving but add secondary tasks to the primary task of driving. These secondary tasks have a considerable chance of distracting a driver from the primary driving task, thereby reducing safety or increasing cognitive workload. This paper presents an intelligent interactive head-up display (HUD) on the windscreen that does not require drivers to take their eyes off the road while undertaking secondary tasks such as playing music, operating vent controls, or viewing the navigation map. The interactive HUD allows interaction in the form of pointing and selection, just like traditional graphical user interfaces, but by tracking the operator's eye gaze or finger movements. Additionally, the system can estimate drivers' cognitive load and distraction level. User studies show that the system improves driving performance in terms of mean deviation from the lane in an ISO 26022 lane-changing task compared to a touchscreen system, and that participants can undertake ISO 9241 pointing tasks in less than 2 s on average inside a Toyota Etios car.
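To make the pointing-and-selection idea in the abstract concrete, the sketch below shows one generic way gaze samples projected onto a HUD plane could drive target selection via dwell time. This is an illustrative assumption, not the paper's implementation: the class names (HudButton, DwellSelector), the HUD coordinate convention, and the 1000 ms dwell threshold are all hypothetical.

# Minimal sketch of gaze-based pointing with dwell-time selection on a HUD.
# Names and the dwell threshold are illustrative assumptions, not taken from the paper.

from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class HudButton:
    """A rectangular target projected on the windscreen HUD (hypothetical layout)."""
    name: str
    x: float       # left edge in HUD coordinates (pixels)
    y: float       # top edge in HUD coordinates (pixels)
    width: float
    height: float

    def contains(self, gx: float, gy: float) -> bool:
        return (self.x <= gx <= self.x + self.width
                and self.y <= gy <= self.y + self.height)


class DwellSelector:
    """Selects a button once gaze (or a projected fingertip) rests on it for dwell_ms."""

    def __init__(self, buttons: List[HudButton], dwell_ms: float = 1000.0):
        self.buttons = buttons
        self.dwell_ms = dwell_ms
        self._current: Optional[HudButton] = None
        self._dwell_start: Optional[float] = None

    def update(self, gaze: Tuple[float, float], t_ms: float) -> Optional[str]:
        """Feed one pointer sample (HUD coords, timestamp in ms); return a button name on selection."""
        gx, gy = gaze
        hit = next((b for b in self.buttons if b.contains(gx, gy)), None)
        if hit is not self._current:
            # Pointer moved onto a new target (or off all targets): restart the dwell timer.
            self._current = hit
            self._dwell_start = t_ms if hit else None
            return None
        if hit is not None and t_ms - self._dwell_start >= self.dwell_ms:
            # Dwell threshold reached: trigger selection and reset to avoid repeated firing.
            self._dwell_start = t_ms
            return hit.name
        return None


if __name__ == "__main__":
    buttons = [HudButton("music", 100, 50, 120, 60),
               HudButton("vent", 260, 50, 120, 60)]
    selector = DwellSelector(buttons, dwell_ms=1000.0)
    # Synthetic 60 Hz gaze samples resting on the "music" button for about 1.2 s.
    for i in range(72):
        t = i * (1000.0 / 60.0)
        choice = selector.update((150.0, 80.0), t)
        if choice:
            print(f"selected: {choice} at {t:.0f} ms")

The same update loop would apply whether the on-screen pointer is driven by an eye tracker or by a tracked fingertip; only the source of the (x, y) samples changes.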

Volume 14
Pages 101-121
DOI 10.1007/s12193-019-00316-9
Language English
Journal Journal on Multimodal User Interfaces

Full Text