
A multiscript gaze-based assistive virtual keyboard


Abstract


The recent development of inexpensive and accurate eye-trackers allows the creation of gaze-based virtual keyboards that can be used by a large population of disabled people in developing countries. Thanks to eye-tracking technology, gaze-based virtual keyboards can be designed to account for constraints related to gaze detection accuracy and the display device being used. In this paper, we propose a new multimodal, multiscript gaze-based virtual keyboard in which the layout of the graphical user interface can be changed according to the script. Traditionally, virtual keyboards are assessed for a single language (e.g., English). We propose a multiscript gaze-based virtual keyboard that can be used by people who communicate with the Latin, Bangla, and/or Devanagari scripts. We evaluate the performance of the virtual keyboard with two main groups of participants: 28 people who can communicate in both Bangla and English, and 24 people who can communicate in both Devanagari and English. Performance is assessed in terms of the information transfer rate while participants spelled a sentence using their gaze for pointing to a command and a dedicated mouth switch for command selection. The results support the conclusion that the system is efficient, with no difference in information transfer rate between Bangla and Devanagari. However, performance was higher with English, despite it being the secondary language of the participants.
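The abstract does not state which definition of information transfer rate was used; a common choice in gaze- and BCI-based spelling studies is Wolpaw's formula, which combines the number of selectable commands, the selection accuracy, and the selection rate. The sketch below is a minimal illustration under that assumption; the function name and the example parameters (a 9-command layout, 95% accuracy, 10 selections per minute) are hypothetical and not taken from the paper.

import math

def wolpaw_itr_bits_per_min(n_targets, accuracy, selections_per_min):
    # Hypothetical helper: Wolpaw's information transfer rate (assumption,
    # not necessarily the metric used in the paper).
    #   n_targets          -- number of selectable commands in the layout
    #   accuracy           -- probability of a correct selection (0 < accuracy <= 1)
    #   selections_per_min -- selection rate achieved by the participant
    # Returns the transfer rate in bits per minute.
    if accuracy >= 1.0:
        bits_per_selection = math.log2(n_targets)
    else:
        bits_per_selection = (
            math.log2(n_targets)
            + accuracy * math.log2(accuracy)
            + (1 - accuracy) * math.log2((1 - accuracy) / (n_targets - 1))
        )
    return bits_per_selection * selections_per_min

# Example: a 9-command layout, 95% correct selections, 10 selections per minute.
print(round(wolpaw_itr_bits_per_min(9, 0.95, 10), 1))  # ~27.3 bits/min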

Pages 1306-1309
DOI 10.1109/EMBC.2019.8856446
Language English
Journal 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
