Human–Computer Interaction | 2019

Detecting Intention Through Motor-Imagery-Triggered Pupil Dilations


Abstract


Human–computer interaction systems that bypass manual control can be beneficial for many use cases, including users with severe motor disability. We investigated pupillometry (inferring mental activity via dilations of the pupil) as an interaction method because it is noninvasive, easy to analyse, and increasingly available for practical development. In 3 experiments we investigated the efficacy of using pupillometry to detect imaginary motor movements of the hand. In Experiment 1 we demonstrated that, on average, the pupillary response is greater when the participant is imagining a hand-grasping motion, as compared with the control condition. In Experiment 2 we investigated how imaginary hand-grasping affects the pupillary response over time. In Experiment 3 we employed a simple classifier to demonstrate single-trial detection of imagined motor events using pupillometry. Using the mean pupil diameter of a single trial, accuracy rates as high as 71.25% were achieved. Implications for the development of a pupillometry-based switch and future directions are discussed.
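The abstract does not specify which classifier was used or how the feature was thresholded; the sketch below is an illustration only, assuming a logistic-regression classifier trained on the per-trial mean pupil diameter, with synthetic data standing in for the recorded pupil traces. It shows the general shape of single-trial detection from a one-dimensional pupillometric feature, not the authors' actual pipeline.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for pupil traces: 40 "imagery" and 40 "control" trials,
# each a 1-D pupil-diameter time series (arbitrary units). The study reports
# that imagery trials show a larger average dilation than control trials.
n_trials, n_samples = 40, 500
control = rng.normal(loc=3.0, scale=0.15, size=(n_trials, n_samples))
imagery = rng.normal(loc=3.1, scale=0.15, size=(n_trials, n_samples))

# One feature per trial: the mean pupil diameter over the trial window.
X = np.concatenate([imagery.mean(axis=1), control.mean(axis=1)]).reshape(-1, 1)
y = np.concatenate([np.ones(n_trials), np.zeros(n_trials)])  # 1 = imagery trial

# Simple linear classifier on the single feature, evaluated with
# 5-fold cross-validation to estimate single-trial detection accuracy.
clf = LogisticRegression()
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean single-trial accuracy: {scores.mean():.2%}")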

Volume 34
Pages 83 - 113
DOI 10.1080/07370024.2017.1293540
Language English
Journal Human–Computer Interaction
