Proceedings of the ACM on Human-Computer Interaction | 2019

Designing with Gaze

Abstract


Recent developments in gaze tracking present new opportunities for social computing. This paper presents a study of Tama, a gaze-actuated smart speaker. Tama was designed to take advantage of research on gaze in conversation. Rather than being activated with a wake word (such as "Ok Google"), Tama detects the gaze of a user, moving an articulated head to achieve mutual gaze. We tested Tama's use in a multi-party conversation task, with users successfully activating and receiving a response to over 371 queries (over 10 trials). When Tama worked well, there was no significant difference in the length of interaction. However, interactions with Tama had a higher rate of repeated queries, causing longer interactions overall. Video analysis lets us explain the problems users had when interacting with gaze. In the discussion, we describe implications for designing new gaze systems, using gaze both as input and output. We also discuss the relationship to anthropomorphic design and how to take advantage of learned skills of interaction. Finally, two paths for future work are proposed: one in the field of speech agents, and the second in using human gaze as an interaction modality more widely.
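
The abstract describes the core interaction: gaze detection replaces wake-word activation, and the device responds with mutual gaze before listening. The sketch below illustrates such an activation loop in outline only; the dwell threshold, function names, and sensor interface are hypothetical assumptions for illustration and are not drawn from the Tama implementation described in the paper.

```python
# Minimal sketch of a gaze-triggered activation loop (illustrative, not Tama's code).
# The sensor and actuator functions are stubs standing in for a real gaze tracker,
# articulated head, and speech pipeline.

import time

GAZE_HOLD_SECONDS = 0.8  # assumed dwell time before a glance counts as intentional


def user_gaze_angle():
    """Stub: angle (degrees) between the user's gaze ray and the device, or None if no face."""
    return 0.0  # replace with output from an actual gaze tracker


def orient_head_towards_user():
    """Stub: turn the articulated head toward the detected user to establish mutual gaze."""
    print("head: turning toward user")


def listen_for_query():
    """Stub: open the microphone and return a transcribed query."""
    return "what's the weather tomorrow?"


def activation_loop():
    gaze_started = None
    while True:
        angle = user_gaze_angle()
        looking = angle is not None and angle < 10.0  # user appears to look at the device
        if looking:
            gaze_started = gaze_started or time.monotonic()
            if time.monotonic() - gaze_started >= GAZE_HOLD_SECONDS:
                orient_head_towards_user()  # signal availability with mutual gaze
                query = listen_for_query()  # no wake word required
                print("query:", query)
                gaze_started = None
        else:
            gaze_started = None
        time.sleep(0.05)
```

In this sketch the dwell time filters out incidental glances, standing in for whatever gaze-classification logic the actual system uses.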

Volume 3
Pages 1–26
DOI 10.1145/3359278
Language English
Journal Proceedings of the ACM on Human-Computer Interaction

Full Text