Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies | 2021

NeckFace


Abstract


Facial expressions are highly informative for computers seeking to understand and interpret a person's mental and physical activities. However, continuously tracking facial expressions, especially when the user is in motion, is challenging. This paper presents NeckFace, a wearable sensing technology that can continuously track full facial expressions using a neck-piece embedded with infrared (IR) cameras. A customized deep learning pipeline called NeckNet, based on ResNet-34, is developed to learn from the captured IR images of the chin and face and to output 52 parameters representing the facial expressions. We demonstrated NeckFace in two common neck-mounted form factors, a necklace and a neckband (e.g., neck-mounted headphones), and evaluated both in a user study with 13 participants. The study results showed that NeckFace worked well when the participants were sitting, walking, or after remounting the device. We discuss the challenges and opportunities of using NeckFace in real-world applications.
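The abstract describes NeckNet as a ResNet-34-based regressor mapping IR images to 52 expression parameters. The sketch below illustrates what such a pipeline could look like in PyTorch; it is a minimal illustration, not the paper's implementation. The single-channel input, 224x224 resolution, and the `NeckNetSketch` name are assumptions, since the abstract does not specify how the two cameras' images are combined or preprocessed.

```python
# Minimal sketch of a NeckNet-style regressor: a ResNet-34 backbone
# whose classification head is replaced by a 52-way regression head.
# Assumptions (not from the paper): single-channel IR input; the actual
# NeckNet may stack frames from both cameras or use a different head.
import torch
import torch.nn as nn
from torchvision.models import resnet34


class NeckNetSketch(nn.Module):
    """ResNet-34 backbone regressing 52 facial-expression parameters."""

    def __init__(self, num_params: int = 52, in_channels: int = 1):
        super().__init__()
        self.backbone = resnet34(weights=None)
        # Adapt the first conv layer to single-channel IR images
        # (hypothetical choice; two-camera input could use in_channels=2).
        self.backbone.conv1 = nn.Conv2d(
            in_channels, 64, kernel_size=7, stride=2, padding=3, bias=False
        )
        # Swap the 1000-way ImageNet classifier for a 52-dim regressor.
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, num_params)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.backbone(x)


if __name__ == "__main__":
    model = NeckNetSketch()
    ir_batch = torch.randn(4, 1, 224, 224)  # dummy IR frames
    params = model(ir_batch)
    print(params.shape)  # torch.Size([4, 52])
```

Such a model would typically be trained with a regression loss (e.g., mean squared error) against ground-truth expression parameters captured by a reference face-tracking system; the paper's actual training procedure is not detailed in the abstract.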

Volume 5
Pages 1–31
DOI 10.1145/3463511
Language English
Journal Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
