2019 IEEE 16th International Conference on Wearable and Implantable Body Sensor Networks (BSN)

Assessing Individual Dietary Intake in Food Sharing Scenarios with a 360 Camera and Deep Learning


Abstract


This paper proposes a novel vision-based approach for estimating individual dietary intake in food sharing scenarios, integrating food detection, face recognition, and hand tracking techniques. The method is validated on panoramic videos that capture subjects' eating episodes. The results demonstrate that the proposed approach reliably estimates each individual's food intake as well as the order in which foods are eaten. To identify the food items ingested by each subject, a transfer learning approach is designed: 4,200 food images with segmentation masks, 1,500 of which are newly annotated, are used to fine-tune a deep neural network for the targeted food intake application. In addition, a method for associating detected hands with subjects is developed, and the outcomes of face recognition are refined, to enable the quantification of individual dietary intake in communal eating settings.

Pages 1-4
DOI 10.1109/BSN.2019.8771095
Language English
