Proceedings of the 17th Conference on Embedded Networked Sensor Systems | 2019

MetaSense: few-shot adaptation to untrained conditions in deep mobile sensing


Abstract

Recent improvements in deep learning and hardware support offer a new breakthrough in mobile sensing: we could enjoy context-aware services and mobile healthcare on a mobile device powered by artificial intelligence. However, most related studies perform well only when the training and target data distributions are sufficiently similar, while in practice a specific user's behaviors and device make sensor inputs different. Consequently, the performance of such applications might suffer under diverse user and device conditions, as training deep models for such countless conditions is infeasible. To mitigate the issue, we propose MetaSense, an adaptive deep mobile sensing system that utilizes only a few (e.g., one or two) data instances from the target user. MetaSense employs meta learning, which learns how to adapt to the target user's condition by rehearsing multiple similar tasks generated from our unique task generation strategies in offline training. The trained model can rapidly adapt to the target user's condition when a few data instances are available. Our evaluation with real-world traces of motion and audio sensors shows that MetaSense not only outperforms the state-of-the-art transfer learning by 18% and meta learning based approaches by 15% in terms of accuracy, but also requires significantly less adaptation time for the target user.
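The paper's exact architecture and task generation strategies are not reproduced here, but the abstract's description matches the general shape of optimization-based meta learning. Below is a minimal first-order MAML-style sketch in PyTorch to illustrate that idea: an inner loop adapts a copy of the model to one task, and an outer loop updates the meta-parameters so that such adaptation works well. The `sample_task` helper, network shape, hyperparameters, and random data are hypothetical placeholders standing in for the paper's task generation, not the authors' implementation.

```python
import copy

import torch
import torch.nn as nn
import torch.nn.functional as F


def sample_task(n_way=4, k_shot=2, dim=64):
    """Hypothetical stand-in for the paper's task generation: each task
    plays the role of one simulated user/device condition, represented
    here by random data for illustration only."""
    def make_batch():
        x = torch.randn(n_way * k_shot, dim)
        y = torch.arange(n_way).repeat_interleave(k_shot)
        return x, y
    return make_batch(), make_batch()  # (support set, query set)


model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 4))
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(1000):  # offline meta-training over many tasks
    (xs, ys), (xq, yq) = sample_task()
    learner = copy.deepcopy(model)  # task-specific copy of the meta-model
    inner_opt = torch.optim.SGD(learner.parameters(), lr=0.1)
    for _ in range(5):  # inner loop: adapt to the task's support set
        inner_opt.zero_grad()
        F.cross_entropy(learner(xs), ys).backward()
        inner_opt.step()
    # Outer loop (first-order approximation): grade the adapted copy on
    # the task's query set and apply its gradients to the meta-parameters.
    learner.zero_grad()
    F.cross_entropy(learner(xq), yq).backward()
    meta_opt.zero_grad()
    for p, lp in zip(model.parameters(), learner.parameters()):
        p.grad = lp.grad.clone()
    meta_opt.step()
```

Under this reading, few-shot adaptation at deployment would amount to running only the inner loop on the one or two labeled instances the target user provides, which is why adaptation is fast relative to retraining.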

DOI 10.1145/3356250.3360020
Language English
Journal Proceedings of the 17th Conference on Embedded Networked Sensor Systems
