
Gesture-SNN: Co-optimizing accuracy, latency and energy of SNNs for neuromorphic vision sensors

Abstract


Spiking neural networks (SNNs) have recently gained popularity due to their low-power, spatio-temporal computing paradigm, in contrast to more conventional deep learning approaches that focus mainly on the spatial characteristics of data. When paired with biologically-inspired asynchronous event sensors, they can form energy-efficient near-sensor systems that are ideal for mobile, resource-constrained, and embedded-computing scenarios. Training deep SNNs, however, is challenging due to their discrete nature. The most successful method so far trains deep artificial neural networks (ANNs) with gradient descent and then converts them to SNNs. This ANN-to-SNN conversion technique has mostly been evaluated on standard static image datasets using rate-based encoding of spikes. In this work, we find that directly applying ANN-to-SNN conversion to process event data with SNNs leads to arbitrary accuracy losses. Through insights gained from theoretical analyses as well as empirical observations, we propose three novel techniques to restore the conversion accuracy on event data and show proof-of-concept results, comparable to the state of the art, on the IBM DVS Gesture dataset. Further exploration of the SNN design space reveals additional insights for fine-tuning the accuracy-latency-peak-power trade-off. Finally, we evaluate our proposed schemes on an existing neuromorphic accelerator and show that our best-performing model is ~38% more accurate, with ~35% lower energy and ~55% lower energy-delay product (EDP), than its traditional SNN counterpart.
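
The abstract refers to rate-based spike encoding and ANN-to-SNN conversion. The sketch below is not from the paper; it is a minimal NumPy illustration of the two standard ingredients the abstract assumes: Poisson rate coding of input intensities, and an integrate-and-fire (IF) layer with reset-by-subtraction, the mechanism by which firing rates in a converted SNN approximate the source ANN's ReLU activations. All names (rate_encode, if_layer) and parameter values are illustrative assumptions, not the authors' implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    def rate_encode(x, num_steps, rng):
        # Poisson-style rate coding: an intensity in [0, 1] becomes the
        # per-step firing probability, giving a binary spike train of
        # shape (num_steps, *x.shape).
        return rng.random((num_steps, *x.shape)) < x

    def if_layer(spikes, w, v_th=1.0):
        # Integrate-and-fire layer with reset-by-subtraction ("soft reset"),
        # the variant commonly used to reduce ANN-to-SNN conversion error.
        num_steps = spikes.shape[0]
        v = np.zeros(w.shape[1])                    # membrane potentials
        out = np.zeros((num_steps, w.shape[1]), dtype=bool)
        for t in range(num_steps):
            v += spikes[t].astype(float) @ w        # integrate weighted input spikes
            out[t] = v >= v_th                      # fire where threshold is crossed
            v[out[t]] -= v_th                       # subtract threshold, keep residual
        return out

    # Usage: encode a toy 4-pixel "frame" and decode the output by spike counts.
    frame = np.array([0.1, 0.9, 0.5, 0.0])
    weights = rng.normal(0.0, 0.5, size=(4, 2))
    spikes = rate_encode(frame, num_steps=100, rng=rng)
    rates = if_layer(spikes, weights).mean(axis=0)
    print(rates)  # approximates max(0, frame @ weights) / v_th for large num_steps

The paper's observation is that this rate-coding assumption, which holds well for static images, breaks down when the input is itself an asynchronous event stream, which motivates the three corrective techniques it proposes.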

Pages 1-6
DOI 10.1109/ISLPED52811.2021.9502506
Language English
Journal 2021 IEEE/ACM International Symposium on Low Power Electronics and Design (ISLPED)
