2019 IEEE International Workshop on Signal Processing Systems (SiPS)

Improving Reliability of ReRAM-Based DNN Implementation through Novel Weight Distribution


Abstract


Binary deep neural networks implemented in resistive random access memory (ReRAM) for storage efficiency suffer from poor recognition performance in the presence of hardware errors. This paper addresses this problem by deriving a novel weight distribution and representation scheme that mitigates errors due to faulty ReRAM cells with minimal storage overhead. In the proposed scheme, the weight matrix is partitioned into grains, and each weight in a grain is represented by the sum of a multi-bit mean and a 1-bit deviation. The grain size as well as the mean-to-deviation ratio of the weights in a grain can be chosen such that the network is resilient to hardware errors. A hybrid processing-in-memory (PIM) architecture is proposed to support this scheme: the mean values are stored in a small SRAM and processed by a CMOS unit, while the deviations are stored and processed by the ReRAM unit. Compared to the baseline binary neural network, which fails in the presence of severe hardware errors, the proposed hybrid scheme shows only a mild degradation in recognition performance. Simulation results show the proposed scheme achieves 97.84% test accuracy (a 0.84% accuracy drop) on the MNIST dataset and 88.07% test accuracy (a 1.10% accuracy drop) on the CIFAR-10 dataset under 9.04% stuck-at-1 and 1.75% stuck-at-0 faults.
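
To make the grain-based weight representation concrete, the following is a minimal sketch (not the paper's implementation) of partitioning a weight matrix into grains and encoding each weight as a quantized per-grain mean plus a 1-bit deviation, with a simple stuck-at fault injection on the deviation bits. The function names, grain_size, mean_bits, and dev_scale values are illustrative assumptions; the paper's actual quantization and mean-to-deviation ratio may differ.

```python
import numpy as np


def grain_encode(W, grain_size=16, mean_bits=4, dev_scale=0.05):
    """Encode weights as per-grain multi-bit means plus 1-bit deviations.

    Hypothetical sketch: grain_size, mean_bits, and dev_scale are
    illustrative parameters, not values taken from the paper.
    """
    w = W.flatten()
    pad = (-len(w)) % grain_size
    w = np.pad(w, (0, pad))
    grains = w.reshape(-1, grain_size)

    # Multi-bit mean per grain, quantized to 2**mean_bits uniform levels.
    means = grains.mean(axis=1, keepdims=True)
    levels = 2 ** mean_bits
    lo, hi = w.min(), w.max()
    step = (hi - lo) / (levels - 1) if hi > lo else 1.0
    q_means = np.round((means - lo) / step) * step + lo

    # 1-bit deviation: the sign of the residual around the grain mean.
    devs = np.where(grains >= q_means, 1.0, -1.0)

    # Reconstructed weight = mean + scaled deviation.
    W_hat = (q_means + dev_scale * devs).flatten()[:W.size].reshape(W.shape)
    return q_means, devs, W_hat


def inject_stuck_at_faults(devs, p_sa1=0.0904, p_sa0=0.0175, rng=None):
    """Force a random fraction of deviation bits to +1 (stuck-at-1)
    or -1 (stuck-at-0), mimicking faulty ReRAM cells at the fault
    rates quoted in the abstract."""
    rng = np.random.default_rng() if rng is None else rng
    r = rng.random(devs.shape)
    faulty = devs.copy()
    faulty[r < p_sa1] = 1.0
    faulty[(r >= p_sa1) & (r < p_sa1 + p_sa0)] = -1.0
    return faulty
```

In this sketch, the grain size and dev_scale jointly set the mean-to-deviation ratio: because a stuck-at fault can only corrupt a 1-bit deviation, shrinking dev_scale (or choosing grains whose weights cluster tightly around the mean) bounds the error a faulty cell can introduce, while the multi-bit means remain protected in SRAM.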

Pages 189-194
DOI 10.1109/SiPS47522.2019.9020318
Language English
Journal 2019 IEEE International Workshop on Signal Processing Systems (SiPS)
