2021 26th Asia and South Pacific Design Automation Conference (ASP-DAC) | 2021

Attention-in-Memory for Few-Shot Learning with Configurable Ferroelectric FET Arrays

Abstract


Attention-in-Memory (AiM), a computing-in-memory (CiM) design, is introduced to implement the attentional layer of Memory Augmented Neural Networks (MANNs). AiM consists of a memory array based on Ferroelectric FETs (FeFETs) along with CMOS peripheral circuits that implement configurable functionality, i.e., the array can be dynamically reconfigured between a ternary content-addressable memory (TCAM) and a general-purpose (GP) CiM. When compared to state-of-the-art accelerators, AiM achieves comparable end-to-end speed-up and energy for MANNs, with better accuracy (95.14% vs. 92.21% and 95.14% vs. 91.98%) at iso-memory size, for a 5-way 5-shot inference task on the Omniglot dataset.
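The attentional layer that AiM accelerates is, at its core, a content-based similarity search over stored support examples. As a minimal sketch (not the paper's implementation), the readout can be expressed as cosine similarity between a query and all memory keys, softmax attention weights, and a weighted sum over stored class labels; the sharpness factor below is an illustrative choice, not a parameter from the paper:

```python
import numpy as np

def mann_attention_readout(query, keys, values, sharpness=10.0):
    """Content-based attentional readout of a MANN (illustrative sketch).

    Cosine similarity between the query and each stored key yields
    attention weights via a (sharpened) softmax; the readout is the
    weighted sum of stored values. The TCAM / GP-CiM array search in
    designs like AiM targets exactly this similarity-search step.
    """
    # Normalize query and memory rows so dot products are cosine similarities
    q = query / np.linalg.norm(query)
    k = keys / np.linalg.norm(keys, axis=1, keepdims=True)
    sims = k @ q
    # Sharpened softmax over similarities -> attention weights summing to 1
    w = np.exp(sharpness * (sims - sims.max()))
    w /= w.sum()
    return w @ values

# Toy 5-way 5-shot episode: 25 support keys with one-hot class labels
rng = np.random.default_rng(0)
keys = rng.standard_normal((25, 16))
labels = np.repeat(np.eye(5), 5, axis=0)          # rows 5..9 are class 1
query = keys[7] + 0.1 * rng.standard_normal(16)    # noisy copy of a class-1 key
scores = mann_attention_readout(query, keys, labels)
pred = int(np.argmax(scores))
```

Since the query is a lightly perturbed copy of a class-1 support key, the attention mass concentrates on that key and `pred` recovers class 1.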

Pages 49-54
DOI 10.1145/3394885.3431526
Language English
Journal 2021 26th Asia and South Pacific Design Automation Conference (ASP-DAC)