Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Sayeh Sharify is active.

Publication


Featured research published by Sayeh Sharify.


International Symposium on Microarchitecture | 2017

Bit-pragmatic deep neural network computing

Jorge Albericio; Alberto Delmas; Patrick Judd; Sayeh Sharify; Gerard O'Leary; Roman Genov; Andreas Moshovos

Deep Neural Networks expose a high degree of parallelism, making them amenable to highly data parallel architectures. However, data-parallel architectures often accept inefficiency in individual computations for the sake of overall efficiency. We show that on average, activation values of convolutional layers during inference in modern Deep Convolutional Neural Networks (CNNs) contain 92% zero bits. Processing these zero bits entails ineffectual computations that could be skipped. We propose Pragmatic (PRA), a massively data-parallel architecture that eliminates most of the ineffectual computations on-the-fly, improving performance and energy efficiency compared to state-of-the-art high-performance accelerators [5]. The idea behind PRA is deceptively simple: use serial-parallel shift-and-add multiplication while skipping the zero bits of the serial input. However, a straightforward implementation based on shift-and-add multiplication yields unacceptable area, power and memory access overheads compared to a conventional bit-parallel design. PRA incorporates a set of design decisions to yield a practical, area and energy efficient design. Measurements demonstrate that for convolutional layers, PRA is 4.31× faster than DaDianNao [5] (DaDN) using a 16-bit fixed-point representation. While PRA requires 1.68× more area than DaDN, the performance gains yield a 1.70× increase in energy efficiency in a 65nm technology. With 8-bit quantized activations, PRA is 2.25× faster and 1.31× more energy efficient.
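
As a rough software sketch of this serial-parallel shift-and-add idea (an illustration only, not the paper's hardware design; the helper names set_bit_offsets and pragmatic_mac are made up here, and unsigned fixed-point operands are assumed), each product w * a can be formed by shifting w once per set bit of a and accumulating, so zero activation bits incur no work:

def set_bit_offsets(activation: int):
    """Yield the positions of the set bits of a non-negative activation."""
    offset = 0
    while activation:
        if activation & 1:
            yield offset
        activation >>= 1
        offset += 1

def pragmatic_mac(weights, activations, acc=0):
    """Accumulate sum(w * a) by shift-and-add over set activation bits only."""
    for w, a in zip(weights, activations):
        for off in set_bit_offsets(a):
            acc += w << off  # one shift-and-add per effectual (non-zero) bit
    return acc

# Example: the all-zero activation contributes no work at all.
weights = [3, 5, 2]
activations = [0b00000110, 0b00000000, 0b00010001]
assert pragmatic_mac(weights, activations) == sum(w * a for w, a in zip(weights, activations))

The sketch only mirrors the arithmetic; in the accelerator described in the paper, skipping these zero bits on the fly is what yields the performance and energy gains.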


Design Automation Conference | 2018

Loom: exploiting weight and activation precisions to accelerate convolutional neural networks

Sayeh Sharify; Alberto Delmas Lascorz; Kevin Siu; Patrick Judd; Andreas Moshovos



IEEE Micro | 2018

Value-Based Deep Learning Hardware Accelerators

Andreas Moshovos; Jorge Albericio; Patrick Judd; Alberto Delmas Lascorz; Sayeh Sharify; Tayler H. Hetherington; Tor M. Aamodt; Natalie D. Enright Jerger



IEEE Computer | 2018

Exploiting Typical Values to Accelerate Deep Learning

Andreas Moshovos; Jorge Albericio; Patrick Judd; Alberto Delmas Lascorz; Sayeh Sharify; Zissis Poulos; Tayler H. Hetherington; Tor M. Aamodt; Natalie D. Enright Jerger



arXiv: Neural and Evolutionary Computing | 2017

Dynamic Stripes: Exploiting the Dynamic Precision Requirements of Activation Values in Neural Networks

Alberto Delmas; Patrick Judd; Sayeh Sharify; Andreas Moshovos



arXiv: Neural and Evolutionary Computing | 2017

Tartan: Accelerating Fully-Connected and Convolutional Layers in Deep Learning Networks by Exploiting Numerical Precision Variability

Alberto Delmas Lascorz; Sayeh Sharify; Patrick Judd; Andreas Moshovos



arXiv: Learning | 2017

Cnvlutin2: Ineffectual-Activation-and-Weight-Free Deep Neural Network Computing

Patrick Judd; Alberto Delmas Lascorz; Sayeh Sharify; Andreas Moshovos



arXiv: Neural and Evolutionary Computing | 2018

Bit-Tactical: Exploiting Ineffectual Computations in Convolutional Neural Networks: Which, Why, and How

Alberto Delmas; Patrick Judd; Dylan Malone Stuart; Zissis Poulos; Sayeh Sharify; Milos Nikolic; Andreas Moshovos



arXiv: Neural and Evolutionary Computing | 2018

DPRed: Making Typical Activation Values Matter in Deep Learning Computing

Alberto Delmas; Sayeh Sharify; Patrick Judd; Milos Nikolic; Andreas Moshovos



arXiv: Neural and Evolutionary Computing | 2018

Laconic Deep Learning Computing

Sayeh Sharify; Alberto Delmas Lascorz; Milos Nikolic; Andreas Moshovos


Collaboration


Dive into Sayeh Sharify's collaborations.

Top Co-Authors

Tayler H. Hetherington

University of British Columbia


Tor M. Aamodt

University of British Columbia
