
Publication


Featured research published by Yunning Li.


Nature Communications | 2017

A novel true random number generator based on a stochastic diffusive memristor

Hao Jiang; Daniel Belkin; Sergey Savel’ev; Siyan Lin; Zhongrui Wang; Yunning Li; Saumil Joshi; Rivu Midya; Can Li; Mingyi Rao; Mark Barnell; Qing Wu; Jianhua Yang; Qiangfei Xia

The intrinsic variability of switching behavior in memristors has been a major obstacle to their adoption as the next generation of universal memory. On the other hand, this natural stochasticity can be valuable for hardware security applications. Here we propose and demonstrate a novel true random number generator utilizing the stochastic delay time of threshold switching in a Ag:SiO2 diffusive memristor, which exhibits evident advantages in scalability, circuit complexity, and power consumption. The random bits generated by the diffusive memristor true random number generator pass all 15 NIST randomness tests without any post-processing, a first for memristive-switching true random number generators. Based on nanoparticle dynamic simulation and analytical estimates, we attribute the stochasticity in delay time to the probabilistic process by which Ag particles detach from a Ag reservoir. This work paves the way for memristors in hardware security applications for the era of the Internet of Things.

Memristors can switch between high and low electrical-resistance states, but the switching behaviour can be unpredictable. Here, the authors harness this unpredictability to develop a memristor-based true random number generator that uses the stochastic delay time of threshold switching.
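The extraction principle described in the abstract can be sketched in a few lines: a random switching delay is digitized by counting clock cycles until the device fires and keeping the least significant bit of the count. The exponential delay model and all parameter values below are illustrative assumptions, not the paper's measured device physics.

```python
import random

def memristor_delay(mean_delay_us=100.0):
    """Toy stand-in for the stochastic switching delay of a diffusive
    memristor. The paper attributes the randomness to probabilistic
    detachment of Ag particles; here it is loosely modeled as an
    exponentially distributed waiting time (an assumption for
    illustration only)."""
    return random.expovariate(1.0 / mean_delay_us)

def trng_bit(clock_period_us=0.1):
    """Count clock cycles until the device switches and keep the least
    significant bit of the count, one common way to digitize a random
    delay into a single bit."""
    cycles = int(memristor_delay() / clock_period_us)
    return cycles & 1

bits = [trng_bit() for _ in range(1000)]
```

A real implementation would then run the resulting bitstream through the NIST SP 800-22 statistical test suite, as the paper reports doing without any post-processing.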


Advanced Materials | 2018

Memristor‐Based Analog Computation and Neural Network Classification with a Dot Product Engine

Miao Hu; Catherine Graves; Can Li; Yunning Li; Ning Ge; Eric Montgomery; Noraica Davila; Hao Jiang; R. Stanley Williams; Jianhua Yang; Qiangfei Xia; John Paul Strachan

Using memristor crossbar arrays to accelerate computation is a promising approach to efficiently implementing deep neural network algorithms. Early demonstrations, however, were limited to simulations or small-scale problems, primarily because materials and device challenges restricted the size of memristor crossbar arrays that could be reliably programmed to stable analog values; addressing this limitation is the focus of the current work. High-precision analog tuning and control of memristor cells across a 128 × 64 array is demonstrated, and the resulting vector-matrix multiplication (VMM) computing precision is evaluated. Single-layer neural network inference is performed in these arrays, and the performance is compared to a digital approach. The memristor computing system used here reaches a VMM accuracy equivalent to 6 bits, and an 89.9% recognition accuracy is achieved on the 10k-image MNIST handwritten-digit test set. Forecasts show that with integrated (on-chip) and scaled memristors, a computational efficiency greater than 100 trillion operations per second per watt is possible.
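The dot-product-engine idea the abstract describes reduces to basic circuit laws: Ohm's law per cell and Kirchhoff's current law per column compute a vector-matrix product in one analog read step. A minimal numerical sketch, with the array dimensions mirroring the 128 × 64 size reported above but with conductance values and the noise model chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical conductance matrix G (siemens), one value per crossbar cell.
G = rng.uniform(1e-6, 1e-4, size=(64, 128))
v = rng.uniform(0.0, 0.2, size=128)   # input voltages applied to the rows

# Ohm's law per cell plus Kirchhoff's current law per column gives the
# vector-matrix product in a single read: I = G @ V.
i_out = G @ v

# Programming error limits analog precision; multiplicative Gaussian
# conductance noise (an assumed 1% here) shows how the result degrades.
G_noisy = G * (1 + rng.normal(0, 0.01, size=G.shape))
i_noisy = G_noisy @ v
```

Because each output current sums many independently perturbed terms, the relative error of the noisy product is much smaller than the per-cell error, which is one reason analog VMM can reach a useful bit-equivalent precision.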


Nature Communications | 2018

Efficient and self-adaptive in-situ learning in multilayer memristor neural networks

Can Li; Daniel Belkin; Yunning Li; Peng Yan; Miao Hu; Ning Ge; Hao Jiang; Eric Montgomery; Peng Lin; Zhongrui Wang; Wenhao Song; John Paul Strachan; Mark Barnell; Qing Wu; R. Stanley Williams; Jianhua Yang; Qiangfei Xia

Memristors with tunable resistance states are emerging building blocks of artificial neural networks. However, in situ learning on a large-scale multiple-layer memristor network has yet to be demonstrated because of challenges in device property engineering and circuit integration. Here we monolithically integrate hafnium oxide-based memristors with a foundry-made transistor array into a multiple-layer neural network. We experimentally demonstrate in situ learning capability and achieve competitive classification accuracy on a standard machine learning dataset, which further confirms that the training algorithm allows the network to adapt to hardware imperfections. Our simulation using the experimental parameters suggests that a larger network would further increase the classification accuracy. The memristor neural network is a promising hardware platform for artificial intelligence with high speed and energy efficiency.

Memristor-based neural networks hold promise for neuromorphic computing, yet large-scale experimental execution remains difficult. Here, Xia et al. create a multi-layer memristor neural network with in-situ machine learning and achieve competitive image classification accuracy on a standard dataset.
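The key claim, that training in situ lets the network absorb hardware imperfections, can be illustrated with a toy two-layer network in which every forward pass reads back noisy device weights, so gradient descent compensates for the write error automatically. The multiplicative write-noise model, network size, and XOR task below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def program(w, sigma=0.05):
    """Writing a weight into a memristor cell lands near, not exactly at,
    the target; model that write error as 5% multiplicative noise
    (an illustrative assumption)."""
    return w * (1 + rng.normal(0, sigma, size=w.shape))

# Tiny two-layer network learning XOR. The noisy, "programmed" weights
# are used in the forward pass, so the weight updates adapt to the
# hardware error -- the essence of in-situ learning.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)
W1 = rng.normal(0, 1, (2, 4))
W2 = rng.normal(0, 1, (4, 1))
for _ in range(3000):
    P1, P2 = program(W1), program(W2)      # read back the imperfect devices
    h = np.tanh(X @ P1)                    # hidden layer
    out = 1 / (1 + np.exp(-h @ P2))        # sigmoid output
    grad_out = out - y                     # cross-entropy gradient
    W2 -= 0.5 * h.T @ grad_out
    W1 -= 0.5 * X.T @ ((grad_out @ P2.T) * (1 - h ** 2))
```

Training against the noisy read-back rather than the ideal stored values is what makes the loop tolerant of device-to-device variation, mirroring the adaptation the paper demonstrates in hardware.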


Nature Communications | 2018

Capacitive neural network with neuro-transistors

Zhongrui Wang; Mingyi Rao; Jin-Woo Han; J. W. Zhang; Peng Lin; Yunning Li; Can Li; Wenhao Song; Shiva Asapu; Rivu Midya; Ye Zhuo; Hao Jiang; Jung Ho Yoon; Navnidhi K. Upadhyay; Saumil Joshi; Miao Hu; John Paul Strachan; Mark Barnell; Qing Wu; Huaqiang Wu; Qinru Qiu; R. Stanley Williams; Qiangfei Xia; Jianhua Yang

Experimental demonstration of resistive neural networks has been the recent focus of hardware implementation of neuromorphic computing. Capacitive neural networks, which call for novel building blocks, provide an alternative physical embodiment of neural networks featuring a lower static power and a better emulation of neural functionalities. Here, we develop neuro-transistors by integrating dynamic pseudo-memcapacitors as the gates of transistors to produce electronic analogs of the soma and axon of a neuron, with “leaky integrate-and-fire” dynamics augmented by a signal gain on the output. Paired with non-volatile pseudo-memcapacitive synapses, a Hebbian-like learning mechanism is implemented in a capacitive switching network, leading to the observed associative learning. A prototypical fully integrated capacitive neural network is built and used to classify input signals.

Though memristors can potentially emulate neuron and synapse functionality, useful signal energy is lost to Joule heating. Here, the authors demonstrate neuro-transistors with a pseudo-memcapacitive gate that actively process signals via energy-efficient capacitively-coupled neural networks.
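The "leaky integrate-and-fire with output gain" dynamics the neuro-transistor emulates can be sketched in software: charge integrates on the gate, leaks between steps, and an amplified spike is emitted when the membrane crosses threshold. The leak factor, threshold, and gain below are illustrative values, not measured device parameters.

```python
def lif(inputs, leak=0.9, threshold=1.0, gain=2.0):
    """Minimal leaky integrate-and-fire neuron: integrate weighted input
    each step, leak charge between steps, and emit an amplified spike
    when the membrane potential crosses threshold."""
    v, spikes = 0.0, []
    for x in inputs:
        v = leak * v + x          # leaky integration (capacitor discharge)
        if v >= threshold:
            spikes.append(gain)   # output gain from the transistor stage
            v = 0.0               # reset after firing
        else:
            spikes.append(0.0)
    return spikes

out = lif([0.3, 0.4, 0.5, 0.1, 0.9, 0.2])
# out == [0.0, 0.0, 2.0, 0.0, 0.0, 2.0]: spikes on the third and sixth
# steps, where accumulated charge first exceeds the threshold
```

The `gain` term is the point of pairing a memcapacitor with a transistor: a passive capacitive network alone only attenuates signals, while the transistor stage restores signal energy at each neuron.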


Nature Electronics | 2018

Analogue signal and image processing with large memristor crossbars

Can Li; Miao Hu; Yunning Li; Hao Jiang; Ning Ge; Eric Montgomery; J. W. Zhang; Wenhao Song; Noraica Davila; Catherine Graves; Zhiyong Li; John Paul Strachan; Peng Lin; Zhongrui Wang; Mark Barnell; Qing Wu; R. Stanley Williams; Jianhua Yang; Qiangfei Xia


Nature Electronics | 2018

Fully memristive neural networks for pattern classification with unsupervised learning

Zhongrui Wang; Saumil Joshi; Sergey Savel’ev; Wenhao Song; Rivu Midya; Yunning Li; Mingyi Rao; Peng Yan; Shiva Asapu; Ye Zhuo; Hao Jiang; Peng Lin; Can Li; Jung Ho Yoon; Navnidhi K. Upadhyay; J. W. Zhang; Miao Hu; John Paul Strachan; Mark Barnell; Qing Wu; Huaqiang Wu; R. Stanley Williams; Qiangfei Xia; Jianhua Yang


Nature Electronics | 2018

A provable key destruction scheme based on memristive crossbar arrays

Hao Jiang; Can Li; Rui Zhang; Peng Yan; Peng Lin; Yunning Li; Jianhua Yang; Daniel E. Holcomb; Qiangfei Xia


International Symposium on Circuits and Systems | 2018

Large Memristor Crossbars for Analog Computing

Can Li; Yunning Li; Hao Jiang; Wenhao Song; Peng Lin; Zhongrui Wang; Jianhua Yang; Qiangfei Xia; Miao Hu; Eric Montgomery; J. W. Zhang; Noraica Davila; Catherine Graves; Zhiyong Li; John Paul Strachan; R. Stanley Williams; Ning Ge; Mark Barnell; Qing Wu


International Symposium on Circuits and Systems | 2018

Unconventional computing with diffusive memristors

Zhongrui Wang; Rivu Midya; Saumil Joshi; Hao Jiang; Can Li; Peng Lin; Wenhao Song; Mingyi Rao; Yunning Li; Mark Barnell; Qing Wu; Qiangfei Xia; Jianhua Yang


International Memory Workshop | 2018

In-Memory Computing with Memristor Arrays

Can Li; Daniel Belkin; Yunning Li; Peng Yan; Miao Hu; Ning Ge; Hao Jiang; Eric Montgomery; Peng Lin; Zhongrui Wang; John Paul Strachan; Mark Barnell; Qing Wu; R. Stanley Williams; Jianhua Yang; Qiangfei Xia

Collaboration


Dive into Yunning Li's collaborations.

Top Co-Authors

Can Li | University of Massachusetts Amherst

Hao Jiang | University of Massachusetts Amherst

Jianhua Yang | University of Massachusetts Amherst

Qiangfei Xia | University of Massachusetts Amherst

Mark Barnell | Air Force Research Laboratory

Peng Lin | University of Massachusetts Amherst

Qing Wu | Air Force Research Laboratory

Zhongrui Wang | University of Massachusetts Amherst