Publication


Featured research published by Yunchen Pu.


Computer Vision and Pattern Recognition | 2017

Semantic Compositional Networks for Visual Captioning

Zhe Gan; Chuang Gan; Xiaodong He; Yunchen Pu; Kenneth Tran; Jianfeng Gao; Lawrence Carin; Li Deng

A Semantic Compositional Network (SCN) is developed for image captioning, in which semantic concepts (i.e., tags) are detected from the image, and the probability of each tag is used to compose the parameters in a long short-term memory (LSTM) network. The SCN extends each weight matrix of the LSTM to an ensemble of tag-dependent weight matrices. The degree to which each member of the ensemble is used to generate an image caption is tied to the image-dependent probability of the corresponding tag. In addition to captioning images, we also extend the SCN to generate captions for video clips. We qualitatively analyze semantic composition in SCNs, and quantitatively evaluate the algorithm on three benchmark datasets: COCO, Flickr30k, and Youtube2Text. Experimental results show that the proposed method significantly outperforms prior state-of-the-art approaches, across multiple evaluation metrics.
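The composition step described above can be sketched as a probability-weighted sum of tag-dependent weight matrices. The tag count, dimensions, and function name below are illustrative, not from the paper; a practical implementation would factorize this ensemble rather than store one full matrix per tag.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 5                  # number of semantic tags (illustrative)
d_in, d_out = 8, 16    # toy dimensions

# One weight matrix per tag; the effective LSTM weight is a
# combination of these, weighted by the image's tag probabilities.
tag_matrices = rng.standard_normal((K, d_out, d_in))

def compose_weight(tag_probs, tag_matrices):
    """Combine tag-dependent matrices, weighted by tag probabilities."""
    tag_probs = np.asarray(tag_probs)
    # Contract the tag axis: sum_k p_k * M_k  -> (d_out, d_in)
    return np.tensordot(tag_probs, tag_matrices, axes=1)

# Example: an image where the third tag is detected with high probability.
probs = np.array([0.05, 0.1, 0.7, 0.1, 0.05])
W = compose_weight(probs, tag_matrices)
```

The degree to which each ensemble member contributes to `W` is exactly the image-dependent probability of its tag, matching the composition rule in the abstract.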


Meeting of the Association for Computational Linguistics | 2017

Scalable Bayesian Learning of Recurrent Neural Networks for Language Modeling

Zhe Gan; Chunyuan Li; Changyou Chen; Yunchen Pu; Qinliang Su; Lawrence Carin

Recurrent neural networks (RNNs) have shown promising performance for language modeling. However, traditional training of RNNs using back-propagation through time often suffers from overfitting. One reason for this is that stochastic optimization (used for large training sets) does not provide good estimates of model uncertainty. This paper leverages recent advances in stochastic gradient Markov Chain Monte Carlo (also appropriate for large training sets) to learn weight uncertainty in RNNs. It yields a principled Bayesian learning algorithm, adding gradient noise during training (enhancing exploration of the model-parameter space) and model averaging when testing. Extensive experiments on various RNN models and across a broad range of applications demonstrate the superiority of the proposed approach relative to stochastic optimization.
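The "gradient noise during training" idea corresponds to SG-MCMC updates such as stochastic gradient Langevin dynamics (SGLD): a plain gradient step plus Gaussian noise scaled to the step size. A minimal sketch on a toy one-dimensional posterior, with illustrative step sizes and iteration counts:

```python
import numpy as np

rng = np.random.default_rng(0)

def sgld_step(theta, grad, step_size, rng):
    """One SGLD update: SGD step plus injected Gaussian noise,
    whose variance (2 * step_size) makes the chain sample the posterior."""
    noise = rng.normal(0.0, np.sqrt(2.0 * step_size), size=theta.shape)
    return theta - step_size * grad(theta) + noise

# Toy target: posterior N(3, 1), whose negative log-density
# has gradient (theta - 3).
grad = lambda th: th - 3.0
theta = np.zeros(1)
samples = []
for t in range(20000):
    theta = sgld_step(theta, grad, 1e-2, rng)
    if t > 2000:              # discard burn-in
        samples.append(theta.copy())

posterior_mean = float(np.mean(samples))  # should be close to 3
```

Averaging predictions over the retained `samples` at test time is the "model averaging" referred to in the abstract.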


Computer Vision and Pattern Recognition | 2016

Learning Weight Uncertainty with Stochastic Gradient MCMC for Shape Classification

Chunyuan Li; Andrew Stevens; Changyou Chen; Yunchen Pu; Zhe Gan; Lawrence Carin

Learning the representation of shape cues in 2D & 3D objects for recognition is a fundamental task in computer vision. Deep neural networks (DNNs) have shown promising performance on this task. Due to the large variability of shapes, accurate recognition relies on good estimates of model uncertainty, ignored in traditional training of DNNs, typically learned via stochastic optimization. This paper leverages recent advances in stochastic gradient Markov Chain Monte Carlo (SG-MCMC) to learn weight uncertainty in DNNs. It yields principled Bayesian interpretations for the commonly used Dropout/DropConnect techniques and incorporates them into the SG-MCMC framework. Extensive experiments on 2D & 3D shape datasets and various DNN models demonstrate the superiority of the proposed approach over stochastic optimization. Our approach yields higher recognition accuracy when used in conjunction with Dropout and Batch-Normalization.
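DropConnect, one of the techniques given a Bayesian interpretation here, zeroes out individual weights with Bernoulli masks at each step. A minimal sketch (the keep probability and shapes are illustrative, and the rescaling convention is one common choice, not necessarily the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropconnect(W, keep_prob, rng):
    """DropConnect: zero each weight independently with prob 1 - keep_prob,
    rescaling the survivors so the expected weight matrix is unchanged."""
    mask = rng.random(W.shape) < keep_prob
    return (W * mask) / keep_prob

W = rng.standard_normal((4, 3))
W_noisy = dropconnect(W, keep_prob=0.5, rng=rng)
```

Viewing each masked draw as a sample of the weights is what connects this stochastic regularizer to weight uncertainty in the SG-MCMC framework.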


Microscopy and Microanalysis | 2016

Compressive STEM-EELS

Andrew Stevens; Libor Kovarik; Hao Yang; Yunchen Pu; Lawrence Carin; Nigel D. Browning

The collection of electron energy loss spectra (EELS) via scanning transmission electron microscopy (STEM) generally requires a specimen to withstand a large radiation dose. Moreover, significant drift can occur while the spectra are collected. Recent advances in electron microscopy have shown that a data reduction of up to 90% is possible for HAADF/ABF imaging and TEM video [1, 2, 3]. These advances depend on the mathematical theory of compressive sensing (CS) [4]. For STEM [1], the method in [5] (BPFA) was used for a special case of CS called inpainting, in which the pixels of the image are missing at random. The goal of the inpainting task is to fill in the missing pixels.
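The inpainting setting described above, observing only a random subset of pixels, can be sketched as follows. The image, sampling rate, and mean-fill baseline are illustrative; BPFA itself learns a dictionary from the observed pixels to reconstruct the missing ones.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image"; in practice this would be a STEM micrograph.
image = rng.random((32, 32))

# Measure only ~10% of pixels at random (a 90% dose reduction).
keep = rng.random(image.shape) < 0.10
measured = np.where(keep, image, np.nan)

# Naive baseline: fill missing pixels with the mean of the observed ones.
fill = np.nanmean(measured)
recon = np.where(keep, measured, fill)
```

Observed pixels are kept exactly; everything else must be inferred, which is where a learned dictionary improves dramatically on the mean-fill baseline.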


Microscopy and Microanalysis | 2016

Compressive Sensing in Microscopy: a Tutorial

Andrew Stevens; Hao Yang; Libor Kovarik; Xin Yuan; Quentin M. Ramasse; Patricia Abellan; Yunchen Pu; Lawrence Carin; Nigel D. Browning

Currently many types of microscopy are limited, in terms of spatial and temporal resolution, by hardware (e.g., camera framerate, data transfer rate, data storage capacity). The obvious approach to solve the resolution problem is to develop better hardware. An alternative solution, which additionally benefits from improved hardware, is to apply compressive sensing (CS) [1]. CS approaches have been shown to reduce dose by as much as 90% in electron microscopy [2, 3, 4]. Optical imaging and microscopy have also seen substantial benefits [5, 6, 7, 8, 9, 10, 11, 12, 13].
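The CS measurement model underlying these approaches is y = Φx, with far fewer measurements than signal entries, recovered by exploiting sparsity. A toy sketch using ISTA (iterative soft-thresholding), a basic sparse solver; the dimensions, sparsity level, and regularization weight are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 100, 40, 3          # signal length, measurements, nonzeros (toy)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = Phi @ x                                      # compressed measurements, m < n

def ista(Phi, y, lam=0.01, steps=1000):
    """Minimize 0.5*||y - Phi z||^2 + lam*||z||_1 by gradient steps
    followed by soft-thresholding."""
    L = np.linalg.norm(Phi, 2) ** 2              # Lipschitz constant
    z = np.zeros(Phi.shape[1])
    for _ in range(steps):
        z = z + (Phi.T @ (y - Phi @ z)) / L      # gradient step
        z = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrink
    return z

x_hat = ista(Phi, y)
rel_err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)  # small
```

Because the signal is sparse, 40 random measurements suffice to recover a 100-entry signal, which is the hardware-side saving the tutorial advocates.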


Neural Information Processing Systems | 2016

Variational Autoencoder for Deep Learning of Images, Labels and Captions

Yunchen Pu; Zhe Gan; Ricardo Henao; Xin Yuan; Chunyuan Li; Andrew Stevens; Lawrence Carin


Neural Information Processing Systems | 2017

ALICE: Towards Understanding Adversarial Learning for Joint Distribution Matching

Chunyuan Li; Hao Liu; Changyou Chen; Yunchen Pu; Liqun Chen; Ricardo Henao; Lawrence Carin


Neural Information Processing Systems | 2017

Adversarial Symmetric Variational Autoencoder

Yunchen Pu; Weiyao Wang; Ricardo Henao; Liqun Chen; Zhe Gan; Chunyuan Li; Lawrence Carin


Neural Information Processing Systems | 2017

Triangle Generative Adversarial Networks

Zhe Gan; Liqun Chen; Weiyao Wang; Yunchen Pu; Yizhe Zhang; Hao Liu; Chunyuan Li; Lawrence Carin


Neural Information Processing Systems | 2017

VAE Learning via Stein Variational Gradient Descent

Yunchen Pu; Zhe Gan; Ricardo Henao; Chunyuan Li; Shaobo Han; Lawrence Carin

Collaboration


Yunchen Pu's top co-authors.

Top Co-Authors
Andrew Stevens

Pacific Northwest National Laboratory
