Zhiguo Wang
Chinese Academy of Sciences
Publication
Featured research published by Zhiguo Wang.
international joint conference on artificial intelligence | 2017
Zhiguo Wang; Wael Hamza; Radu Florian
Natural language sentence matching is a fundamental technology for a variety of tasks. Previous approaches either match sentences from a single direction or only apply single granular (word-by-word or sentence-by-sentence) matching. In this work, we propose a bilateral multi-perspective matching (BiMPM) model under the matching-aggregation framework. Given two sentences $P$ and $Q$, our model first encodes them with a BiLSTM encoder. Next, we match the two encoded sentences in two directions, $P \rightarrow Q$ and $P \leftarrow Q$. In each matching direction, each time step of one sentence is matched against all time steps of the other sentence from multiple perspectives. Then, another BiLSTM layer is utilized to aggregate the matching results into a fixed-length matching vector. Finally, based on the matching vector, the decision is made through a fully connected layer. We evaluate our model on three tasks: paraphrase identification, natural language inference and answer sentence selection. Experimental results on standard benchmark datasets show that our model achieves the state-of-the-art performance on all tasks.
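As a rough illustration of the architecture this abstract describes, here is a minimal PyTorch sketch of bilateral multi-perspective matching. It is not the authors' implementation: it keeps only the simplest ("full") matching strategy, matches each time step against the other sentence's final encoder state, and every name and size below (hidden_size, num_perspectives, and so on) is an illustrative assumption.

```python
# Hypothetical sketch only: a simplified "full matching" BiMPM in PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiMPM(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_size=100,
                 num_perspectives=20, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Context layer: one BiLSTM encoder shared by both sentences.
        self.encoder = nn.LSTM(embed_dim, hidden_size,
                               batch_first=True, bidirectional=True)
        # One trainable reweighting vector per matching perspective.
        self.mp_weight = nn.Parameter(
            torch.randn(num_perspectives, 2 * hidden_size))
        # Aggregation layer: a second BiLSTM over the per-time-step
        # matching vectors; its final states give a fixed-length summary.
        self.aggregator = nn.LSTM(num_perspectives, hidden_size,
                                  batch_first=True, bidirectional=True)
        # Prediction layer over the two concatenated direction summaries.
        self.classifier = nn.Linear(4 * hidden_size, num_classes)

    def match(self, h, other_last):
        # Compare every time step of h (batch, len, 2h) with the other
        # sentence's last encoder state (batch, 2h), once per perspective,
        # via cosine similarity under a perspective-specific reweighting.
        a = h.unsqueeze(2) * self.mp_weight                      # (b, len, l, 2h)
        b = (other_last.unsqueeze(1) * self.mp_weight).unsqueeze(1)
        return F.cosine_similarity(a, b, dim=-1)                 # (b, len, l)

    def forward(self, p_ids, q_ids):
        hp, _ = self.encoder(self.embed(p_ids))                  # (b, len_p, 2h)
        hq, _ = self.encoder(self.embed(q_ids))                  # (b, len_q, 2h)
        # Match in both directions: P -> Q and P <- Q.
        m_pq = self.match(hp, hq[:, -1])
        m_qp = self.match(hq, hp[:, -1])
        # Aggregate each direction into a fixed-length matching vector.
        _, (sp, _) = self.aggregator(m_pq)                       # sp: (2, b, h)
        _, (sq, _) = self.aggregator(m_qp)
        v = torch.cat([sp.transpose(0, 1).flatten(1),
                       sq.transpose(0, 1).flatten(1)], dim=-1)   # (b, 4h)
        return self.classifier(v)

model = BiMPM(vocab_size=10000)
p = torch.randint(0, 10000, (4, 12))   # batch of token ids for sentence P
q = torch.randint(0, 10000, (4, 9))    # batch of token ids for sentence Q
logits = model(p, q)                   # (4, num_classes)
```

The published model is richer: it matches forward and backward encoder states separately and combines four matching strategies (full, max-pooling, attentive, max-attentive); the sketch collapses these for brevity.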
empirical methods in natural language processing | 2016
Haitao Mi; Baskaran Sankaran; Zhiguo Wang; Abe Ittycheriah
In this paper, we enhance attention-based neural machine translation (NMT) by adding explicit coverage embedding models to alleviate the issues of repeating and dropping translations. For each source word, our model starts with a full coverage embedding vector to track its coverage status, and keeps updating it with neural networks as the translation proceeds. Experiments on the large-scale Chinese-to-English task show that our enhanced model significantly improves translation quality on various test sets over a strong large-vocabulary NMT system.
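The coverage mechanism in this abstract can also be sketched briefly: each source position keeps a coverage embedding that starts "full" (drawn from a per-word table) and is updated at every decoder step. The GRU-cell update below and its inputs, the attention weight and the decoder state, are assumptions for illustration; the paper itself describes both a neural-network update and a simpler subtraction-based one.

```python
# Hypothetical sketch only: coverage embeddings updated by a GRU cell.
import torch
import torch.nn as nn

class CoverageEmbedding(nn.Module):
    def __init__(self, vocab_size, cov_dim=100, decoder_dim=256):
        super().__init__()
        # Each source word type starts from its own "full coverage" vector.
        self.init_table = nn.Embedding(vocab_size, cov_dim)
        # Neural update driven by the attention weight and decoder state
        # (assumed inputs; the paper also has a subtraction-based update).
        self.update = nn.GRUCell(1 + decoder_dim, cov_dim)

    def init_coverage(self, src_ids):
        # src_ids: (batch, src_len) -> full coverage (batch, src_len, cov_dim)
        return self.init_table(src_ids)

    def step(self, coverage, attn, dec_state):
        # coverage: (batch, src_len, cov_dim), attn: (batch, src_len),
        # dec_state: (batch, decoder_dim). One update per decoder time step.
        b, n, d = coverage.shape
        inp = torch.cat([attn.unsqueeze(-1),
                         dec_state.unsqueeze(1).expand(b, n, -1)], dim=-1)
        # Apply the same GRU cell to every source position in parallel.
        new_cov = self.update(inp.reshape(b * n, -1),
                              coverage.reshape(b * n, d))
        return new_cov.view(b, n, d)

cov = CoverageEmbedding(vocab_size=30000)
src = torch.randint(0, 30000, (2, 7))            # source token ids
c = cov.init_coverage(src)                       # (2, 7, 100)
attn = torch.softmax(torch.randn(2, 7), dim=-1)  # one step's attention
c = cov.step(c, attn, torch.randn(2, 256))       # coverage after the step
```

In a complete NMT decoder, the attention layer would read these coverage embeddings as an additional input, so source words whose coverage is exhausted attract less attention, which is what reduces repeated and dropped translations.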
meeting of the association for computational linguistics | 2016
Haitao Mi; Zhiguo Wang; Abe Ittycheriah
meeting of the association for computational linguistics | 2014
Zhiguo Wang; Nianwen Xue
empirical methods in natural language processing | 2016
Haitao Mi; Zhiguo Wang; Abe Ittycheriah
conference on computational natural language learning | 2016
Zhiguo Wang; Haitao Mi; Abraham Ittycheriah
international joint conference on natural language processing | 2015
Zhiguo Wang; Haitao Mi; Nianwen Xue
empirical methods in natural language processing | 2016
Linfeng Song; Yue Zhang; Xiaochang Peng; Zhiguo Wang; Daniel Gildea
meeting of the association for computational linguistics | 2017
Linfeng Song; Xiaochang Peng; Yue Zhang; Zhiguo Wang; Daniel Gildea
joint conference on lexical and computational semantics | 2016
Linfeng Song; Zhiguo Wang; Haitao Mi; Daniel Gildea