Mo Yu
IBM
Publications
Featured research published by Mo Yu.
Meeting of the Association for Computational Linguistics | 2017
Mo Yu; Wenpeng Yin; Kazi Saidul Hasan; Cícero Nogueira dos Santos; Bing Xiang; Bowen Zhou
Relation detection is a core component of many NLP applications, including Knowledge Base Question Answering (KBQA). In this paper, we propose a hierarchical recurrent neural network enhanced by residual learning that detects KB relations given an input question. Our method uses deep residual bidirectional LSTMs to compare questions and relation names at different levels of abstraction. Additionally, we propose a simple KBQA system that integrates entity linking and our relation detector so that each enhances the other. Experimental results show that our approach not only achieves outstanding relation detection performance but, more importantly, helps our KBQA system reach state-of-the-art accuracy on both single-relation (SimpleQuestions) and multi-relation (WebQSP) QA benchmarks.
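A rough sketch of the residual matching idea above, assuming PyTorch; the class name ResidualRelationDetector and all dimensions are illustrative, not from the paper. The model sums the outputs of two stacked question BiLSTMs as a residual shortcut and scores a candidate relation name by cosine similarity:

import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualRelationDetector(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Two BiLSTM layers over the question; summing their outputs is
        # the residual shortcut across two levels of abstraction.
        self.q_lstm1 = nn.LSTM(emb_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.q_lstm2 = nn.LSTM(2 * hidden_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.r_lstm = nn.LSTM(emb_dim, hidden_dim, bidirectional=True, batch_first=True)

    def forward(self, q_ids, r_ids):
        h1, _ = self.q_lstm1(self.embed(q_ids))   # (B, Tq, 2H)
        h2, _ = self.q_lstm2(h1)                  # (B, Tq, 2H)
        q = (h1 + h2).max(dim=1).values           # residual sum, max-pooled over time
        r, _ = self.r_lstm(self.embed(r_ids))
        r = r.max(dim=1).values
        return F.cosine_similarity(q, r, dim=-1)  # matching score per question-relation pair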
Empirical Methods in Natural Language Processing | 2016
Gakuto Kurata; Bing Xiang; Bowen Zhou; Mo Yu
Recurrent Neural Networks (RNNs), and one of their specific architectures, Long Short-Term Memory (LSTM), have been widely used for sequence labeling. In this paper, we first enhance LSTM-based sequence labeling to explicitly model label dependencies. We then propose another enhancement to incorporate global information spanning the whole input sequence. The latter method, the encoder-labeler LSTM, first encodes the whole input sequence into a fixed-length vector with an encoder LSTM, and then uses this encoded vector as the initial state of another LSTM that performs the labeling. Combining these methods, we can predict the label sequence while accounting for both label dependencies and information from the whole input sequence. In experiments on slot filling, an essential component of natural language understanding, using the standard ATIS corpus, we achieved a state-of-the-art F1-score of 95.66%.
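A minimal sketch of the encoder-labeler idea, assuming PyTorch; dimensions and names are illustrative, and the label-dependency enhancement is omitted for brevity. The encoder LSTM's final state initializes the labeler LSTM, so every labeling step is conditioned on a summary of the whole sentence:

import torch
import torch.nn as nn

class EncoderLabelerLSTM(nn.Module):
    def __init__(self, vocab_size, num_labels, emb_dim=100, hidden_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # The encoder LSTM summarizes the whole input sequence; its final
        # state becomes the labeler LSTM's initial state.
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.labeler = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_labels)

    def forward(self, token_ids):
        emb = self.embed(token_ids)        # (B, T, E)
        _, state = self.encoder(emb)       # final (h, c) of the encoder
        hidden, _ = self.labeler(emb, state)
        return self.out(hidden)            # per-token label logits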
International Joint Conference on Artificial Intelligence | 2018
Wenhan Xiong; Xiaoxiao Guo; Mo Yu; Shiyu Chang; Bowen Zhou; William Yang Wang
We investigate the task of learning to follow natural language instructions by jointly reasoning over visual observations and language inputs. In contrast to existing methods, which start with learning from demonstrations (LfD) and then use reinforcement learning (RL) to fine-tune the model parameters, we propose a novel policy optimization algorithm that dynamically schedules demonstration learning and RL. The proposed training paradigm provides more efficient exploration and better generalization than existing methods. Compared to existing ensemble models, the best single model trained with our method reduces execution error by over 50% on a block-world environment. To further illustrate the exploration strategy of our RL algorithm, we also include systematic studies of the evolution of policy entropy during training.
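A simplified sketch of interleaving the two kinds of updates, assuming PyTorch; the fixed mixing probability p_demo and the env interface (reset() -> state, step(action) -> (state, reward, done)) are illustrative stand-ins for the paper's dynamic scheduler, not its actual criterion:

import random
import torch
import torch.nn.functional as F

def train_step(policy, optimizer, demo_batch, env, p_demo):
    # policy maps a state tensor to action logits.
    optimizer.zero_grad()
    if random.random() < p_demo:
        # Supervised (LfD) update: imitate demonstrated actions.
        states, actions = demo_batch
        loss = F.cross_entropy(policy(states), actions)
    else:
        # RL update: sample one episode and apply REINFORCE.
        log_probs, rewards = [], []
        state, done = env.reset(), False
        while not done:
            dist = torch.distributions.Categorical(logits=policy(state))
            action = dist.sample()
            state, reward, done = env.step(action.item())
            log_probs.append(dist.log_prob(action))
            rewards.append(reward)
        # Undiscounted return, kept simple for illustration.
        loss = -sum(rewards) * torch.stack(log_probs).sum()
    loss.backward()
    optimizer.step()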
arXiv: Artificial Intelligence | 2018
Yang Yu; Kazi Saidul Hasan; Mo Yu; Wei Zhang; Zhiguo Wang
Relation detection is a core component of Knowledge Base Question Answering (KBQA). In this paper, we propose a KB relation detection model based on multi-view matching, which exploits richer information extracted from both the question and the KB. Matching within each view is performed from multiple perspectives to compare the two input texts thoroughly. All components are combined in an end-to-end trainable neural network. Experiments on SimpleQuestions and WebQSP yield state-of-the-art results.
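A toy sketch of multi-perspective matching between two text encodings, assuming PyTorch; the paper's views are defined over question and KB structure, and this only shows the generic per-perspective reweighted cosine comparison:

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiPerspectiveMatch(nn.Module):
    def __init__(self, dim, num_perspectives=10):
        super().__init__()
        # One learnable weight vector per perspective.
        self.W = nn.Parameter(torch.randn(num_perspectives, dim))

    def forward(self, v1, v2):
        # v1, v2: (B, dim) encodings of the two texts being matched.
        p1 = v1.unsqueeze(1) * self.W               # (B, P, dim)
        p2 = v2.unsqueeze(1) * self.W
        return F.cosine_similarity(p1, p2, dim=-1)  # (B, P) match scores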
International Conference on Learning Representations | 2017
Zhouhan Lin; Minwei Feng; Cícero Nogueira dos Santos; Mo Yu; Bing Xiang; Bowen Zhou; Yoshua Bengio
arXiv: Computation and Language | 2017
Wenpeng Yin; Katharina Kann; Mo Yu; Hinrich Schütze
arXiv: Computation and Language | 2017
Yang Yu; Wei Zhang; Bowen Zhou; Kazi Saidul Hasan; Mo Yu; Bing Xiang
International Conference on Computational Linguistics | 2016
Wenpeng Yin; Mo Yu; Bing Xiang; Bowen Zhou; Hinrich Schütze
Neural Information Processing Systems | 2017
Shiyu Chang; Yang Zhang; Wei Han; Mo Yu; Xiaoxiao Guo; Wei Tan; Xiaodong Cui; Michael J. Witbrock; Mark Hasegawa-Johnson; Thomas S. Huang
arXiv: Computation and Language | 2017
Shuohang Wang; Mo Yu; Xiaoxiao Guo; Zhiguo Wang; Tim Klinger; Wei Zhang; Shiyu Chang; Gerald Tesauro; Bowen Zhou; Jing Jiang