
Publication


Featured research published by Xuezhe Ma.


Meeting of the Association for Computational Linguistics | 2016

End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF

Xuezhe Ma; Eduard H. Hovy

State-of-the-art sequence labeling systems traditionally require large amounts of task-specific knowledge in the form of hand-crafted features and data pre-processing. In this paper, we introduce a novel neural network architecture that automatically benefits from both word- and character-level representations, by using a combination of bidirectional LSTM, CNN and CRF. Our system is truly end-to-end, requiring no feature engineering or data pre-processing, which makes it applicable to a wide range of sequence labeling tasks. We evaluate our system on two datasets for two sequence labeling tasks: the Penn Treebank WSJ corpus for part-of-speech (POS) tagging and the CoNLL 2003 corpus for named entity recognition (NER). We obtain state-of-the-art performance on both datasets: 97.55% accuracy for POS tagging and 91.21% F1 for NER.
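As a rough illustration of the architecture this abstract describes (a character-level CNN whose per-word features join word embeddings, a bidirectional LSTM, and a CRF output layer), here is a minimal PyTorch sketch. All dimensions, names, and the choice to expose only CRF emission scores are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class BiLstmCnnCrfTagger(nn.Module):
    """Sketch of the BiLSTM-CNN-CRF layering; hyperparameters are guesses."""

    def __init__(self, word_vocab, char_vocab, n_tags,
                 word_dim=100, char_dim=30, char_filters=30, hidden=200):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, word_dim)
        self.char_emb = nn.Embedding(char_vocab, char_dim)
        # Character-level CNN: one feature vector per word via max-over-time pooling.
        self.char_cnn = nn.Conv1d(char_dim, char_filters, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(word_dim + char_filters, hidden,
                            bidirectional=True, batch_first=True)
        self.proj = nn.Linear(2 * hidden, n_tags)  # emission scores for a CRF

    def emissions(self, words, chars):
        # words: (batch, seq); chars: (batch, seq, max_word_len)
        b, s, w = chars.shape
        c = self.char_emb(chars).view(b * s, w, -1).transpose(1, 2)
        c = torch.relu(self.char_cnn(c)).max(dim=2).values.view(b, s, -1)
        x = torch.cat([self.word_emb(words), c], dim=-1)
        h, _ = self.lstm(x)
        return self.proj(h)
```

A linear-chain CRF on top of these emission scores would supply the training loss (via the forward algorithm) and Viterbi decoding, which is the part that makes the labeling decisions jointly rather than token by token.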


Meeting of the Association for Computational Linguistics | 2016

Harnessing Deep Neural Networks with Logic Rules

Zhiting Hu; Xuezhe Ma; Zhengzhong Liu; Eduard H. Hovy; Eric P. Xing

Combining deep neural networks with structured logic rules is desirable to harness flexibility and reduce uninterpretability of the neural models. We propose a general framework capable of enhancing various types of neural networks (e.g., CNNs and RNNs) with declarative first-order logic rules. Specifically, we develop an iterative distillation method that transfers the structured information of logic rules into the weights of neural networks. We deploy the framework on a CNN for sentiment analysis, and an RNN for named entity recognition. With a few highly intuitive rules, we obtain substantial improvements and achieve state-of-the-art or comparable results to previous best-performing systems.
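The iterative distillation the abstract mentions can be sketched compactly: build a "teacher" distribution by reweighting the student network's predictions toward rule-consistent outputs, then train the student against both the gold labels and the teacher. The exponentiated-rule-score projection and all parameter names below are my assumptions, loosely in the posterior-regularization style the abstract implies.

```python
import torch
import torch.nn.functional as F

def distill_step(logits, gold, rule_score, pi=0.5, c=1.0):
    """One hedged iterative-distillation step.

    logits:     (batch, n_classes) student outputs
    gold:       (batch,) gold label indices
    rule_score: (batch, n_classes), higher = more consistent with the logic
                rules (hypothetical; how rules are grounded is task-specific)
    """
    p = F.softmax(logits, dim=-1)
    # Teacher: reweight the student by exp(c * rule_score), then renormalize.
    q = p * torch.exp(c * rule_score)
    q = (q / q.sum(dim=-1, keepdim=True)).detach()
    # Student imitates the teacher while still fitting the gold labels.
    loss_gold = F.cross_entropy(logits, gold)
    loss_teacher = F.kl_div(F.log_softmax(logits, dim=-1), q,
                            reduction="batchmean")
    return (1 - pi) * loss_gold + pi * loss_teacher
```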


Meeting of the Association for Computational Linguistics | 2014

Unsupervised Dependency Parsing with Transferring Distribution via Parallel Guidance and Entropy Regularization

Xuezhe Ma; Fei Xia

We present a novel approach for inducing unsupervised dependency parsers for languages that have no labeled training data but do have translated text in a resource-rich language. We train probabilistic parsing models for resource-poor languages by transferring cross-lingual knowledge from a resource-rich language with entropy regularization. Our method can be used as a purely monolingual dependency parser, requiring no human translations for the test data, which makes it applicable to a wide range of resource-poor languages. We perform experiments on three datasets: versions 1.0 and 2.0 of the Google Universal Dependency Treebanks, and treebanks from the CoNLL shared tasks, across ten languages. We obtain state-of-the-art performance on all three datasets when compared with previously studied unsupervised and projected parsing systems.
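The combination of parallel guidance and entropy regularization suggests a training objective of roughly the following shape (notation mine; the paper's exact formulation may differ):

$$
\max_{\theta} \; \sum_{x \in D_{\text{parallel}}} \log p_{\theta}(\hat{y}_x \mid x) \;-\; \lambda \sum_{x \in D_{\text{mono}}} H\big(p_{\theta}(\cdot \mid x)\big)
$$

where $\hat{y}_x$ is a dependency tree projected from the resource-rich side of the parallel text, $H$ is the entropy of the parser's distribution over trees for an unlabeled target-language sentence, and $\lambda$ balances the two terms. Penalizing entropy pushes the model toward confident parses on purely monolingual data.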


Meeting of the Association for Computational Linguistics | 2017

An Interpretable Knowledge Transfer Model for Knowledge Base Completion

Qizhe Xie; Xuezhe Ma; Zihang Dai; Eduard H. Hovy

Knowledge bases are important resources for a variety of natural language processing tasks but suffer from incompleteness. We propose a novel embedding model, ITransF, to perform knowledge base completion. Equipped with a sparse attention mechanism, ITransF discovers hidden concepts of relations and transfers statistical strength through the sharing of concepts. Moreover, the learned associations between relations and concepts, which are represented by sparse attention vectors, can be interpreted easily. We evaluate ITransF on two benchmark datasets, WN18 and FB15k, for knowledge base completion and obtain improvements on both the mean rank and Hits@10 metrics over all baselines that do not use additional information.
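To make the sparse attention mechanism concrete, here is a hedged NumPy sketch of a translation-style score in which per-relation attention vectors select from a shared stack of "concept" projection matrices. The shapes, the L2 norm, and reading lower scores as more plausible are assumptions; the sparsity of the attention vectors would be enforced during training, which is omitted here.

```python
import numpy as np

def itransf_score(h, t, r, alpha_h, alpha_t, D):
    """Hypothetical ITransF-style score for a triple (h, r, t).

    h, t:             entity embeddings, shape (n,)
    r:                relation embedding, shape (n,)
    alpha_h, alpha_t: (sparse) attention over m concepts, shape (m,)
    D:                stack of m shared projection matrices, shape (m, n, n)
    """
    proj_h = np.tensordot(alpha_h, D, axes=1) @ h  # attention-weighted projection
    proj_t = np.tensordot(alpha_t, D, axes=1) @ t
    return np.linalg.norm(proj_h + r - proj_t)

# Toy usage: 2 concepts over 4-dimensional embeddings.
rng = np.random.default_rng(0)
D = rng.normal(size=(2, 4, 4))
score = itransf_score(rng.normal(size=4), rng.normal(size=4), rng.normal(size=4),
                      np.array([1.0, 0.0]), np.array([0.0, 1.0]), D)
```

Because each relation's attention concentrates on a few shared matrices, inspecting the nonzero entries of the attention vectors directly reveals which latent concepts a relation uses, which is the interpretability claim above.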


North American Chapter of the Association for Computational Linguistics | 2016

Unsupervised Ranking Model for Entity Coreference Resolution

Xuezhe Ma; Zhengzhong Liu; Eduard H. Hovy

Coreference resolution is one of the first stages in deep language understanding, and its importance has been well recognized in the natural language processing community. In this paper, we propose a generative, unsupervised ranking model for entity coreference resolution by introducing resolution mode variables. Our unsupervised system achieves an F1 score of 58.44% under the CoNLL metric on the English data from the CoNLL-2012 shared task (Pradhan et al., 2012), outperforming the Stanford deterministic system (Lee et al., 2013) by 3.01%.


Workshop on Events: Definition, Detection, Coreference, and Representation | 2015

Word Sense Disambiguation via PropStore and OntoNotes for Event Mention Detection

Nicolas R. Fauceglia; Yiu-Chang Lin; Xuezhe Ma; Eduard H. Hovy

In this paper, we propose a novel approach for word sense disambiguation (WSD) of verbs that can be applied directly to the event mention detection task to classify event types. Using the PropStore, a database of relations between words, our approach disambiguates verb senses by exploiting information from verbs that appear in similar syntactic contexts. Importantly, our approach requires only a word sense dictionary as a resource, without any annotated sentences or relations between different senses (as in WordNet). The approach can be extended to disambiguate word senses for parts of speech other than verbs.
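As a very loose sketch of the voting intuition, suppose verbs retrieved from a PropStore-like database as syntactic neighbors vote over the target verb's dictionary senses. Everything here, including the gloss-overlap heuristic, is a hypothetical stand-in rather than the paper's actual scoring function.

```python
from collections import Counter

def rank_senses(target_verb, similar_verbs, sense_dict):
    """Rank the target verb's dictionary senses by votes from
    syntactically similar verbs (the retrieval step is abstracted away).

    sense_dict maps a verb to a list of gloss strings.
    """
    scores = Counter()
    for cand_gloss in sense_dict.get(target_verb, []):
        cand_words = set(cand_gloss.lower().split())
        for verb in similar_verbs:
            for neighbor_gloss in sense_dict.get(verb, []):
                # A candidate sense gains votes for shared gloss vocabulary.
                overlap = cand_words & set(neighbor_gloss.lower().split())
                scores[cand_gloss] += len(overlap)
    return [gloss for gloss, _ in scores.most_common()]
```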


Empirical Methods in Natural Language Processing | 2015

Efficient Inner-to-outer Greedy Algorithm for Higher-order Labeled Dependency Parsing

Xuezhe Ma; Eduard H. Hovy

Many NLP systems use dependency parsers as critical components. Joint learning parsers usually achieve better parsing accuracy than two-stage methods. However, classical joint parsing algorithms significantly increase computational complexity, which makes joint learning impractical. In this paper, we propose an efficient dependency parsing algorithm that is capable of capturing multiple edge-label features while maintaining low computational complexity. We evaluate our parser on 14 different languages. Our parser consistently obtains more accurate results than three baseline systems and three popular, off-the-shelf parsers.


International Conference on Computational Linguistics | 2012

Fourth-Order Dependency Parsing

Xuezhe Ma; Hai Zhao


International Conference on Learning Representations | 2017

Dropout with Expectation-linear Regularization

Xuezhe Ma; Yingkai Gao; Zhiting Hu; Yaoliang Yu; Yuntian Deng; Eduard H. Hovy


International Joint Conference on Natural Language Processing | 2017

Neural Probabilistic Model for Non-projective MST Parsing

Xuezhe Ma; Eduard H. Hovy

Collaboration


Dive into Xuezhe Ma's collaborations.

Top Co-Authors:

- Eduard H. Hovy, Carnegie Mellon University
- Zhengzhong Liu, Carnegie Mellon University
- Zhiting Hu, Carnegie Mellon University
- Eric P. Xing, Carnegie Mellon University
- Fei Xia, University of Washington
- Graham Neubig, Carnegie Mellon University
- Jingzhou Liu, Carnegie Mellon University
- Hai Zhao, Shanghai Jiao Tong University