
Publication


Featured research published by Hiroshi Noji.


International Joint Conference on Natural Language Processing | 2015

Optimal Shift-Reduce Constituent Parsing with Structured Perceptron

Le Quang Thang; Hiroshi Noji; Yusuke Miyao

We present a constituent shift-reduce parser with a structured perceptron that finds the optimal parse in practical runtime. The key ideas are new feature templates that facilitate state merging in dynamic programming and A* search. Our system achieves 91.1 F1 on a standard English experiment, a level that other beam-based systems cannot reach even with large beam sizes.
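The transition system underlying this line of work can be sketched in a few lines. This is a minimal illustrative shift-reduce parser over a fixed oracle action sequence, not the authors' perceptron-based system; the action names and the binary `REDUCE` convention are assumptions for illustration.

```python
# Minimal sketch of a shift-reduce transition system for constituent
# parsing: words move from the buffer onto the stack (SHIFT), and the
# top two stack items are combined into a labeled constituent (REDUCE).

def parse(words, actions):
    """Apply a sequence of shift/reduce actions and return a bracketed tree."""
    stack, buffer = [], list(words)
    for act in actions:
        if act == "SHIFT":
            stack.append(buffer.pop(0))            # move next word to the stack
        else:                                      # act is ("REDUCE", label)
            _, label = act
            right = stack.pop()
            left = stack.pop()
            stack.append(f"({label} {left} {right})")  # binary reduction
    assert len(stack) == 1 and not buffer, "actions must yield one full tree"
    return stack[0]

tree = parse(
    ["John", "loves", "Mary"],
    ["SHIFT", "SHIFT", "SHIFT", ("REDUCE", "VP"), ("REDUCE", "S")],
)
print(tree)  # (S John (VP loves Mary))
```

A statistical parser scores these actions (here, with a structured perceptron) and searches over action sequences; the paper's contribution is feature templates that let dynamic programming merge equivalent search states so that exact search stays practical.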


Empirical Methods in Natural Language Processing | 2016

Using Left-corner Parsing to Encode Universal Structural Constraints in Grammar Induction

Hiroshi Noji; Yusuke Miyao; Mark Johnson

Center-embedding is difficult to process and is known to be a rare syntactic construction across languages. In this paper we describe a method to incorporate this assumption into grammar induction by restricting the search space of a model to trees with limited center-embedding. The key idea is the tabulation of left-corner parsing, which captures the degree of center-embedding of a parse via its stack depth. We apply the technique to learning of a well-known generative model, the dependency model with valence (Klein and Manning, 2004). Cross-linguistic experiments on Universal Dependencies show that our method often boosts performance over the baseline and competes with the current state-of-the-art model in a number of languages.
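The link between left-corner stack depth and center-embedding can be illustrated with a toy measure. The sketch below counts, along each root-to-leaf path of a binary tree, how many times the path turns left after having gone right; this is a crude proxy for the left-corner parser's stack growth, not the paper's exact tabulation. It stays at 0 on left-branching chains, is bounded on right-branching chains, and grows only with nested center-embedding.

```python
# Toy proxy for left-corner stack depth: count right-then-left branching
# alternations on root-to-leaf paths of a binary tree (nested 2-tuples,
# string leaves). Illustrative only; not the paper's exact formulation.

def center_embedding_depth(tree):
    """Max over leaves of the number of times the path descends left
    out of a node that was itself reached by a right branch."""
    def walk(node, via_right, count):
        if isinstance(node, str):
            return count
        left, right = node
        return max(
            walk(left, False, count + (1 if via_right else 0)),
            walk(right, True, count),
        )
    return walk(tree, False, 0)

left_chain = ((("a", "b"), "c"), "d")            # purely left-branching
once = ("a", ("x", "b"))                          # one center-embedding
twice = ("a", (("b", ("x", "c")), "d"))           # "x" doubly embedded
print(center_embedding_depth(left_chain))  # 0
print(center_embedding_depth(once))        # 1
print(center_embedding_depth(twice))       # 2
```

Restricting induction to trees where this depth is small prunes the search space toward the structures that are actually attested across languages.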


Meeting of the Association for Computational Linguistics | 2016

Jigg: A Framework for an Easy Natural Language Processing Pipeline

Hiroshi Noji; Yusuke Miyao

We present Jigg, a Scala (or JVM-based) NLP annotation pipeline framework that is easy to use and extensible. Jigg provides a simple interface similar to Stanford CoreNLP, the most successful NLP pipeline toolkit, but has more flexibility to adapt to new types of annotation. On this framework, system developers can easily integrate their downstream system into an NLP pipeline from raw text simply by preparing a wrapper for it.
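The annotator-pipeline pattern that toolkits like Jigg and CoreNLP follow can be sketched generically. Everything below is a hypothetical illustration in Python, not Jigg's actual Scala API: each annotator reads what earlier stages produced on a shared annotation object and adds its own layer, so wrapping a new tool just means writing one more such function.

```python
# Generic annotator-pipeline sketch (hypothetical API, not Jigg's):
# annotators share one annotation object and each adds a new layer.

class Annotation(dict):
    """Shared blackboard that annotators read from and write to."""

def tokenize(ann):
    ann["tokens"] = ann["text"].split()            # naive whitespace tokenizer

def toy_tagger(ann):
    # Stand-in for a real tagger: mark capitalized tokens as proper nouns.
    ann["tags"] = ["NNP" if t[0].isupper() else "NN" for t in ann["tokens"]]

def run_pipeline(text, annotators):
    ann = Annotation(text=text)
    for annotate in annotators:                    # each stage extends the annotation
        annotate(ann)
    return ann

ann = run_pipeline("Jigg wraps existing tools", [tokenize, toy_tagger])
print(ann["tags"])  # ['NNP', 'NN', 'NN', 'NN']
```

The design choice is that annotators only agree on the shared annotation schema, never on each other's internals, which is what makes the pipeline easy to extend with new annotation types.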


Meeting of the Association for Computational Linguistics | 2017

A* CCG Parsing with a Supertag and Dependency Factored Model

Masashi Yoshikawa; Hiroshi Noji; Yuji Matsumoto

We propose a new A* CCG parsing model in which the probability of a tree is decomposed into factors of CCG categories and their syntactic dependencies, both defined on bi-directional LSTMs. Our factored model allows the precomputation of all probabilities and runs very efficiently, while modeling sentence structures explicitly via dependencies. Our model achieves state-of-the-art results on English and Japanese CCG parsing.
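The reason factorization enables fast A* search can be shown with a tiny example. Below, all supertag log-probabilities (made-up numbers, chosen only for illustration) are computed up front; an admissible outside heuristic for a partial parse is then just the sum of each uncovered word's best supertag score, since no completion can beat those per-word maxima.

```python
import math

# Hypothetical per-word supertag log-probabilities for a 3-word sentence
# (made-up scores; in the paper these come from a bi-directional LSTM
# and are precomputed once before search begins).
supertag_logp = [
    {"NP": math.log(0.7), "N": math.log(0.3)},
    {"(S\\NP)/NP": math.log(0.6), "S\\NP": math.log(0.4)},
    {"NP": math.log(0.9), "N": math.log(0.1)},
]

def outside_heuristic(covered):
    """Upper bound on the best completion of an item covering `covered`:
    each remaining word contributes its single best supertag score."""
    return sum(max(scores.values())
               for i, scores in enumerate(supertag_logp) if i not in covered)

# Any concrete full assignment scores at most the empty-item heuristic,
# which is what makes the heuristic admissible for A*.
assignment = (supertag_logp[0]["N"]
              + supertag_logp[1]["S\\NP"]
              + supertag_logp[2]["NP"])
print(assignment <= outside_heuristic(set()))  # True
```

Because the heuristic never underestimates the best completion, A* can pop items in priority order and still guarantee that the first full parse found is the optimal one.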


Proceedings of the CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies, August 3-4, 2017, Vancouver, Canada, ISBN 978-1-945626-70-8, pp. 71-79 | 2017

Adversarial Training for Cross-Domain Universal Dependency Parsing

Motoki Sato; Hitoshi Manabe; Hiroshi Noji; Yuji Matsumoto

We describe our submission to the CoNLL 2017 shared task, which exploits the common knowledge of a language shared across different domains via a domain adaptation technique. Our approach is an extension of the recently proposed adversarial training technique for domain adaptation, which we apply on top of a graph-based neural dependency parsing model with bidirectional LSTMs. In our experiments, we find that our baseline graph-based parser already outperforms the official baseline model (UDPipe) by a large margin. Further, by applying our technique to treebanks of the same language in different domains, we observe an additional gain in performance, in particular for domains with less training data.
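The core mechanism of this style of adversarial domain adaptation is gradient reversal, which can be shown with a toy scalar example (all numbers made up; this is not the paper's parser). The domain classifier's gradient is negated before it reaches the shared encoder, so the encoder's update pushes its features toward confusing the domain classifier rather than helping it.

```python
# Toy sketch of the gradient-reversal trick behind adversarial domain
# adaptation: identity in the forward pass, negated gradient backward.

def grad_reversal_backward(grad, lam=1.0):
    """Multiply the incoming gradient by -lam on the backward pass."""
    return -lam * grad

# Shared encoder feature f = w * x; domain classifier loss L = (f - d)^2.
w, x, d, lr = 0.5, 2.0, 2.0, 0.1
f = w * x
dL_df = 2 * (f - d)                           # classifier gradient w.r.t. f
dL_dw = grad_reversal_backward(dL_df) * x     # reversed before the encoder
w_new = w - lr * dL_dw                        # this update RAISES the loss
print(round(w_new, 2))  # 0.1

# The domain classifier now does worse on the shared feature, as intended:
print((w_new * x - d) ** 2 > (w * x - d) ** 2)  # True
```

In the parser, the same flipped gradient flows from a treebank-identity classifier into the shared bidirectional-LSTM encoder, encouraging representations that transfer across domains of the same language.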


International Conference on Computational Linguistics | 2014

Left-corner Transitions on Dependency Parsing

Hiroshi Noji; Yusuke Miyao


Empirical Methods in Natural Language Processing | 2013

Improvements to the Bayesian Topic N-Gram Models

Hiroshi Noji; Daichi Mochihashi; Yusuke Miyao


Conference of the European Chapter of the Association for Computational Linguistics | 2017

Multilingual Back-and-Forth Conversion between Content and Function Head for Easy Dependency Parsing

Ryosuke Kohita; Hiroshi Noji; Yuji Matsumoto


arXiv: Computation and Language | 2016

Left-corner Methods for Syntactic Modeling with Universal Structural Constraints

Hiroshi Noji


Journal of Information Processing | 2015

Left-corner Parsing for Dependency Grammar

Hiroshi Noji; Yusuke Miyao

Collaboration


Dive into Hiroshi Noji's collaborations.

Top Co-Authors

Yusuke Miyao, National Institute of Informatics
Yuji Matsumoto, Nara Institute of Science and Technology
Daichi Mochihashi, Nippon Telegraph and Telephone
Frances Yung, Nara Institute of Science and Technology
Quy T. Nguyen, Graduate University for Advanced Studies