
Publications


Featured research published by Jakob Uszkoreit.


Empirical Methods in Natural Language Processing | 2008

Lattice-based Minimum Error Rate Training for Statistical Machine Translation

Wolfgang Macherey; Franz Josef Och; Ignacio E. Thayer; Jakob Uszkoreit

Minimum Error Rate Training (MERT) is an effective means to estimate the feature function weights of a linear model such that an automated evaluation criterion for measuring system performance can be optimized directly in training. To accomplish this, the training procedure determines for each feature function its exact error surface on a given set of candidate translations. The feature function weights are then adjusted by traversing the error surface combined over all sentences and picking those values for which the resulting error count reaches a minimum. Typically, candidates in MERT are represented as N-best lists, which contain the N most probable translation hypotheses produced by a decoder. In this paper, we present a novel algorithm for efficiently constructing and representing the exact error surface of all translations that are encoded in a phrase lattice. Compared to N-best MERT, the number of candidate translations thus taken into account increases by several orders of magnitude. The proposed method is used to train the feature function weights of a phrase-based statistical machine translation system. Experiments conducted on the NIST 2008 translation tasks show significant runtime improvements and moderate BLEU score gains over N-best MERT.
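The error-surface idea is concrete enough to sketch. Along a search direction, each candidate translation's model score is a line in the step size gamma, the best candidate per sentence is the upper envelope of those lines, and the corpus error can only change at envelope breakpoints. The plain-Python sketch below (all names illustrative) implements the classic N-best line search that the paper's lattice algorithm generalizes; it is not the lattice construction itself, which propagates these envelopes through the phrase lattice to cover exponentially many candidates.

```python
from math import inf
from itertools import groupby

def upper_envelope(lines):
    """lines: (slope, intercept, err) per candidate translation, whose model
    score along the search direction is slope * gamma + intercept and whose
    error count is err. Returns (gamma_start, err) segments describing the
    best-scoring candidate as gamma sweeps from -inf to +inf."""
    hull = []  # entries: (slope, intercept, err, gamma where line becomes best)
    for m, b, err in sorted(lines, key=lambda l: (l[0], l[1])):
        g, dominated = -inf, False
        while hull:
            m0, b0, _, g0 = hull[-1]
            if m == m0:                  # parallel lines: higher intercept wins
                if b <= b0:
                    dominated = True
                    break
                hull.pop(); g = -inf; continue
            g = (b0 - b) / (m - m0)      # gamma where new line overtakes the top
            if g <= g0:
                hull.pop(); g = -inf     # old top is never maximal: discard it
            else:
                break
        if not dominated:
            hull.append((m, b, err, g))
    return [(g, err) for _, _, err, g in hull]

def minimize_error(per_sentence_lines):
    """Merge per-sentence envelopes and return a gamma minimizing total error."""
    events, total = [], 0
    for lines in per_sentence_lines:
        env = upper_envelope(lines)
        total += env[0][1]                       # error of the leftmost segment
        for (_, e0), (g1, e1) in zip(env, env[1:]):
            events.append((g1, e1 - e0))         # error delta at each breakpoint
    events.sort()
    best_err, best_g = total, (events[0][0] - 1.0 if events else 0.0)
    for g, group in groupby(events, key=lambda ev: ev[0]):
        total += sum(d for _, d in group)        # apply all deltas at this gamma
        if total < best_err:
            best_err, best_g = total, g + 1e-9   # step just past the breakpoint
    return best_g, best_err
```

Because the error count is piecewise constant in gamma, only the breakpoints matter; the paper's contribution is computing exactly these segments for every path in a lattice rather than for an explicit N-best list.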


Empirical Methods in Natural Language Processing | 2016

A Decomposable Attention Model for Natural Language Inference

Ankur P. Parikh; Oscar Täckström; Dipanjan Das; Jakob Uszkoreit

We propose a simple neural architecture for natural language inference. Our approach uses attention to decompose the problem into subproblems that can be solved separately, thus making it trivially parallelizable. On the Stanford Natural Language Inference (SNLI) dataset, we obtain state-of-the-art results with almost an order of magnitude fewer parameters than previous work and without relying on any word-order information. Adding intra-sentence attention that takes a minimum amount of order into account yields further improvements.
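The attend-compare-aggregate structure is compact enough to sketch directly. In the NumPy toy below (shapes and names are illustrative; the one-hidden-layer `mlp` stands in for the paper's feed-forward networks F, G, and H), every step is a matrix product or a per-token operation with no recurrence over word order, which is where the parallelizability comes from.

```python
import numpy as np

def softmax(x, axis):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def mlp(params, x):
    """One-hidden-layer ReLU net; a stand-in for the paper's F, G, H."""
    W1, W2 = params
    return np.maximum(x @ W1, 0.0) @ W2

def decomposable_attention(a, b, F, G, H):
    """a: (la, d) premise embeddings, b: (lb, d) hypothesis embeddings."""
    # Attend: soft-align each token with a subphrase of the other sentence.
    e = mlp(F, a) @ mlp(F, b).T                    # (la, lb) alignment scores
    beta  = softmax(e, axis=1) @ b                 # subphrase of b per token of a
    alpha = softmax(e, axis=0).T @ a               # subphrase of a per token of b
    # Compare: each (token, aligned subphrase) pair is processed independently.
    v1 = mlp(G, np.concatenate([a, beta], axis=1))
    v2 = mlp(G, np.concatenate([b, alpha], axis=1))
    # Aggregate: order-insensitive sum, then classify.
    return mlp(H, np.concatenate([v1.sum(axis=0), v2.sum(axis=0)]))

# Toy usage with random parameters (shapes are the only constraint).
rng = np.random.default_rng(0)
d, h, c = 8, 16, 3                                 # embed dim, hidden dim, classes
F = (rng.normal(size=(d, h)),     rng.normal(size=(h, h)))
G = (rng.normal(size=(2 * d, h)), rng.normal(size=(h, h)))
H = (rng.normal(size=(2 * h, h)), rng.normal(size=(h, c)))
logits = decomposable_attention(rng.normal(size=(5, d)),
                                rng.normal(size=(7, d)), F, G, H)  # shape (3,)
```

The sum in the aggregate step is what discards word order; the intra-sentence attention mentioned in the abstract reintroduces a limited amount of it before the attend step.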


Meeting of the Association for Computational Linguistics | 2017

Coarse-to-Fine Question Answering for Long Documents

Eunsol Choi; Daniel Hewlett; Jakob Uszkoreit; Illia Polosukhin; Alexandre Lacoste; Jonathan Berant

We present a framework for question answering that can efficiently scale to longer documents while maintaining, or even improving, the performance of state-of-the-art models. While most successful approaches for reading comprehension rely on recurrent neural networks (RNNs), running them over long documents is prohibitively slow because they are difficult to parallelize over sequences. Inspired by how people first skim the document, identify relevant parts, and carefully read these parts to produce an answer, we combine a coarse, fast model for selecting relevant sentences and a more expensive RNN for producing the answer from those sentences. We treat sentence selection as a latent variable, trained jointly from the answer alone using reinforcement learning. Experiments demonstrate state-of-the-art performance on a challenging subset of the WikiReading dataset and on a new dataset, while speeding up the model by 3.5x-6.7x.
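The skim-then-read idea reduces to a two-stage pipeline, sketched below. This is illustrative only: the plain dot-product scorer replaces the paper's learned coarse model, and `reader` stands in for the expensive RNN answer model.

```python
import numpy as np

def coarse_to_fine_answer(q_vec, sent_vecs, sentences, reader, k=3):
    """q_vec: (d,) question vector; sent_vecs: (n, d) cheap per-sentence
    vectors; reader: the expensive answer model, which now sees only k
    sentences instead of the whole document."""
    scores = sent_vecs @ q_vec                        # fast coarse relevance
    top_k = np.argsort(-scores)[:k]                   # k most relevant sentences
    selected = [sentences[i] for i in sorted(top_k)]  # preserve document order
    return reader(selected)                           # costly step on short input
```

Because the gold sentence is unobserved, the paper trains the selector as a latent variable from the final answer with reinforcement learning; in that setting the hard argsort above becomes a sample from the selector's distribution rather than a fixed rule.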


Neural Information Processing Systems | 2017

Attention Is All You Need

Ashish Vaswani; Noam Shazeer; Niki Parmar; Jakob Uszkoreit; Llion Jones; Aidan N. Gomez; Lukasz Kaiser; Illia Polosukhin


North American Chapter of the Association for Computational Linguistics | 2012

Cross-lingual Word Clusters for Direct Transfer of Linguistic Structure

Oscar Täckström; Ryan T. McDonald; Jakob Uszkoreit


International Conference on Computational Linguistics | 2010

Large Scale Parallel Document Mining for Machine Translation

Jakob Uszkoreit; Jay Ponte; Ashok C. Popat; Moshe Dubiner


Meeting of the Association for Computational Linguistics | 2008

Distributed Word Clustering for Large Scale Class-Based Language Modeling in Machine Translation

Jakob Uszkoreit; Thorsten Brants


Empirical Methods in Natural Language Processing | 2011

Inducing Sentence Structure from Parallel Corpora for Reordering

John DeNero; Jakob Uszkoreit


Empirical Methods in Natural Language Processing | 2010

Poetic Statistical Machine Translation: Rhyme and Meter

Dmitriy Genzel; Jakob Uszkoreit; Franz Josef Och


arXiv: Learning | 2017

One Model To Learn Them All

Lukasz Kaiser; Aidan N. Gomez; Noam Shazeer; Ashish Vaswani; Niki Parmar; Llion Jones; Jakob Uszkoreit

