Publications

Featured research published by Kenneth Heafield.


The Prague Bulletin of Mathematical Linguistics | 2010

Combining Machine Translation Output with Open Source: The Carnegie Mellon Multi-Engine Machine Translation Scheme

Kenneth Heafield; Alon Lavie

The Carnegie Mellon multi-engine machine translation software merges output from several machine translation systems into a single improved translation. This improvement is significant: in the recent NIST MT09 evaluation, the combined Arabic-English output scored 5.22 BLEU points higher than the best individual system. Concurrent with this paper, we release the source code behind this result, consisting of a recombining beam search decoder, the combination search space and features, and several accessories. Here we describe how the released software works and how to use it.


Workshop on Statistical Machine Translation | 2009

Machine Translation System Combination with Flexible Word Ordering

Kenneth Heafield; Greg Hanneman; Alon Lavie

We describe a synthetic method for combining machine translations produced by different systems given the same input. One-best outputs are explicitly aligned to remove duplicate words. Hypotheses follow system outputs in sentence order, switching between systems mid-sentence to produce a combined output. Experiments with the WMT 2009 tuning data showed an improvement of 2 BLEU points and 1 METEOR point over the best Hungarian-English system. Constrained to data provided by the contest, our system was submitted to the WMT 2009 shared system combination task.


Workshop on Statistical Machine Translation | 2014

Edinburgh’s Phrase-based Machine Translation Systems for WMT-14

Nadir Durrani; Barry Haddow; Philipp Koehn; Kenneth Heafield

This paper describes the University of Edinburgh’s (UEDIN) phrase-based submissions to the translation and medical translation shared tasks of the 2014 Workshop on Statistical Machine Translation (WMT). We participated in all language pairs. We have improved upon our 2013 system by i) using generalized representations, specifically automatic word clusters for translations out of English, ii) using unsupervised character-based models to translate unknown words in the Russian-English and Hindi-English pairs, iii) synthesizing Hindi data from closely-related Urdu data, and iv) building huge language models on the Common Crawl corpus.


Proceedings of the Second Conference on Machine Translation | 2017

The University of Edinburgh's Neural MT Systems for WMT17

Rico Sennrich; Alexandra Birch; Anna Currey; Ulrich Germann; Barry Haddow; Kenneth Heafield; Antonio Valerio Miceli Barone; Philip Williams

This paper describes the University of Edinburgh's submissions to the WMT17 shared news translation and biomedical translation tasks. We participated in 12 translation directions for news, translating between English and Czech, German, Latvian, Russian, Turkish and Chinese. For the biomedical task we submitted systems for English to Czech, German, Polish and Romanian. Our systems are neural machine translation systems trained with Nematus, an attentional encoder-decoder. We follow our setup from last year and build BPE-based models with parallel and back-translated monolingual training data. Novelties this year include the use of deep architectures, layer normalization, and more compact models due to weight tying and improvements in BPE segmentations. We perform extensive ablative experiments, reporting on the effectiveness of layer normalization, deep architectures, and different ensembling techniques.


The Prague Bulletin of Mathematical Linguistics | 2010

The Machine Translation Toolpack for LoonyBin: Automated Management of Experimental Machine Translation HyperWorkflows

Jonathan H. Clark; Jonathan Weese; Byung Gyu Ahn; Andreas Zollmann; Qin Gao; Kenneth Heafield; Alon Lavie

Construction of machine translation systems has evolved into a multi-stage workflow involving many complicated dependencies. Many decoder distributions have addressed this by including monolithic training scripts, such as train-factored-model.pl for Moses and mr_runmer.pl for SAMT. However, such scripts can be tricky to modify for novel experiments and typically have limited support for the variety of job schedulers found on academic and commercial computer clusters. Further complicating these systems are hyperparameters, which often cannot be directly optimized by conventional methods, requiring users to determine the best combination of values via trial and error. The recently released open-source workflow management tool LoonyBin addresses these issues by providing: 1) a visual interface for the user to create and modify workflows; 2) a well-defined logging mechanism; 3) a script generator that compiles visual workflows into shell scripts; and 4) the concept of HyperWorkflows, which intuitively and succinctly encode small experimental variations within a larger workflow. In this paper, we describe the Machine Translation Toolpack for LoonyBin, which exposes state-of-the-art machine translation tools as drag-and-drop components within LoonyBin.
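The HyperWorkflow idea described above, small experimental variations expanding into one concrete run per combination, can be sketched roughly as follows. This is a minimal illustration of the concept only; the step names and settings are hypothetical and do not reflect LoonyBin's actual workflow format.

```python
import itertools

# A toy hyperworkflow: each step lists alternative settings to try.
# (Hypothetical names, for illustration only.)
hyper_steps = {
    "aligner": ["giza++", "berkeley"],
    "lm_order": [3, 5],
}

# Expand the hyperworkflow into concrete experimental configurations:
# the Cartesian product of all per-step alternatives.
runs = [dict(zip(hyper_steps, combo))
        for combo in itertools.product(*hyper_steps.values())]

for run in runs:
    print(run)  # each dict is one concrete workflow to schedule
```

Two aligners times two language model orders yield four concrete runs, which is the kind of succinct encoding of experimental variation the abstract describes.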


Workshop on Statistical Machine Translation | 2014

Stanford University’s Submissions to the WMT 2014 Translation Task

Julia Neidert; Sebastian Schuster; Spence Green; Kenneth Heafield; Christopher D. Manning

We describe Stanford’s participation in the French-English and English-German tracks of the 2014 Workshop on Statistical Machine Translation (WMT). Our systems used large feature sets, word classes, and an optional unconstrained language model. Among constrained systems, ours performed the best according to uncased BLEU: 36.0% for French-English and 20.9% for English-German.


Meeting of the Association for Computational Linguistics | 2016

Normalized Log-Linear Interpolation of Backoff Language Models is Efficient

Kenneth Heafield; Chase Geigle; Sean Massung; Lane Schwartz

We prove that log-linearly interpolated backoff language models can be efficiently and exactly collapsed into a single normalized backoff model, contradicting Hsu (2007). While prior work reported that log-linear interpolation yields lower perplexity than linear interpolation, normalizing at query time was impractical. We normalize the model offline in advance, which is efficient due to a recurrence relationship between the normalizing factors. To tune interpolation weights, we apply Newton’s method to this convex problem and show that the derivatives can be computed efficiently in a batch process. These findings are combined in a new open-source interpolation tool, which is distributed with KenLM. With 21 out-of-domain corpora, log-linear interpolation yields 72.58 perplexity on TED talks, compared to 75.91 for linear interpolation.
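The contrast between the two interpolation schemes can be sketched on a toy unigram vocabulary. This is an illustrative assumption only: the paper interpolates full backoff n-gram models, where making the normalizer Z cheap to compute (offline, via a recurrence over backoff levels) is the actual contribution.

```python
import math

# Toy unigram models over a three-word vocabulary (illustration only).
vocab = ["the", "cat", "sat"]
p1 = {"the": 0.5, "cat": 0.3, "sat": 0.2}
p2 = {"the": 0.4, "cat": 0.2, "sat": 0.4}
lams = [0.6, 0.4]  # interpolation weights

def linear(weights, models, w):
    # Linear interpolation: a weighted average of probabilities,
    # already normalized when the weights sum to one.
    return sum(lam * m[w] for lam, m in zip(weights, models))

def log_linear(weights, models, w):
    # Log-linear interpolation: a weighted product of probabilities,
    # which must be renormalized explicitly by the sum Z over the
    # vocabulary. Doing this at query time is what was impractical.
    def unnorm(v):
        return math.exp(sum(lam * math.log(m[v])
                            for lam, m in zip(weights, models)))
    z = sum(unnorm(v) for v in vocab)
    return unnorm(w) / z

p_lin = linear(lams, [p1, p2], "the")      # 0.6*0.5 + 0.4*0.4 = 0.46
p_log = log_linear(lams, [p1, p2], "the")
```

The weighted product rewards words on which the component models agree more sharply than the weighted average does, which is one intuition for why log-linear interpolation can achieve lower perplexity.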


Workshop on Statistical Machine Translation | 2011

KenLM: Faster and Smaller Language Model Queries

Kenneth Heafield


Meeting of the Association for Computational Linguistics | 2013

Scalable Modified Kneser-Ney Language Model Estimation

Kenneth Heafield; Ivan Pouzyrevsky; Jonathan H. Clark; Philipp Koehn


Workshop on Statistical Machine Translation | 2013

Edinburgh's Machine Translation Systems for European Language Pairs

Nadir Durrani; Barry Haddow; Kenneth Heafield; Philipp Koehn

Collaboration


Dive into Kenneth Heafield's collaborations.

Top Co-Authors

Alon Lavie, Carnegie Mellon University
Barry Haddow, University of Edinburgh
Hieu Hoang, University of Edinburgh
Marcin Junczys-Dowmunt, Adam Mickiewicz University in Poznań
Roman Grundkiewicz, Adam Mickiewicz University in Poznań
Jonathan H. Clark, Carnegie Mellon University