Kevin J. Lang
Carnegie Mellon University
Publications
Featured research published by Kevin J. Lang.
IEEE Transactions on Acoustics, Speech, and Signal Processing | 1989
Alex Waibel; Toshiyuki Hanazawa; Geoffrey E. Hinton; Kiyohiro Shikano; Kevin J. Lang
The authors present a time-delay neural network (TDNN) approach to phoneme recognition which is characterized by two important properties: (1) using a three-layer arrangement of simple computing units, a hierarchy can be constructed that allows for the formation of arbitrary nonlinear decision surfaces, which the TDNN learns automatically using error backpropagation; and (2) the time-delay arrangement enables the network to discover acoustic-phonetic features and the temporal relationships between them independent of position in time, so that they are not blurred by temporal shifts in the input. As a recognition task, the speaker-dependent recognition of the phonemes B, D, and G in varying phonetic contexts was chosen. For comparison, several discrete hidden Markov models (HMM) were trained to perform the same task. Performance evaluation over 1946 testing tokens from three speakers showed that the TDNN achieves a recognition rate of 98.5% correct, while the rate obtained by the best of the HMMs was only 93.7%.
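The time-delay arrangement amounts to sliding the same small weight matrix over successive windows of input frames, so a learned feature detector responds wherever its pattern occurs in time. The following NumPy sketch of such a forward pass is illustrative only; the layer sizes and frame counts loosely echo the paper's description but should not be read as the published configuration, and the random weights stand in for trained ones.

```python
import numpy as np

def tdnn_layer(x, W, b, delays):
    """One time-delay layer: the same weights W (units x (delays * n_features))
    are applied to every window of `delays` consecutive frames, so a learned
    feature detector responds wherever its pattern occurs in time."""
    T, F = x.shape
    out = []
    for t in range(T - delays + 1):
        window = x[t:t + delays].reshape(-1)   # concatenate the delayed frames
        out.append(np.tanh(W @ window + b))    # squashing nonlinearity
    return np.stack(out)                       # shape: (T - delays + 1, units)

# Illustrative sizes: 16 spectral coefficients per 10 ms frame, 15 frames per token.
rng = np.random.default_rng(0)
x = rng.standard_normal((15, 16))

h1 = tdnn_layer(x, rng.standard_normal((8, 3 * 16)) * 0.1, np.zeros(8), delays=3)
h2 = tdnn_layer(h1, rng.standard_normal((3, 5 * 8)) * 0.1, np.zeros(3), delays=5)

# Evidence for B, D, and G is integrated over time, which is what makes the
# final decision insensitive to where the phoneme sits within the token.
scores = h2.sum(axis=0)
print("predicted class:", scores.argmax())
```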
Neural Networks | 1990
Kevin J. Lang; Alex Waibel; Geoffrey E. Hinton
A translation-invariant back-propagation network is described that performs better than a sophisticated continuous acoustic parameter hidden Markov model on a noisy, 100-speaker confusable vocabulary isolated word recognition task. The network's replicated architecture permits it to extract precise information from unaligned training patterns selected by a naive segmentation rule.
International Colloquium on Grammatical Inference | 1998
Kevin J. Lang; Barak A. Pearlmutter; Rodney A. Price
This paper first describes the structure and results of the Abbadingo One DFA Learning Competition. The competition was designed to encourage work on algorithms that scale well—both to larger DFAs and to sparser training data. We then describe and discuss the winning algorithm of Rodney Price, which orders state merges according to the amount of evidence in their favor. A second winning algorithm, of Hugues Juille, will be described in a separate paper.
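Price's approach is now commonly known as evidence-driven state merging. The sketch below is a rough Python rendering of just the evidence score, not his actual implementation: a candidate merge of two prefix-tree states earns one point for each pair of labeled states it forces to agree, and is rejected outright if it would force an accepting state onto a rejecting one.

```python
# A minimal sketch (not Price's code) of the evidence score behind
# evidence-driven state merging on a prefix-tree acceptor.

from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class Node:
    label: Optional[bool] = None            # True=accept, False=reject, None=unlabeled
    children: Dict[str, "Node"] = field(default_factory=dict)

def merge_score(a: Node, b: Node) -> Optional[int]:
    """Return the evidence (number of agreeing labeled pairs) for folding the
    subtree rooted at b into the one rooted at a, or None if the merge is
    inconsistent with the training data."""
    score = 0
    if a.label is not None and b.label is not None:
        if a.label != b.label:
            return None                     # conflicting labels: merge impossible
        score += 1                          # one more piece of supporting evidence
    for symbol, b_child in b.children.items():
        a_child = a.children.get(symbol)
        if a_child is None:
            continue                        # no overlap on this symbol, no evidence
        sub = merge_score(a_child, b_child)
        if sub is None:
            return None
        score += sub
    return score

# The surrounding search loop (omitted here) would score every candidate pair,
# perform the highest-scoring consistent merge, and repeat until no merge remains.
```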
Conference on Object-Oriented Programming Systems, Languages, and Applications | 1986
Kevin J. Lang; Barak A. Pearlmutter
The Scheme papers demonstrated that Lisp could be made simpler and more expressive by elevating functions to the level of first-class objects. Oaklisp shows that a message-based language can derive similar benefits from having first-class types.
Higher-Order and Symbolic Computation / Lisp and Symbolic Computation | 1988
Kevin J. Lang; Barak A. Pearlmutter
This paper contains a description of Oaklisp, a dialect of Lisp incorporating lexical scoping, multiple inheritance, and first-class types. This description is followed by a revisionist history of the Oaklisp design, in which a crude map of the space of object-oriented Lisps is drawn and some advantages of first-class types are explored. Scoping issues are discussed, with a particular emphasis on instance variables and top-level namespaces. The question of which should come first, the lambda or the object, is addressed, with Oaklisp providing support for the latter approach.
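Oaklisp's central point, that types themselves should be ordinary first-class values, can be roughly illustrated outside of Lisp. The sketch below uses Python (where classes are already objects) purely as a stand-in; the class names and helper function are invented for illustration and are not Oaklisp code.

```python
# Illustration (in Python, not Oaklisp) of what first-class types buy you:
# a type is an ordinary value that can be passed, stored, and instantiated
# at run time, so generic code can be parameterized by type.

class Stack:
    def __init__(self):
        self.items = []
    def push(self, x):
        self.items.append(x)

class Queue:
    def __init__(self):
        self.items = []
    def push(self, x):
        self.items.insert(0, x)

def make_buffers(container_type, n):
    """`container_type` is a type received as an argument, exactly as any
    other value would be; no special-case syntax is needed to use it."""
    return [container_type() for _ in range(n)]

buffers = make_buffers(Stack, 3)      # the type itself is the parameter
print(type(buffers[0]).__name__)      # -> Stack
```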
Conference on Learning Theory | 1994
Joe Kilian; Kevin J. Lang; Barak A. Pearlmutter
The best previous algorithm for the matching shoulders lob-pass game, ARTHUR (Abe and Takeuchi 1993), suffered O(t^(1/2)) regret. We prove that this is the best possible performance for any algorithm that works by accurately estimating the opponent's payoff lines. Then we describe an algorithm which beats that bound and meets the information-theoretic lower bound of O(log t) regret by converging to the best lob rate without accurately estimating the payoff lines. The noise-tolerant binary search procedure that we develop is of independent interest.
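As a rough illustration of what a noise-tolerant binary search involves, the Python sketch below uses repeated queries at the midpoint plus a majority vote. This is a generic textbook-style scheme with an invented noisy oracle, not the specific procedure developed in the paper.

```python
import random

def noisy_less_than(x, theta=0.37, p_correct=0.75):
    """Simulated noisy oracle: reports whether x < theta, but lies with
    probability 1 - p_correct. (Both the oracle and theta are invented
    for this illustration.)"""
    truth = x < theta
    return truth if random.random() < p_correct else not truth

def noise_tolerant_bisect(lo=0.0, hi=1.0, rounds=20, queries_per_round=41):
    """Repeated-query bisection: query the noisy oracle many times at the
    midpoint, take a majority vote, then halve the interval accordingly."""
    for _ in range(rounds):
        mid = (lo + hi) / 2
        votes = sum(noisy_less_than(mid) for _ in range(queries_per_round))
        if votes > queries_per_round / 2:   # majority says mid < theta: move up
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(round(noise_tolerant_bisect(), 3))    # lands near 0.37 with high probability
```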
International Joint Conference on Artificial Intelligence | 1985
Geoffrey E. Hinton; Kevin J. Lang
Neural Information Processing Systems | 1989
Kevin J. Lang; Geoffrey E. Hinton
Archive | 1997
Joe Kilian; Kevin J. Lang
Archive | 1991
Barak A. Pearlmutter; Kevin J. Lang