Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jason Riggle is active.

Publication


Featured research published by Jason Riggle.


Linguistic Inquiry | 2009

Evaluating the Complexity of Optimality Theory

Jeffrey Heinz; Gregory M. Kobele; Jason Riggle

Idsardi (2006) claims that Optimality Theory (OT; Prince and Smolensky 1993, 2004) is in general computationally intractable on the basis of a proof adapted from Eisner 1997a. We take issue with this conclusion on two grounds. First, the intractability result holds only in cases where the constraint set is not fixed in advance (contra usual definitions of OT), and second, the result crucially depends on a particular representation of OT grammars. We show that there is an alternative representation of OT grammars that allows for efficient computation of optimal surface forms and provides deeper insight into the sources of complexity of OT. We conclude that it is a mistake to reject OT on the grounds that it is computationally intractable.
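
To make the evaluation problem concrete, here is a minimal sketch of OT optimization with a constraint set fixed in advance (my own illustration, not code from the paper): the winner is the candidate whose violation vector is lexicographically smallest under the ranking. The toy constraints below are hypothetical stand-ins.

    # Minimal sketch of OT evaluation with a fixed, pre-ranked constraint set.
    # A constraint is a function from a candidate string to a violation count.

    def violations(candidate, ranked_constraints):
        """Violation vector of `candidate`, ordered from highest-ranked down."""
        return tuple(con(candidate) for con in ranked_constraints)

    def optimal(candidates, ranked_constraints):
        """The candidate whose violation vector is lexicographically least."""
        return min(candidates, key=lambda c: violations(c, ranked_constraints))

    # Hypothetical toy constraints; syllables are separated by "." and
    # epenthetic vowels are marked with "+".
    VOWELS = set("aeiou")

    def no_coda(cand):  # one violation per syllable-final consonant
        return sum(1 for syll in cand.split(".") if syll and syll[-1] not in VOWELS)

    def dep(cand):      # one violation per epenthetic vowel
        return cand.count("+")

    # With NoCoda ranked over Dep, epenthesis beats keeping the coda.
    print(optimal(["pat", "pa.t+a"], [no_coda, dep]))  # -> pa.t+a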


Computational Linguistics | 2009

The complexity of ranking hypotheses in optimality theory

Jason Riggle

Given a constraint set with k constraints in the framework of Optimality Theory (OT), what is its capacity as a classification scheme for linguistic data? One useful measure of this capacity is the size of the largest data set of which each subset is consistent with a different grammar hypothesis. This measure is known as the Vapnik-Chervonenkis dimension (VCD) and is a standard complexity measure for concept classes in computational learnability theory. In this work, I use the three-valued logic of Elementary Ranking Conditions to show that the VCD of Optimality Theory with k constraints is k-1. Analysis of OT in terms of the VCD establishes that the complexity of OT is a well-behaved function of k and that the hardness of learning in OT is linear in k for a variety of frameworks that employ probabilistic definitions of learnability.
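
The Elementary Ranking Conditions (ERCs) at the heart of this result can be sketched directly; the following is my own illustration based on the abstract, not code from the paper. An ERC records, for each constraint, whether it prefers the winner (W), prefers the loser (L), or has no preference (e); a ranking is consistent with an ERC exactly when some W-constraint outranks every L-constraint.

    # Minimal sketch of ERC satisfaction under the three-valued W/L/e logic.

    def satisfies(ranking, erc):
        """True iff every 'L' constraint is dominated by some 'W' constraint.
        `ranking` lists constraint names from highest- to lowest-ranked;
        `erc` maps each constraint name to 'W', 'L', or 'e'."""
        for con in ranking:
            if erc[con] == "W":
                return True   # this W dominates every remaining L
            if erc[con] == "L":
                return False  # an undominated L: the loser would win
        return True           # no W or L at all: vacuously satisfied

    # Hypothetical example with three constraints C1, C2, C3.
    erc = {"C1": "e", "C2": "W", "C3": "L"}
    print(satisfies(["C1", "C2", "C3"], erc))  # True:  C2 (W) outranks C3 (L)
    print(satisfies(["C3", "C2", "C1"], erc))  # False: C3 (L) is ranked on top

Viewed as a classifier over data encoded this way, a system of k constraints can shatter at most k-1 data points; that is the sense in which the paper pins the VCD at k-1.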


Computer Speech & Language | 2017

Lexicon-free fingerspelling recognition from video: Data, models, and signer adaptation

Taehwan Kim; Jonathan Keane; Weiran Wang; Hao Tang; Jason Riggle; Gregory Shakhnarovich; Diane Brentari; Karen Livescu

We study the problem of recognizing video sequences of fingerspelled letters in American Sign Language (ASL). Fingerspelling comprises a significant but relatively understudied part of ASL. Recognizing fingerspelling is challenging for a number of reasons: it involves quick, small motions that are often highly coarticulated; it exhibits significant variation between signers; and there has been a dearth of continuous fingerspelling data collected. In this work we collect and annotate a new data set of continuous fingerspelling videos, compare several types of recognizers, and explore the problem of signer variation. Our best-performing models are segmental (semi-Markov) conditional random fields using deep neural network-based features. In the signer-dependent setting, our recognizers achieve up to about 92% letter accuracy. The multi-signer setting is much more challenging, but with neural network adaptation we achieve up to 83% letter accuracy in this setting.
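
The decoding step behind such segmental (semi-Markov) models can be sketched as follows. This is a zeroth-order illustration under simplifying assumptions of my own (a stand-in table of per-frame log-scores and a fixed per-segment cost), not the paper's DNN-based system: the dynamic program searches jointly over segment boundaries and letter labels.

    # Minimal sketch of segmental (semi-Markov) Viterbi decoding.
    import math

    SEG_COST = -0.1  # hypothetical fixed cost per segment (favors longer segments)

    def seg_score(frame_scores, start, end, label):
        """Log-score for labeling frames[start:end] as one segment of `label`.
        A real segmental CRF would use DNN-based segment features here."""
        return SEG_COST + sum(frame_scores[t][label] for t in range(start, end))

    def segmental_viterbi(frame_scores, labels, max_dur):
        """Best joint segmentation and labeling of the frame sequence."""
        T = len(frame_scores)
        best = [0.0] + [-math.inf] * T   # best[t] = best score of frames[:t]
        back = [None] * (T + 1)          # backpointers: (segment start, label)
        for end in range(1, T + 1):
            for dur in range(1, min(max_dur, end) + 1):
                start = end - dur
                for lab in labels:
                    s = best[start] + seg_score(frame_scores, start, end, lab)
                    if s > best[end]:
                        best[end], back[end] = s, (start, lab)
        segs, t = [], T                  # walk backpointers from the end
        while t > 0:
            start, lab = back[t]
            segs.append((start, t, lab))
            t = start
        return list(reversed(segs))

    # Toy input: four frames scored against two letter labels.
    fs = [{"a": -0.1, "b": -2.0},
          {"a": -0.2, "b": -1.5},
          {"a": -2.5, "b": -0.1},
          {"a": -3.0, "b": -0.2}]
    print(segmental_viterbi(fs, ["a", "b"], max_dur=3))  # [(0, 2, 'a'), (2, 4, 'b')]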


Meeting of the Association for Computational Linguistics | 2008

Counting rankings: invited talk

Jason Riggle

In this talk, I present a recursive algorithm to calculate the number of rankings that are consistent with a set of data (optimal candidates) in the framework of Optimality Theory (OT; Prince and Smolensky 1993). Computing this quantity, which I call r-volume, makes possible a simple and effective Bayesian heuristic in learning -- all else equal, choose candidates that are preferred by the highest number of rankings consistent with previous observations. This heuristic yields an r-volume learning algorithm (RVL) that is guaranteed to make fewer than k lg k errors while learning rankings of k constraints. This log-linear error bound is an improvement over the quadratic bound of Recursive Constraint Demotion (RCD; Tesar and Smolensky 1996) and it is within a logarithmic factor of the best possible mistake bound for any OT learning algorithm.
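
The counting recursion can be sketched in a few lines; this is my reconstruction from the abstract, not the talk's own code. Data are encoded as Elementary Ranking Conditions (ERCs) mapping each constraint to W, L, or e: to count consistent rankings, try each constraint that no remaining ERC marks L as the top-ranked constraint, discard every ERC in which it is W (those are now satisfied), and recurse on the rest.

    # Minimal sketch of the r-volume recursion: count the rankings of
    # `constraints` that satisfy every ERC in `ercs`. Assumes each ERC
    # marks at least one constraint W, as winner-loser pairs do.
    import math

    def r_volume(constraints, ercs):
        if not ercs:
            # No conditions left: any ordering of the rest is consistent.
            return math.factorial(len(constraints))
        total = 0
        for c in constraints:
            if any(erc[c] == "L" for erc in ercs):
                continue  # c on top would leave one of its L's undominated
            rest = [con for con in constraints if con != c]
            remaining = [erc for erc in ercs if erc[c] != "W"]  # c's W's are discharged
            total += r_volume(rest, remaining)
        return total

    # Hypothetical example: one ERC demanding that C1 outrank C3.
    print(r_volume(["C1", "C2", "C3"],
                   [{"C1": "W", "C2": "e", "C3": "L"}]))  # 3 of the 6 rankings

Picking the candidate backed by the largest such count is the Bayesian heuristic that gives RVL its k lg k mistake bound.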


Archive | 2011

The Handbook of Phonological Theory

John Goldsmith; Jason Riggle; Alan C. L. Yu


Natural Language and Linguistic Theory | 2012

Information theoretic approaches to phonological structure: the case of Finnish vowel harmony

John Goldsmith; Jason Riggle


Natural Language and Linguistic Theory | 2006

Infixing reduplication in Pima and its theoretical consequences

Jason Riggle


Lingua | 2010

The VC dimension of constraint-based grammars

Max Bane; Jason Riggle; Morgan Sonderegger


Meeting of the Association for Computational Linguistics | 2008

Three Correlates of the Typological Frequency of Quantity-Insensitive Stress Systems

Max Bane; Jason Riggle

Collaboration


Dive into Jason Riggle's collaborations.

Top Co-Authors

Max Bane
University of Chicago

Yoosook Lee
University of California

Yuan Yao
University of California