Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Kevin Clark is active.

Publication


Featured research published by Kevin Clark.


international joint conference on natural language processing | 2015

Entity-Centric Coreference Resolution with Model Stacking

Kevin Clark; Christopher D. Manning

Mention pair models that predict whether or not two mentions are coreferent have historically been very effective for coreference resolution, but do not make use of entity-level information. However, we show that the scores produced by such models can be aggregated to define powerful entity-level features between clusters of mentions. Using these features, we train an entity-centric coreference system that learns an effective policy for building up coreference chains incrementally. The mention pair scores are also used to prune the search space the system works in, allowing for efficient training with an exact loss function. We evaluate our system on the English portion of the 2012 CoNLL Shared Task dataset and show that it improves over the current state of the art.
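For illustration, a minimal Python sketch of the core idea described above: pooling mention-pair coreference scores into entity-level features between two clusters of mentions. The function and feature names are hypothetical and not taken from the paper's implementation.

    # Sketch only: aggregate mention-pair scores into entity-level (cluster-pair) features.
    from itertools import product

    def cluster_pair_features(cluster_a, cluster_b, pair_score):
        """cluster_a, cluster_b: non-empty lists of mentions;
        pair_score(m1, m2): a mention-pair model's coreference score in [0, 1]."""
        scores = [pair_score(m1, m2) for m1, m2 in product(cluster_a, cluster_b)]
        return {
            "max_score": max(scores),                # strongest pairwise link between the clusters
            "min_score": min(scores),                # weakest pairwise link
            "avg_score": sum(scores) / len(scores),  # overall compatibility of the two clusters
        }

Features of this kind can then drive an incremental clustering policy that decides which pair of clusters to merge next.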


meeting of the association for computational linguistics | 2016

Improving Coreference Resolution by Learning Entity-Level Distributed Representations

Kevin Clark; Christopher D. Manning

A long-standing challenge in coreference resolution has been the incorporation of entity-level information: features defined over clusters of mentions instead of mention pairs. We present a neural network based coreference system that produces high-dimensional vector representations for pairs of coreference clusters. Using these representations, our system learns when combining clusters is desirable. We train the system with a learning-to-search algorithm that teaches it which local decisions (cluster merges) will lead to a high-scoring final coreference partition. The system substantially outperforms the current state-of-the-art on the English and Chinese portions of the CoNLL 2012 Shared Task dataset despite using few hand-engineered features.
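A rough sketch of the pooling idea described above, under assumed shapes (d-dimensional mention-pair representations); this is not the paper's exact architecture, and the hypothetical scoring layer here is a single linear layer.

    # Sketch only: pool mention-pair representations into a cluster-pair vector, then score a merge.
    import numpy as np

    def cluster_pair_representation(pair_reps):
        """pair_reps: array of shape (num_pairs, d), one row per mention pair spanning the two clusters."""
        return np.concatenate([pair_reps.max(axis=0), pair_reps.mean(axis=0)])

    def merge_score(pair_reps, W, b):
        """Linear scorer over the pooled representation; W has shape (2 * d,), b is a scalar."""
        return float(W @ cluster_pair_representation(pair_reps) + b)

Under a learning-to-search regime, such merge scores would be trained against the effect each candidate cluster merge has on the final coreference evaluation.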


empirical methods in natural language processing | 2016

Deep Reinforcement Learning for Mention-Ranking Coreference Models

Kevin Clark; Christopher D. Manning

Coreference resolution systems are typically trained with heuristic loss functions that require careful tuning. In this paper we instead apply reinforcement learning to directly optimize a neural mention-ranking model for coreference evaluation metrics. We experiment with two approaches: the REINFORCE policy gradient algorithm and a reward-rescaled max-margin objective. We find the latter to be more effective, resulting in significant improvements over the current state-of-the-art on the English and Chinese portions of the CoNLL 2012 Shared Task.
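As a hedged illustration of the reward-rescaled max-margin objective mentioned above (variable names are mine; the per-candidate costs would come from the coreference evaluation reward):

    # Sketch only: cost-weighted hinge loss over the candidate antecedents of a single mention.
    import numpy as np

    def reward_rescaled_margin_loss(scores, gold_mask, costs):
        """scores: (num_candidates,) model scores for the candidate antecedents
        gold_mask: boolean array marking correct antecedents (assumed non-empty)
        costs: (num_candidates,) penalty for choosing each candidate, e.g. the drop in
               coreference reward caused by that mistake (0 for correct antecedents)."""
        best_gold = scores[gold_mask].max()                # highest-scoring correct antecedent
        margins = 1.0 + scores - best_gold                 # hinge margin against every candidate
        return float(np.max(costs * np.maximum(margins, 0.0)))

The REINFORCE variant, by contrast, would sample antecedent decisions and weight their log-probabilities by the resulting reward.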


empirical methods in natural language processing | 2016

Inducing Domain-Specific Sentiment Lexicons from Unlabeled Corpora.

William L. Hamilton; Kevin Clark; Jure Leskovec; Daniel Jurafsky

A word's sentiment depends on the domain in which it is used. Computational social science research thus requires sentiment lexicons that are specific to the domains being studied. We combine domain-specific word embeddings with a label propagation framework to induce accurate domain-specific sentiment lexicons using small sets of seed words. We show that our approach achieves state-of-the-art performance on inducing sentiment lexicons from domain-specific corpora and that our purely corpus-based approach outperforms methods that rely on hand-curated resources (e.g., WordNet). Using our framework, we induce and release historical sentiment lexicons for 150 years of English and community-specific sentiment lexicons for 250 online communities from the social media forum Reddit. The historical lexicons we induce show that more than 5% of sentiment-bearing (non-neutral) English words completely switched polarity during the last 150 years, and the community-specific lexicons highlight how sentiment varies drastically between different communities.
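Below is a minimal, hedged sketch of seed-based label propagation over a word-similarity graph built from domain-specific embeddings, in the spirit of the framework described above; it is not the authors' released code, and the inputs and damping scheme are assumptions.

    # Sketch only: propagate sentiment from seed words over an embedding-derived similarity graph.
    import numpy as np

    def propagate_sentiment(similarity, seeds, beta=0.9, iters=50):
        """similarity: (V, V) nonnegative word-similarity matrix (rows assumed to have nonzero sums)
        seeds: (V,) vector with +1 for positive seeds, -1 for negative seeds, 0 elsewhere
        Returns one sentiment score per word from a damped random-walk-style propagation."""
        T = similarity / similarity.sum(axis=1, keepdims=True)  # row-normalize into transition weights
        scores = seeds.astype(float)
        for _ in range(iters):
            scores = beta * (T @ scores) + (1 - beta) * seeds   # spread mass, restart at the seeds
        return scores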


user interface software and technology | 2012

RevMiner: an extractive interface for navigating reviews on a smartphone

Jeff Huang; Oren Etzioni; Luke Zettlemoyer; Kevin Clark; Christian Lee


Transactions of the Association for Computational Linguistics | 2016

Large-scale Analysis of Counseling Conversations: An Application of Natural Language Processing to Mental Health

Tim Althoff; Kevin Clark; Jure Leskovec


arXiv: Computation and Language | 2016

Natural Language Processing for Mental Health: Large Scale Discourse Analysis of Counseling Conversations.

Tim Althoff; Kevin Clark; Jure Leskovec


empirical methods in natural language processing | 2018

Semi-Supervised Sequence Modeling with Cross-View Training

Kevin Clark; Minh-Thang Luong; Christopher D. Manning; Quoc V. Le


Text Analysis Conference (TAC) | 2017

TinkerBell: Cross-lingual Cold-Start Knowledge Base Construction.

Mohamed Al-Badrashiny; Jason Bolton; Arun Tejasvi Chaganty; Kevin Clark; Craig Harman; Lifu Huang; Matthew Lamm; Jinhao Lei; Di Lu; Xiaoman Pan; Ashwin Paranjape; Ellie Pavlick; Haoruo Peng; Peng Qi; Pushpendre Rastogi; Abigail See; Kai Sun; Max Thomas; Chen-Tse Tsai; Hao Wu; Boliang Zhang; Chris Callison-Burch; Claire Cardie; Heng Ji; Christopher D. Manning; Smaranda Muresan; Owen Rambow; Dan Roth; Mark Sammons; Benjamin Van Durme


international conference on learning representations | 2018

Cross-View Training for Semi-Supervised Learning

Kevin Clark; Thang Luong; Quoc V. Le

Collaboration


Dive into Kevin Clark's collaborations.

Top Co-Authors

Boliang Zhang
Rensselaer Polytechnic Institute

Christian Lee
University of Washington