Publications


Featured research published by Adam Trischler.


Meeting of the Association for Computational Linguistics | 2017

NewsQA: A Machine Comprehension Dataset

Adam Trischler; Tong Wang; Xingdi Yuan; Justin Harris; Alessandro Sordoni; Philip Bachman; Kaheer Suleman

We present NewsQA, a challenging machine comprehension dataset of over 100,000 human-generated question-answer pairs. Crowdworkers supply questions and answers based on a set of over 10,000 news articles from CNN, with answers consisting of spans of text in the articles. We collect this dataset through a four-stage process designed to solicit exploratory questions that require reasoning. Analysis confirms that NewsQA demands abilities beyond simple word matching and recognizing textual entailment. We measure human performance on the dataset and compare it to several strong neural models. The performance gap between humans and machines (13.3% F1) indicates that significant progress can be made on NewsQA through future research. The dataset is freely available online.
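
To make the reported F1 figure concrete, here is a minimal sketch of the token-overlap F1 commonly used to score extractive QA answers; this is the standard SQuAD-style metric, and the official NewsQA evaluation script may differ in details such as answer normalization.

    from collections import Counter

    def token_f1(prediction: str, reference: str) -> float:
        # Token-level overlap F1 between a predicted and a gold answer span.
        pred = prediction.lower().split()
        ref = reference.lower().split()
        overlap = sum((Counter(pred) & Counter(ref)).values())
        if overlap == 0:
            return 0.0
        precision = overlap / len(pred)
        recall = overlap / len(ref)
        return 2 * precision * recall / (precision + recall)

    # Partial credit for partially matching spans:
    print(token_f1("the White House", "White House officials"))  # ~0.667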


Empirical Methods in Natural Language Processing | 2016

Natural Language Comprehension with the EpiReader

Adam Trischler; Zheng Ye; Xingdi Yuan; Philip Bachman; Alessandro Sordoni; Kaheer Suleman

We present the EpiReader, a novel model for machine comprehension of text. Machine comprehension of unstructured, real-world text is a major research goal for natural language processing. Current tests of machine comprehension pose questions whose answers can be inferred from some supporting text, and evaluate a model's responses to the questions. The EpiReader is an end-to-end neural model comprising two components: the first component proposes a small set of candidate answers after comparing a question to its supporting text, and the second component formulates hypotheses using the proposed candidates and the question, then reranks the hypotheses based on their estimated concordance with the supporting text. We present experiments demonstrating that the EpiReader sets a new state of the art on the CNN and Children's Book Test machine comprehension benchmarks, outperforming previous neural models by a significant margin.
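
Here is a minimal sketch of the extract-then-rerank pattern the abstract describes, applied to a cloze-style question; the word- and bigram-overlap heuristics are stand-ins for the paper's neural Extractor and Reasoner modules.

    def extract_candidates(question, passage, k=3):
        # Stage 1 (Extractor): rank passage tokens by how well their local
        # context overlaps the question; keep the top k as candidate answers.
        # (A stand-in for the paper's pointer-style neural attention.)
        q = set(question.lower().split())
        tokens = passage.split()

        def score(i):
            return sum(t.lower() in q for t in tokens[max(0, i - 3):i + 4])

        candidates, seen = [], set()
        for i in sorted(range(len(tokens)), key=score, reverse=True):
            w = tokens[i].strip(".,").lower()
            if w and w not in seen and w not in q:
                seen.add(w)
                candidates.append(w)
            if len(candidates) == k:
                break
        return candidates

    def rerank(question, passage, candidates):
        # Stage 2 (Reasoner): fill each candidate into the cloze question to
        # form a hypothesis, then score its concordance with the passage.
        # (Bigram overlap stands in for the paper's neural entailment score.)
        p_tokens = passage.lower().split()
        p_bigrams = set(zip(p_tokens, p_tokens[1:]))

        def concordance(c):
            h = question.lower().replace("xxx", c).split()
            return len(set(zip(h, h[1:])) & p_bigrams)

        return max(candidates, key=concordance)

    passage = "The cat sat on the mat while the dog slept by the door ."
    question = "The XXX sat on the mat"
    print(rerank(question, passage, extract_candidates(question, passage)))  # cat

With neural modules in place of these heuristics, the first stage keeps the hypothesis space small so the second stage can afford a more expensive comparison per hypothesis.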


Meeting of the Association for Computational Linguistics | 2016

A Parallel-Hierarchical Model for Machine Comprehension on Sparse Data

Adam Trischler; Zheng Ye; Xingdi Yuan; Jing He; Philip Bachman

Understanding unstructured text is a major goal within natural language processing. Comprehension tests pose questions based on short text passages to evaluate such understanding. In this work, we investigate machine comprehension on the challenging MCTest benchmark. Partly because of its limited size, prior work on MCTest has focused mainly on engineering better features. We tackle the dataset with a neural approach, harnessing simple neural networks arranged in a parallel hierarchy. The parallel hierarchy enables our model to compare the passage, question, and answer from a variety of trainable perspectives, as opposed to using a manually designed, rigid feature set. Perspectives range from the word level to sentence fragments to sequences of sentences; the networks operate only on word-embedding representations of text. When trained with a methodology designed to help cope with limited training data, our Parallel-Hierarchical model sets a new state of the art for MCTest, outperforming previous feature-engineered approaches slightly and previous neural approaches by a significant margin (over 15% absolute).
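
A minimal sketch of the multi-perspective matching idea, assuming random stand-in word embeddings in place of trained ones and a plain average in place of a trainable combination of perspectives:

    import numpy as np

    rng = np.random.default_rng(0)
    _vectors = {}

    def embed(word, dim=16):
        # Stand-in word embeddings (random); the real model uses trained ones.
        if word not in _vectors:
            _vectors[word] = rng.normal(size=dim)
        return _vectors[word]

    def cosine(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8))

    def match_score(passage_sentences, question, answer):
        # Compare the question+answer pair to the passage from several
        # perspectives: best single word, best 4-gram, best sentence.
        qa = np.mean([embed(w) for w in question + answer], axis=0)
        words = [w for s in passage_sentences for w in s]
        word_view = max(cosine(embed(w), qa) for w in words)
        ngram_view = max(
            cosine(np.mean([embed(w) for w in words[i:i + 4]], axis=0), qa)
            for i in range(len(words) - 3))
        sent_view = max(
            cosine(np.mean([embed(w) for w in s], axis=0), qa)
            for s in passage_sentences)
        return (word_view + ngram_view + sent_view) / 3

    sentences = [["the", "fox", "ran"], ["it", "hid", "in", "a", "burrow"]]
    print(match_score(sentences, ["where", "did", "it", "hide"], ["burrow"]))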


Meeting of the Association for Computational Linguistics | 2017

Machine Comprehension by Text-to-Text Neural Question Generation

Xingdi Yuan; Tong Wang; Caglar Gulcehre; Alessandro Sordoni; Philip Bachman; Saizheng Zhang; Sandeep Subramanian; Adam Trischler

We propose a recurrent neural model that generates natural-language questions from documents, conditioned on answers. We show how to train the model using a combination of supervised and reinforcement learning. After standard maximum-likelihood training with teacher forcing, we fine-tune the model using policy-gradient techniques to maximize several rewards that measure question quality. Most notably, one of these rewards is the performance of a question-answering system. We motivate question generation as a means to improve the performance of question-answering systems. Our model is trained and evaluated on the recent question-answering dataset SQuAD.
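
A minimal sketch of the policy-gradient fine-tuning step, with a toy categorical policy standing in for the paper's recurrent decoder and a hard-coded table standing in for the QA-system reward:

    import numpy as np

    rng = np.random.default_rng(0)

    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()

    # Toy policy: a categorical distribution over three candidate questions.
    candidates = ["Who wrote it ?", "What is it ?", "When was it ?"]
    theta = np.zeros(len(candidates))  # policy parameters (logits)

    def qa_reward(question):
        # Stand-in for running a QA system on the generated question and
        # scoring its answer (e.g., with span F1).
        return {"Who wrote it ?": 1.0, "What is it ?": 0.3, "When was it ?": 0.1}[question]

    baseline, lr = 0.0, 0.5
    for _ in range(200):
        probs = softmax(theta)
        i = rng.choice(len(candidates), p=probs)  # sample a question
        r = qa_reward(candidates[i])
        baseline += 0.05 * (r - baseline)  # moving-average baseline for variance reduction
        grad_logp = -probs                 # REINFORCE: grad of log pi(i) w.r.t. logits
        grad_logp[i] += 1.0                # equals onehot(i) - probs for a softmax policy
        theta += lr * (r - baseline) * grad_logp

    print(candidates[int(np.argmax(theta))])  # policy concentrates on the high-reward question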


Neural Networks | 2016

Synthesis of recurrent neural networks for dynamical system simulation

Adam Trischler; Gabriele M. T. D'Eleuterio

We review several of the most widely used techniques for training recurrent neural networks to approximate dynamical systems, then describe a novel algorithm for this task. The algorithm is based on an earlier theoretical result that guarantees the quality of the network approximation. We show that a feedforward neural network can be trained on the vector-field representation of a given dynamical system using backpropagation, then recast as a recurrent network that replicates the original system's dynamics. After detailing this algorithm and its relation to earlier approaches, we present numerical examples that demonstrate its capabilities. One of the distinguishing features of our approach is that both the original dynamical systems and the recurrent networks that simulate them operate in continuous time.
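
A minimal sketch of the two-step recipe, under loud assumptions: a closed-form random-feature regression stands in for the backpropagation-trained feedforward network, and forward Euler stands in for a proper continuous-time integrator.

    import numpy as np

    rng = np.random.default_rng(0)

    def f(x):
        # Vector field of the target dynamical system: a harmonic oscillator.
        return np.array([x[1], -x[0]])

    # Step 1: fit a network to the vector-field map x -> f(x).
    X = rng.uniform(-2.0, 2.0, size=(500, 2))
    Y = np.array([f(x) for x in X])
    W1 = rng.normal(size=(2, 64))                             # fixed random hidden layer
    W2, *_ = np.linalg.lstsq(np.tanh(X @ W1), Y, rcond=None)  # linear readout, closed form

    def f_hat(x):
        return np.tanh(x @ W1) @ W2

    # Step 2: run the trained network in closed loop as a continuous-time
    # recurrent system, dx/dt = f_hat(x), here integrated by forward Euler.
    x, dt = np.array([1.0, 0.0]), 0.01
    for _ in range(628):                                      # about one period (2*pi)
        x = x + dt * f_hat(x)

    print(x)  # should land near the initial state [1, 0]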


International Conference on Learning Representations | 2018

Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning

Sandeep Subramanian; Adam Trischler; Yoshua Bengio; Chris Pal


International Conference on Machine Learning | 2017

Learning Algorithms for Active Learning

Philip Bachman; Alessandro Sordoni; Adam Trischler


arXiv: Computation and Language | 2017

A Joint Model for Question Answering and Question Generation

Tong Wang; Xingdi Yuan; Adam Trischler


Meeting of the Association for Computational Linguistics | 2017

Neural Models for Key Phrase Extraction and Question Generation

Sandeep Subramanian; Tong Wang; Xingdi Yuan; Saizheng Zhang; Adam Trischler; Yoshua Bengio


International Conference on Machine Learning | 2018

Rapid Adaptation with Conditionally Shifted Neurons

Tsendsuren Munkhdalai; Xingdi Yuan; Soroush Mehri; Adam Trischler

Collaboration


Adam Trischler's top co-authors:

Yoshua Bengio (Université de Montréal)
Chris Pal (École Polytechnique de Montréal)
Saizheng Zhang (Université de Montréal)
Francis Dutil (Université de Sherbrooke)