Publication


Featured research published by Ryan P. Adams.


Proceedings of the IEEE | 2016

Taking the Human Out of the Loop: A Review of Bayesian Optimization

Bobak Shahriari; Kevin Swersky; Ziyu Wang; Ryan P. Adams; Nando de Freitas

Big Data applications are typically associated with systems involving large numbers of users, massive complex software systems, and large-scale heterogeneous computing and storage architectures. The construction of such systems involves many distributed design choices. The end products (e.g., recommendation systems, medical analysis tools, real-time game engines, speech recognizers) thus involve many tunable configuration parameters. These parameters are often specified and hard-coded into the software by various developers or teams. If optimized jointly, these parameters can result in significant improvements. Bayesian optimization is a powerful tool for the joint optimization of design choices that has gained great popularity in recent years. It promises greater automation so as to increase both product quality and human productivity. This review paper introduces Bayesian optimization, highlights some of its methodological aspects, and showcases a wide range of applications.
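
As a concrete taste of the loop the review describes, here is a minimal Bayesian optimization sketch: fit a Gaussian process surrogate to the evaluations collected so far, then choose the next point by maximizing an acquisition function (expected improvement below). The toy objective, Matérn kernel, and grid of candidates are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def f(x):
    # Hypothetical expensive black-box objective (to be minimized).
    return np.sin(3 * x) + 0.5 * x ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(3, 1))            # a few initial evaluations
y = f(X).ravel()
grid = np.linspace(-2, 2, 500).reshape(-1, 1)  # candidate inputs

for _ in range(15):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    best = y.min()
    # Expected improvement for minimization, guarding against sigma == 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (best - mu) / sigma
        ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
        ei[sigma == 0.0] = 0.0
    x_next = grid[np.argmax(ei)].reshape(1, 1)  # evaluate where EI peaks
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next).ravel())

print("best x:", X[np.argmin(y)].item(), "best f(x):", y.min())
```

Swapping in a different acquisition function only changes the `ei` computation; the surrogate-then-acquire structure of the loop stays the same.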


Science Translational Medicine | 2013

Quantitative Diagnosis of Malignant Pleural Effusions by Single-Cell Mechanophenotyping

Henry T. K. Tse; Daniel R. Gossett; Yo Sup Moon; Mahdokht Masaeli; Marie Sohsman; Yong Ying; Kimberly Mislick; Ryan P. Adams; Jianyu Rao; Dino Di Carlo

Single-cell biophysical properties were used for diagnosing malignant pleural effusions from patients.

Cytometry Device Helps (De)form a Diagnosis

Is it benign, or malignant? That is the main concern of cytopathologists as they screen cells in pleural effusions, taken from the lungs of patients suspected of having infections or cancer. This process is subjective and time-intensive and requires an expert's eye. So, to quickly "prescreen" samples for malignancy (and follow-up), Tse et al. describe deformability cytometry (DC), an approach that relies on microfluidic forces to diagnose pleural effusion samples as malignant or not. The authors' device accelerates effusion samples through two opposing microfluidic channels. At the channels' four-way intersection, the cells rapidly decelerate as they encounter the opposing flow and then exit out the side channels. This deforms the cells, changing them from sphere-like shapes to pancakes. High-speed video of this intersection allowed Tse et al. to quantify cellular squishing: the more deformable the cell, the more likely it is to be malignant. The authors took 119 pleural effusion samples from patients with known clinical outcomes (negative for malignant cells [benign], acute inflammation, chronic/mixed inflammation, atypical cells, and malignant pleural effusions [MPEs]) to develop a diagnostic scoring system on a scale of 1 to 10, with 1 being benign. DC showed the best predictive abilities in two high-confidence regimes: 1 to 6 and 9 to 10. Scores of 7 and 8 were more difficult to diagnose, so these may be the types of samples where a cytopathologist's initial input would be necessary. Importantly, the authors looked at samples from patients who were cytology-negative with concurrent malignancy, such as a tumor, but were diagnosed 6 months later with disseminated disease. Five of 10 patients with high-grade cancers who were cytology-negative at sample collection scored high using DC. This suggests that the DC tool could be used to screen early for MPE. Using deformability as a marker of disease will require additional validation in pleural effusion samples from patients with many different types of cancer. Nevertheless, owing to its ease of use and objective readout, with further clinical testing, DC should be useful as a quick screening tool to form an early diagnosis of MPEs.

Biophysical characteristics of cells are attractive as potential diagnostic markers for cancer. Transformation of cell state or phenotype and the accompanying epigenetic, nuclear, and cytoplasmic modifications lead to measurable changes in cellular architecture. We recently introduced a technique called deformability cytometry (DC) that enables rapid mechanophenotyping of single cells in suspension at rates of 1000 cells/s, a throughput comparable to traditional flow cytometry. We applied this technique to diagnose malignant pleural effusions, in which disseminated tumor cells can be difficult to identify accurately by traditional cytology. An algorithmic diagnostic scoring system was developed on the basis of quantitative features of two-dimensional distributions of single-cell mechanophenotypes from 119 samples. The DC scoring system classified 63% of the samples into two high-confidence regimes with 100% positive predictive value or 100% negative predictive value, and achieved an area under the curve of 0.86. This performance is suitable for a prescreening role to focus cytopathologist analysis time on a smaller fraction of difficult samples. Diagnosis of samples that present a challenge to cytology was also improved. Samples labeled as "atypical cells," which require additional time and follow-up, were classified in high-confidence regimes in 8 of 15 cases. Further, 10 of 17 cytology-negative samples corresponding to patients with concurrent cancer were correctly classified as malignant or negative, in agreement with 6-month outcomes. This study lays the groundwork for broader validation of label-free quantitative biophysical markers for clinical diagnoses of cancer and inflammation, which could help to reduce laboratory workload and improve clinical decision-making.
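
A toy sketch of the triage logic the summary describes, assuming the quoted score regimes (1 to 6 and 9 to 10 as high-confidence calls, 7 and 8 referred out). The cutoffs mirror the text above, not the paper's fitted scoring system.

```python
def triage(dc_score: float) -> str:
    """Route a DC score (1 to 10) into the regimes described in the summary."""
    if dc_score <= 6:
        return "high-confidence: negative for malignancy"
    if dc_score >= 9:
        return "high-confidence: malignant"
    return "indeterminate (7 or 8): refer for cytopathology review"

for score in (2, 7, 10):
    print(score, "->", triage(score))
```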


ACS Central Science | 2018

Automatic Chemical Design Using a Data-Driven Continuous Representation of Molecules

Rafael Gómez-Bombarelli; Jennifer Wei; David K. Duvenaud; José Miguel Hernández-Lobato; Benjamin Sanchez-Lengeling; Dennis Sheberla; Jorge Aguilera-Iparraguirre; Timothy D. Hirzel; Ryan P. Adams; Alán Aspuru-Guzik

We report a method to convert discrete representations of molecules to and from a multidimensional continuous representation. This model allows us to generate new molecules for efficient exploration and optimization through open-ended spaces of chemical compounds. A deep neural network was trained on hundreds of thousands of existing chemical structures to construct three coupled functions: an encoder, a decoder, and a predictor. The encoder converts the discrete representation of a molecule into a real-valued continuous vector, and the decoder converts these continuous vectors back to discrete molecular representations. The predictor estimates chemical properties from the latent continuous vector representation of the molecule. Continuous representations of molecules allow us to automatically generate novel chemical structures by performing simple operations in the latent space, such as decoding random vectors, perturbing known chemical structures, or interpolating between molecules. Continuous representations also allow the use of powerful gradient-based optimization to efficiently guide the search for optimized functional compounds. We demonstrate our method in the domain of drug-like molecules and also in a set of molecules with fewer than nine heavy atoms.
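
A minimal sketch of the three coupled networks the abstract names (encoder, decoder, and property predictor), written here in PyTorch. The flattened one-hot featurization, layer sizes, and toy batch are placeholder assumptions; the paper's actual architecture and training setup differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

SEQ_DIM, LATENT = 120 * 35, 56  # e.g. 120-char SMILES over a 35-token alphabet

class MolVAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(SEQ_DIM, 512), nn.ReLU())
        self.mu, self.logvar = nn.Linear(512, LATENT), nn.Linear(512, LATENT)
        self.dec = nn.Sequential(nn.Linear(LATENT, 512), nn.ReLU(),
                                 nn.Linear(512, SEQ_DIM))
        self.prop = nn.Sequential(nn.Linear(LATENT, 64), nn.ReLU(),
                                  nn.Linear(64, 1))  # property predictor

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparam.
        return self.dec(z), self.prop(z), mu, logvar

def loss_fn(x, x_rec, y, y_hat, mu, logvar):
    # Reconstruction + KL (standard VAE) + property-regression terms.
    rec = F.binary_cross_entropy_with_logits(x_rec, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kld + F.mse_loss(y_hat.squeeze(-1), y, reduction="sum")

# Toy usage with random stand-in "molecules" and property targets:
model = MolVAE()
x = torch.rand(8, SEQ_DIM).round()  # fake one-hot-like batch
y = torch.rand(8)                   # fake property values
x_rec, y_hat, mu, logvar = model(x)
loss_fn(x, x_rec, y, y_hat, mu, logvar).backward()
```

The joint training of the predictor is what organizes the latent space by property, which is what makes the gradient-based search described in the abstract possible.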


International Conference on Machine Learning | 2009

Tractable nonparametric Bayesian inference in Poisson processes with Gaussian process intensities

Ryan P. Adams; Iain Murray; David J. C. MacKay

The inhomogeneous Poisson process is a point process that has varying intensity across its domain (usually time or space). For nonparametric Bayesian modeling, the Gaussian process is a useful way to place a prior distribution on this intensity. The combination of a Poisson process and a GP is known as a Gaussian Cox process, or doubly-stochastic Poisson process. Likelihood-based inference in these models requires an intractable integral over an infinite-dimensional random function. In this paper we present the first approach to Gaussian Cox processes in which it is possible to perform inference without introducing approximations or finite-dimensional proxy distributions. We call our method the Sigmoidal Gaussian Cox Process, which uses a generative model for Poisson data to enable tractable inference via Markov chain Monte Carlo. We compare our method to competing methods on synthetic data and apply it to several real-world data sets.
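
The generative model can be sketched directly: draw candidate events from a homogeneous Poisson process at an upper-bound rate, evaluate one GP draw at those events, and keep each event with probability given by the sigmoid of the GP value (thinning), so the effective intensity is the bound times the sigmoid. The kernel, lengthscale, and bound below are assumptions for illustration, and only the forward model is shown, not the paper's MCMC inference.

```python
import numpy as np

rng = np.random.default_rng(1)
T, lam_max = 10.0, 20.0                # domain [0, T], intensity upper bound

n = rng.poisson(lam_max * T)           # homogeneous Poisson candidate count
t = np.sort(rng.uniform(0, T, n))      # candidate event locations

# One GP draw at the candidate points (squared-exponential kernel).
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 1.0 ** 2) + 1e-8 * np.eye(n)
g = np.linalg.cholesky(K) @ rng.standard_normal(n)

keep = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-g))  # sigmoidal thinning
events = t[keep]
print(f"{events.size} events kept from {n} candidates")
```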


Neuron | 2015

Mapping Sub-Second Structure in Mouse Behavior

Alexander B. Wiltschko; Matthew J. Johnson; Giuliano Iurilli; Ralph E. Peterson; Jesse M. Katon; Stan L. Pashkovski; Victoria E. Abraira; Ryan P. Adams; Sandeep Robert Datta

Complex animal behaviors are likely built from simpler modules, but their systematic identification in mammals remains a significant challenge. Here we use depth imaging to show that 3D mouse pose dynamics are structured at the sub-second timescale. Computational modeling of these fast dynamics effectively describes mouse behavior as a series of reused and stereotyped modules with defined transition probabilities. We demonstrate that this combined 3D imaging and machine learning method can be used to unmask potential strategies employed by the brain to adapt to the environment, to capture both predicted and previously hidden phenotypes caused by genetic or neural manipulations, and to systematically expose the global structure of behavior within an experiment. This work reveals that mouse body language is built from identifiable components and is organized in a predictable fashion; deciphering this language establishes an objective framework for characterizing the influence of environmental cues, genes, and neural activity on behavior.
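
To make the "modules with transition probabilities" picture concrete, here is a toy Markov chain over invented behavioral modules. The paper instead fits an autoregressive hidden Markov model to 3D pose data, so the module names and transition matrix below are purely illustrative.

```python
import numpy as np

modules = ["rear", "walk", "pause", "groom"]
P = np.array([[0.1, 0.5, 0.3, 0.1],   # row i: transition probs out of module i
              [0.2, 0.4, 0.3, 0.1],
              [0.3, 0.3, 0.2, 0.2],
              [0.1, 0.2, 0.3, 0.4]])

rng = np.random.default_rng(0)
state, sequence = 0, []
for _ in range(12):                   # sample a short behavioral sequence
    sequence.append(modules[state])
    state = rng.choice(4, p=P[state])
print(" -> ".join(sequence))
```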


International Conference on Machine Learning | 2008

Gaussian process product models for nonparametric nonstationarity

Ryan P. Adams; Oliver Stegle

Stationarity is often an unrealistic prior assumption for Gaussian process regression. One solution is to predefine an explicit nonstationary covariance function, but such covariance functions can be difficult to specify and require detailed prior knowledge of the nonstationarity. We propose the Gaussian process product model (GPPM) which models data as the pointwise product of two latent Gaussian processes to nonparametrically infer nonstationary variations of amplitude. This approach differs from other nonparametric approaches to covariance function inference in that it operates on the outputs rather than the inputs, resulting in a significant reduction in computational cost and required data for inference. We present an approximate inference scheme using Expectation Propagation. This variational approximation yields convenient GP hyperparameter selection and compact approximate predictive distributions.
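
The generative side of the GPPM is easy to sketch: draw one fast-varying GP sample and one slow-varying one, and multiply them pointwise so the slow draw acts as a nonstationary amplitude envelope. The kernels and lengthscales below are assumptions, and the paper's Expectation Propagation inference is not shown.

```python
import numpy as np

def gp_draw(x, lengthscale, rng):
    """One draw from a zero-mean GP with a squared-exponential kernel."""
    K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / lengthscale ** 2)
    L = np.linalg.cholesky(K + 1e-8 * np.eye(x.size))
    return L @ rng.standard_normal(x.size)

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 300)
signal = gp_draw(x, lengthscale=0.5, rng=rng)    # fast-varying latent function
envelope = gp_draw(x, lengthscale=3.0, rng=rng)  # slow amplitude modulation
y = signal * envelope                            # nonstationary output
```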


Computer Vision and Pattern Recognition | 2012

Revisiting uncertainty in graph cut solutions

Daniel Tarlow; Ryan P. Adams

Graph cuts is a popular algorithm for finding the MAP assignment of many large-scale graphical models that are common in computer vision. While graph cuts is powerful, it does not provide information about the marginal probabilities associated with the solution it finds. To assess uncertainty, we are forced to fall back on less efficient and inexact inference algorithms such as loopy belief propagation, or use less principled surrogate representations of uncertainty such as the min-marginal approach of Kohli & Torr [8]. In this work, we give new justification for using min-marginals to compute the uncertainty in conditional random fields, framing the min-marginal outputs as exact marginals under a specially-chosen generative probabilistic model. We leverage this view to learn properly calibrated marginal probabilities as the result of straightforward maximization of the training likelihood, showing that the necessary subgradients can be computed efficiently using dynamic graph cut operations. We also show how this approach can be extended to compute multi-label marginal distributions, where again dynamic graph cuts enable efficient marginal inference and maximum likelihood learning. We demonstrate empirically that - after proper training - uncertainties based on min-marginals provide better-calibrated probabilities than baselines and that these distributions can be exploited in a decision-theoretic way for improved segmentation in low-level vision.
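
To make the min-marginal idea concrete, the sketch below computes them by brute force on a tiny binary chain MRF: the min-marginal for node i and label l is the lowest energy among all labelings that fix x_i = l, and a softmax over negative min-marginal energies turns them into per-node label probabilities. The paper computes these quantities efficiently with dynamic graph cuts and learns the calibration from data; the exhaustive enumeration and unit temperature here are illustrative assumptions.

```python
import itertools
import numpy as np

unary = np.array([[0.0, 1.2], [0.5, 0.4], [1.0, 0.1]])  # node costs per label
pair = 0.6                                              # Potts penalty

def energy(x):
    e = sum(unary[i, xi] for i, xi in enumerate(x))
    return e + pair * sum(x[i] != x[i + 1] for i in range(len(x) - 1))

n = unary.shape[0]
minmarg = np.full((n, 2), np.inf)
for x in itertools.product([0, 1], repeat=n):
    e = energy(x)
    for i, xi in enumerate(x):
        minmarg[i, xi] = min(minmarg[i, xi], e)

temp = 1.0                                   # assumed temperature
prob = np.exp(-minmarg / temp)
prob /= prob.sum(axis=1, keepdims=True)      # per-node label probabilities
print(prob)
```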


Journal of Machine Learning Research | 2016

A general framework for constrained Bayesian optimization using information-based search

José Miguel Hernández-Lobato; Michael A. Gelbart; Ryan P. Adams; Matthew W. Hoffman; Zoubin Ghahramani

We present an information-theoretic framework for solving global black-box optimization problems that also have black-box constraints. Of particular interest to us is to efficiently solve problems with decoupled constraints, in which subsets of the objective and constraint functions may be evaluated independently; for example, the objective may be evaluated on a CPU while the constraints are evaluated independently on a GPU. These problems require an acquisition function that can be separated into the contributions of the individual function evaluations. We develop one such acquisition function and call it Predictive Entropy Search with Constraints (PESC). PESC is an approximation to the expected information gain criterion and it compares favorably to alternative approaches based on improvement in several synthetic and real-world problems. In addition to this, we consider problems with a mix of functions that are fast and slow to evaluate. These problems require balancing the amount of time spent in the meta-computation of PESC and in the actual evaluation of the target objective. We take a bounded rationality approach and develop a partial update for PESC which trades off accuracy against speed. We then propose a method for adaptively switching between the partial and full updates for PESC. This allows us to interpolate between versions of PESC that are efficient in terms of function evaluations and those that are efficient in terms of wall-clock time. Overall, we demonstrate that PESC is an effective algorithm that provides a promising direction towards a unified solution for constrained Bayesian optimization.
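
PESC approximates the expected information gain, which takes some machinery to reproduce; as a simpler stand-in for the constrained-acquisition idea, the sketch below scores candidates by expected improvement weighted by the probability of feasibility, treating the objective and constraint as independent GP posteriors. The posterior means and standard deviations are assumed arrays rather than fitted models, and this is explicitly not the PESC acquisition.

```python
import numpy as np
from scipy.stats import norm

def constrained_ei(mu_f, sd_f, mu_c, sd_c, best_feasible):
    """EI on the objective times Pr(constraint <= 0), elementwise."""
    z = (best_feasible - mu_f) / sd_f
    ei = (best_feasible - mu_f) * norm.cdf(z) + sd_f * norm.pdf(z)
    p_feas = norm.cdf(-mu_c / sd_c)  # Pr(c(x) <= 0) under the constraint GP
    return ei * p_feas

# Toy posteriors over 5 candidate points (assumed, not fitted):
mu_f = np.array([0.2, -0.1, 0.4, 0.0, -0.3])
sd_f = np.full(5, 0.3)
mu_c = np.array([-1.0, 0.5, -0.2, 0.1, -0.5])
sd_c = np.full(5, 0.4)
scores = constrained_ei(mu_f, sd_f, mu_c, sd_c, best_feasible=0.0)
print("next evaluation: candidate", np.argmax(scores))
```

Because the objective and constraint contributions factor apart here, each black box could in principle be queried on its own schedule, which is the decoupled setting the abstract emphasizes.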


arXiv: Machine Learning | 2016

Patterns of Scalable Bayesian Inference

Elaine Angelino; Matthew J. Johnson; Ryan P. Adams

Datasets are growing not just in size but in complexity, creating a demand for rich models and quantification of uncertainty. Bayesian methods are an excellent fit for this demand, but scaling Bayesian inference is a challenge. In response to this challenge, there has been considerable recent work based on varying assumptions about model structure, underlying computational resources, and the importance of asymptotic correctness. As a result, there is a zoo of ideas with few clear overarching principles. In this paper, we seek to identify unifying principles, patterns, and intuitions for scaling Bayesian inference. We review existing work on utilizing modern computing resources with both MCMC and variational approximation techniques. From this taxonomy of ideas, we characterize the general principles that have proven successful for designing scalable inference procedures and comment on the path forward.
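
One concrete pattern from this space is subsampling-based MCMC. Below is a minimal stochastic gradient Langevin dynamics (SGLD) sketch for the mean of a Gaussian, where each update touches only a minibatch and rescales its gradient to the full dataset; the model, step size, and burn-in are illustrative assumptions, not recommendations from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=10_000)  # large dataset, unknown mean theta
N, batch = data.size, 100

theta, eps, samples = 0.0, 1e-5, []
for step in range(2000):
    mb = rng.choice(data, size=batch, replace=False)
    # Gradient of the log posterior (flat prior, unit-variance likelihood),
    # rescaled from the minibatch to the full dataset.
    grad = (N / batch) * np.sum(mb - theta)
    theta += 0.5 * eps * grad + np.sqrt(eps) * rng.standard_normal()
    samples.append(theta)

print("posterior mean estimate:", np.mean(samples[1000:]))
```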


International Conference on Machine Learning | 2009

Archipelago: nonparametric Bayesian semi-supervised learning

Ryan P. Adams; Zoubin Ghahramani

Semi-supervised learning (SSL) is classification where additional unlabeled data can be used to improve accuracy. Generative approaches are appealing in this situation, as a model of the data's probability density can assist in identifying clusters. Nonparametric Bayesian methods, while ideal in theory due to their principled motivations, have been difficult to apply to SSL in practice. We present a nonparametric Bayesian method that uses Gaussian processes for the generative model, avoiding many of the problems associated with Dirichlet process mixture models. Our model is fully generative and we take advantage of recent advances in Markov chain Monte Carlo algorithms to provide a practical inference method. Our method compares favorably to competing approaches on synthetic and real-world multi-class data.
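
Archipelago itself builds a fully generative GP model; as a much simpler illustration of the generative SSL principle it rests on (model the density of all inputs to find clusters, then let a few labels name them), here is a cluster-then-label sketch with a Gaussian mixture. This is a stand-in, not the paper's method, and the synthetic data and component count are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (200, 2)),   # mostly unlabeled data
               rng.normal(+2, 0.5, (200, 2))])
labeled_idx, labels = [0, 200], [0, 1]          # one label per class

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
comp = gmm.predict(X)                           # density-based clusters
# Map each mixture component to the class of its labeled representative.
comp_to_class = {comp[i]: y for i, y in zip(labeled_idx, labels)}
y_pred = np.array([comp_to_class[c] for c in comp])

true = np.repeat([0, 1], 200)
print("agreement with ground truth:", (y_pred == true).mean())
```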

Collaboration


Dive into Ryan P. Adams's collaborations.

Top Co-Authors

Iain Murray
University of Edinburgh

Hugo Larochelle
Université de Sherbrooke