
Publications


Featured research published by Sinead A. Williamson.


International Conference on Machine Learning | 2008

Statistical models for partial membership

Katherine A. Heller; Sinead A. Williamson; Zoubin Ghahramani

We present a principled Bayesian framework for modeling partial memberships of data points to clusters. Unlike a standard mixture model which assumes that each data point belongs to one and only one mixture component, or cluster, a partial membership model allows data points to have fractional membership in multiple clusters. Algorithms which assign data points partial memberships to clusters can be useful for tasks such as clustering genes based on microarray data (Gasch & Eisen, 2002). Our Bayesian Partial Membership Model (BPM) uses exponential family distributions to model each cluster, and a product of these distributions, with weighted parameters, to model each data point. Here the weights correspond to the degree to which the data point belongs to each cluster. All parameters in the BPM are continuous, so we can use Hybrid Monte Carlo to perform inference and learning. We discuss relationships between the BPM and Latent Dirichlet Allocation, Mixed Membership models, Exponential Family PCA, and fuzzy clustering. Lastly, we show some experimental results and discuss nonparametric extensions to our model.
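The weighted-product construction is concrete in the Gaussian case: a product of exponential-family densities raised to membership weights stays in the family, and for one-dimensional Gaussians the result is a Gaussian with precision-weighted natural parameters. A minimal sketch (our own illustration, with made-up numbers; not the paper's code):

```python
import numpy as np

def bpm_effective_gaussian(mus, lambdas, pis):
    """Effective density of a weighted product of 1-D Gaussian clusters.

    The product prod_k N(x; mu_k, 1/lambda_k)^{pi_k} is proportional
    to a Gaussian whose natural parameters are the pi-weighted sums
    of the clusters' natural parameters.
    """
    lam = np.dot(pis, lambdas)              # combined precision
    mu = np.dot(pis, lambdas * mus) / lam   # precision-weighted mean
    return mu, lam

# A point with 70/30 partial membership in two unit-precision clusters:
mu, lam = bpm_effective_gaussian(np.array([0.0, 4.0]),
                                 np.array([1.0, 1.0]),
                                 np.array([0.7, 0.3]))
# mu = 1.2, lam = 1.0
```

As the weights shift, the point's effective distribution interpolates smoothly between the clusters, which is what makes all parameters continuous and Hybrid Monte Carlo applicable.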


SIAM International Conference on Data Mining | 2013

A nonparametric mixture model for topic modeling over time

Avinava Dubey; Ahmed Hefny; Sinead A. Williamson; Eric P. Xing

A single, stationary topic model such as latent Dirichlet allocation is inappropriate for modeling corpora that span long time periods, as the popularity of topics is likely to change over time. A number of models that incorporate time have been proposed, but in general they either exhibit limited forms of temporal variation, or require computationally expensive inference methods. In this paper we propose nonparametric Topics over Time (npTOT), a model for time-varying topics that allows an unbounded number of topics and a flexible distribution over the temporal variations in those topics’ popularity. We develop a collapsed Gibbs sampler for the proposed model and compare against existing models on synthetic and real document sets.
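In Topics-over-Time-style models, each topic carries a Beta distribution over normalized document timestamps, so a topic's popularity can rise and fall across the corpus's time span. A hedged sketch of fitting such a per-topic Beta by the method of moments (the function and sample timestamps are ours, for illustration only):

```python
import numpy as np

def beta_mom(t):
    """Method-of-moments Beta(a, b) fit to timestamps in (0, 1).

    Given the timestamps of documents assigned to one topic, the fit
    summarizes when that topic was popular: a > b means mass leans
    toward the end of the corpus's time span.
    """
    m, v = t.mean(), t.var()
    common = m * (1 - m) / v - 1
    return m * common, (1 - m) * common

# Timestamps of documents assigned to a hypothetical topic that
# peaks late in the corpus:
a, b = beta_mom(np.array([0.6, 0.7, 0.8, 0.85, 0.9]))
# a > b: mass concentrated toward the end of the time span
```

npTOT's contribution is to make the number of such topics unbounded and the temporal distributions more flexible; the sketch above only shows the basic timestamp-per-topic idea.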


IEEE Transactions on Pattern Analysis and Machine Intelligence | 2015

A Survey of Non-Exchangeable Priors for Bayesian Nonparametric Models

Nicholas J. Foti; Sinead A. Williamson

Dependent nonparametric processes extend distributions over measures, such as the Dirichlet process and the beta process, to give distributions over collections of measures, typically indexed by values in some covariate space. Such models are appropriate priors when exchangeability assumptions do not hold, and instead we want our model to vary fluidly with some set of covariates. Since the concept of dependent nonparametric processes was formalized by MacEachern, there have been a number of models proposed and used in the statistics and machine learning literatures. Many of these models exhibit underlying similarities, an understanding of which, we hope, will help in selecting an appropriate prior, developing new models, and leveraging inference techniques.


Statistics and Computing | 2017

Restricted Indian buffet processes

Finale Doshi-Velez; Sinead A. Williamson

Latent feature models are a powerful tool for modeling data with globally-shared features. Nonparametric distributions over exchangeable sets of features, such as the Indian Buffet Process, offer modeling flexibility by letting the number of latent features be unbounded. However, current models impose implicit distributions over the number of latent features per data point, and these implicit distributions may not match our knowledge about the data. In this work, we demonstrate how the restricted Indian buffet process circumvents this restriction, allowing arbitrary distributions over the number of features in an observation. We discuss several alternative constructions of the model and apply the insights to develop Markov Chain Monte Carlo and variational methods for simulation and posterior inference.
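The "implicit distributions over the number of latent features per data point" can be seen by simulating the standard IBP restaurant process, under which each point's feature count is marginally Poisson(alpha). A sketch of that simulation (our own code, not the paper's):

```python
import numpy as np

def ibp_row_counts(alpha, n_rows, rng):
    """Feature counts per data point under the standard IBP.

    Customer i takes each previously sampled dish k with probability
    m_k / i, then orders Poisson(alpha / i) new dishes.  By
    exchangeability, each customer's total is marginally
    Poisson(alpha): the implicit per-point distribution that the
    restricted IBP replaces with an arbitrary one.
    """
    m = []                       # m[k]: customers who took dish k so far
    per_row = []
    for i in range(1, n_rows + 1):
        row = 0
        for k in range(len(m)):
            if rng.random() < m[k] / i:
                m[k] += 1
                row += 1
        new = rng.poisson(alpha / i)
        m.extend([1] * new)
        per_row.append(row + new)
    return per_row

rng = np.random.default_rng(0)
last = [ibp_row_counts(3.0, 5, rng)[-1] for _ in range(2000)]
# the last customer's count is marginally Poisson(3), so the
# average over independent restaurants concentrates near 3
```

If prior knowledge says each observation should have, say, exactly two active features, the Poisson marginal above is a mismatch; the restricted IBP is designed to substitute an arbitrary per-point count distribution.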


Machine Learning | 2018

A scalable preference model for autonomous decision-making

Markus Peters; Maytal Saar-Tsechansky; Wolfgang Ketter; Sinead A. Williamson; Perry Groot; Tom Heskes

Emerging domains such as smart electric grids require decisions to be made autonomously, based on the observed behaviors of large numbers of connected consumers. Existing approaches either lack the flexibility to capture nuanced, individualized preference profiles, or scale poorly with the size of the dataset. We propose a preference model that combines flexible Bayesian nonparametric priors—providing state-of-the-art predictive power—with well-justified structural assumptions that allow a scalable implementation. The Gaussian process scalable preference model via Kronecker factorization (GaSPK) provides accurate choice predictions and principled uncertainty estimates as input to decision-making tasks. In consumer choice settings where alternatives are described by few key attributes, inference in our model is highly efficient and scalable to tens of thousands of choices.
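Kronecker-factored Gaussian process covariances owe their scalability to the identity (A ⊗ B) vec(X) = vec(B X Aᵀ), which lets one multiply by the full covariance without ever forming it. A generic sketch of that algebra (an illustration of the standard trick, not the GaSPK implementation):

```python
import numpy as np

def kron_matvec(A, B, x):
    """Compute (A kron B) @ x without forming the Kronecker product.

    Uses (A kron B) vec(X) = vec(B X A.T) with column-major vec,
    reducing the cost from O(n^2 m^2) to O(n m (n + m)) -- the
    workhorse behind Kronecker-structured GP covariances.
    """
    n, m = A.shape[0], B.shape[0]
    X = x.reshape(n, m).T          # undo column-major vec
    return (B @ X @ A.T).T.reshape(-1)

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((4, 4))
x = rng.standard_normal(12)
assert np.allclose(kron_matvec(A, B, x), np.kron(A, B) @ x)
```

With this primitive, iterative solvers and eigendecompositions over the factors replace operations on the prohibitively large full covariance, which is what makes inference over tens of thousands of choices tractable.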


International Conference on Machine Learning | 2010

The IBP Compound Dirichlet Process and its Application to Focused Topic Modeling

Sinead A. Williamson; Chong Wang; Katherine A. Heller; David M. Blei


International Conference on Machine Learning | 2013

Parallel Markov Chain Monte Carlo for Nonparametric Mixture Models

Sinead A. Williamson; Avinava Dubey; Eric P. Xing


Neural Information Processing Systems | 2016

Variance Reduction in Stochastic Gradient Langevin Dynamics

Kumar Avinava Dubey; Sashank J. Reddi; Sinead A. Williamson; Barnabás Póczos; Alexander J. Smola; Eric P. Xing


Journal of Machine Learning Research | 2016

Nonparametric network models for link prediction

Sinead A. Williamson


International Conference on Machine Learning | 2012

Modeling Images using Transformed Indian Buffet Processes

Ke Zhai; Yuening Hu; Sinead A. Williamson; Jordan L. Boyd-Graber

Collaboration


Dive into Sinead A. Williamson's collaborations.

Top Co-Authors

Eric P. Xing (Carnegie Mellon University)
Avinava Dubey (Carnegie Mellon University)
Maurice Diesendruck (University of Texas at Austin)
Chong Wang (Carnegie Mellon University)
Guy W. Cole (University of Texas at Austin)