Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Daniel M. Roy is active.

Publication


Featured research published by Daniel M. Roy.


Bernoulli | 2016

The combinatorial structure of beta negative binomial processes

Creighton Heaukulani; Daniel M. Roy

We characterize the combinatorial structure of conditionally-i.i.d. sequences of negative binomial processes with a common beta process base measure. In Bayesian nonparametric applications, such processes have served as models for latent multisets of features underlying data. Analogously, random subsets arise from conditionally-i.i.d. sequences of Bernoulli processes with a common beta process base measure, in which case the combinatorial structure is described by the Indian buffet process. Our results give a count analogue of the Indian buffet process, which we call a negative binomial Indian buffet process (NB-IBP). As an intermediate step toward this goal, we provide a construction for the beta negative binomial process that avoids a representation of the underlying beta process base measure. We describe the key Markov kernels needed to use an NB-IBP representation in a Markov chain Monte Carlo algorithm targeting a posterior distribution.
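
A crude finite-K sketch (an illustrative assumption, not the paper's construction, which explicitly avoids truncating the beta process): draw K candidate feature probabilities from an IBP-style finite beta approximation and give each data point a negative binomial count of each feature.

    import numpy as np

    def sample_bnbp_counts(n=5, K=50, alpha=2.0, r=1.0, seed=0):
        """Illustrative finite-K sketch of latent count matrices under a
        beta-negative binomial model (hypothetical helper, not the paper's code)."""
        rng = np.random.default_rng(seed)
        # Finite stand-in for a beta process: K features with small weights pi_k.
        pi = rng.beta(alpha / K, 1.0, size=K)
        # Each of the n data points draws an NB(r, pi_k) count of each feature;
        # NumPy's NB(n, p) counts failures before n successes, so p = 1 - pi_k
        # gives mean r * pi_k / (1 - pi_k).
        counts = rng.negative_binomial(r, 1.0 - pi, size=(n, K))
        # Report only features used at least once, as in IBP-style summaries.
        return counts[:, counts.sum(axis=0) > 0]

    Z = sample_bnbp_counts()
    print(Z.shape, Z.sum())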


Mathematical Structures in Computer Science | 2017

On computability and disintegration

Nathanael Leedom Ackerman; Cameron E. Freer; Daniel M. Roy

We show that the disintegration operator on a complete separable metric space along a projection map, restricted to measures for which there is a unique continuous disintegration, is strongly Weihrauch equivalent to the limit operator Lim. When a measure does not have a unique continuous disintegration, we may still obtain a disintegration when some basis of continuity sets has the Vitali covering property with respect to the measure; the disintegration, however, may depend on the choice of sets. We show that, when the basis is computable, the resulting disintegration is strongly Weihrauch reducible to Lim, and further exhibit a single distribution realizing this upper bound.
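
For context, the object being computed is the standard measure-theoretic disintegration (notation ours, not the paper's): given a probability measure \mu on X and a measurable map f : X \to Y with pushforward \nu = f_*\mu, a disintegration is a kernel y \mapsto \mu_y with each \mu_y concentrated on f^{-1}(y) and

    \mu(A) = \int_Y \mu_y(A) \, \nu(dy)   for every measurable A \subseteq X,

i.e., the family of conditional distributions of \mu given the value of f.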


Uncertainty in Artificial Intelligence | 2016

The Mondrian kernel

Matej Balog; Balaji Lakshminarayanan; Zoubin Ghahramani; Daniel M. Roy; Yee Whye Teh

We introduce the Mondrian kernel, a fast random feature approximation to the Laplace kernel. It is suitable for both batch and online learning, and admits a fast kernel-width-selection procedure as the random features can be re-used efficiently for all kernel widths. The features are constructed by sampling trees via a Mondrian process [Roy and Teh, 2009], and we highlight the connection to Mondrian forests [Lakshminarayanan et al., 2014], where trees are also sampled via a Mondrian process, but fit independently. This link provides a new insight into the relationship between kernel methods and random forests.
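
A simplified one-dimensional illustration of the random-feature idea (the helper names are ours; the paper builds full Mondrian trees over a box and reuses them across kernel widths): in 1D a Mondrian sample with lifetime lambda reduces to a Poisson(lambda) set of cut points, and two points land in the same cell with probability exp(-lambda * |x - y|), exactly the Laplace kernel.

    import numpy as np

    def mondrian_cells_1d(xs, lam, n_trees=2000, lo=0.0, hi=1.0, seed=0):
        """For each 'tree', cut [lo, hi] at Poisson(lam) uniform points and
        record which cell each x lands in (a one-hot feature per tree)."""
        rng = np.random.default_rng(seed)
        ids = np.empty((n_trees, len(xs)), dtype=int)
        for m in range(n_trees):
            cuts = np.sort(rng.uniform(lo, hi, size=rng.poisson(lam * (hi - lo))))
            ids[m] = np.searchsorted(cuts, xs)
        return ids

    x, y, lam = 0.2, 0.5, 3.0
    ids = mondrian_cells_1d(np.array([x, y]), lam)
    approx = np.mean(ids[:, 0] == ids[:, 1])       # random-feature inner product
    print(approx, np.exp(-lam * abs(x - y)))       # ~0.41 vs exact Laplace kernel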


Annals of Applied Probability | 2018

A characterization of product-form exchangeable feature probability functions

Marco Battiston; Stefano Favaro; Daniel M. Roy; Yee Whye Teh

We characterize the class of exchangeable feature allocations assigning probability $V_{n,k}\prod_{l=1}^{k}W_{m_{l}}U_{n-m_{l}}$ to a feature allocation of $n$ individuals, displaying $k$ features with counts $(m_{1},\ldots,m_{k})$ for these features.


arXiv: Artificial Intelligence | 2012

Towards common-sense reasoning via conditional simulation: legacies of Turing in Artificial Intelligence.

Cameron E. Freer; Daniel M. Roy; Joshua B. Tenenbaum


International Colloquium on Automata, Languages and Programming | 2018

The Beta-Bernoulli process and algebraic effects

Sam Staton; Dario Stein; Hongseok Yang; Nathanael Leedom Ackerman; Cameron E. Freer; Daniel M. Roy


Uncertainty in Artificial Intelligence | 2015

Training generative neural networks via maximum mean discrepancy optimization

Gintare Karolina Dziugaite; Daniel M. Roy; Zoubin Ghahramani
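
A minimal sketch of the training criterion, assuming a Gaussian kernel and the standard unbiased MMD^2 estimator (generic background, not the paper's exact training code): the generator is fit by driving this statistic between data and generated samples toward zero.

    import numpy as np

    def gaussian_kernel(A, B, bandwidth=1.0):
        # k(a, b) = exp(-||a - b||^2 / (2 * bandwidth^2))
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq / (2.0 * bandwidth ** 2))

    def mmd2_unbiased(X, Y, bandwidth=1.0):
        """Unbiased estimate of squared maximum mean discrepancy between
        samples X (data) and Y (model outputs)."""
        m, n = len(X), len(Y)
        Kxx, Kyy, Kxy = (gaussian_kernel(X, X, bandwidth),
                         gaussian_kernel(Y, Y, bandwidth),
                         gaussian_kernel(X, Y, bandwidth))
        return ((Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
                + (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
                - 2.0 * Kxy.mean())

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))                  # "data"
    Y = rng.normal(loc=0.5, size=(100, 2))         # "generated" samples
    print(mmd2_unbiased(X, Y))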


Neural Information Processing Systems | 2008

The Mondrian Process.

Daniel M. Roy; Yee Whye Teh
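
A short sketch of the generative process as it is usually described (recursive axis-aligned cuts with an exponential time budget); the flat list-of-cells representation and names are illustrative choices, not the paper's code.

    import numpy as np

    def sample_mondrian(box, budget, rng):
        """Partition `box` (a list of (low, high) intervals, one per dimension)
        with a Mondrian process of lifetime `budget`; returns the leaf cells."""
        lengths = np.array([hi - lo for lo, hi in box])
        cost = rng.exponential(1.0 / lengths.sum())      # time until the next cut
        if cost > budget:
            return [box]                                 # budget exhausted: leaf
        d = rng.choice(len(box), p=lengths / lengths.sum())  # dim ~ side length
        lo, hi = box[d]
        cut = rng.uniform(lo, hi)
        left, right = box.copy(), box.copy()
        left[d], right[d] = (lo, cut), (cut, hi)
        return (sample_mondrian(left, budget - cost, rng)
                + sample_mondrian(right, budget - cost, rng))

    rng = np.random.default_rng(0)
    cells = sample_mondrian([(0.0, 1.0), (0.0, 1.0)], budget=3.0, rng=rng)
    print(len(cells), "cells")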


Uncertainty in Artificial Intelligence | 2017

Computing Nonvacuous Generalization Bounds for Deep (Stochastic) Neural Networks with Many More Parameters than Training Data.

Gintare Karolina Dziugaite; Daniel M. Roy
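
A hedged sketch of the kind of computation involved: a PAC-Bayes-kl statement bounds the binary KL divergence between the empirical and true error of the stochastic network, and the bound on the true error is recovered by numerical inversion. The constants below follow one common Langford-style form and are an assumption, not the paper's final theorem.

    import numpy as np

    def binary_kl(q, p, eps=1e-12):
        # kl(q || p) between Bernoulli(q) and Bernoulli(p)
        q, p = min(max(q, eps), 1 - eps), min(max(p, eps), 1 - eps)
        return q * np.log(q / p) + (1 - q) * np.log((1 - q) / (1 - p))

    def pac_bayes_error_bound(emp_err, kl_qp, m, delta=0.05):
        """Largest p with kl(emp_err || p) <= (kl_qp + ln(2*sqrt(m)/delta)) / m,
        found by bisection; with probability 1 - delta this upper-bounds the
        true error of the randomized classifier (assumed bound form)."""
        rhs = (kl_qp + np.log(2.0 * np.sqrt(m) / delta)) / m
        lo, hi = emp_err, 1.0
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            lo, hi = (lo, mid) if binary_kl(emp_err, mid) > rhs else (mid, hi)
        return hi

    # e.g. 3% empirical error, KL(Q||P) = 5000 nats, 55000 training examples
    print(pac_bayes_error_bound(0.03, 5000.0, 55000))   # nonvacuous (< 1) bound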


arXiv: Computer Vision and Pattern Recognition | 2016

A study of the effect of JPG compression on adversarial images.

Gintare Karolina Dziugaite; Zoubin Ghahramani; Daniel M. Roy

Collaboration


Dive into Daniel M. Roy's collaborations.

Top Co-Authors

Cameron E. Freer

Massachusetts Institute of Technology

Joshua B. Tenenbaum

Massachusetts Institute of Technology