Daniel M. Roy
University of Toronto
Publications
Featured research published by Daniel M. Roy.
Bernoulli | 2016
Creighton Heaukulani; Daniel M. Roy
We characterize the combinatorial structure of conditionally-i.i.d. sequences of negative binomial processes with a common beta process base measure. In Bayesian nonparametric applications, such processes have served as models for latent multisets of features underlying data. Analogously, random subsets arise from conditionally-i.i.d. sequences of Bernoulli processes with a common beta process base measure, in which case the combinatorial structure is described by the Indian buffet process. Our results give a count analogue of the Indian buffet process, which we call the negative binomial Indian buffet process (NB-IBP). As an intermediate step toward this goal, we provide a construction for the beta negative binomial process that avoids a representation of the underlying beta process base measure. We describe the key Markov kernels needed to use an NB-IBP representation in a Markov chain Monte Carlo algorithm targeting a posterior distribution.
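For context, the Bernoulli-process case mentioned in the abstract corresponds to the standard Indian buffet process, whose generative scheme is well known: customer i takes each existing dish k with probability m_k / i and then tries a Poisson(α / i) number of new dishes. A minimal sketch of that standard (Bernoulli) IBP sampler follows; it is not the paper's negative binomial variant, and the function name and interface are illustrative.

```python
import numpy as np

def indian_buffet_process(n_customers, alpha, rng):
    """Sample a binary feature allocation from the standard Indian
    buffet process: customer i takes existing dish k with probability
    m_k / i, then tries Poisson(alpha / i) brand-new dishes."""
    dish_counts = []   # m_k: how many customers have taken dish k
    allocations = []   # one list of dish indices per customer
    for i in range(1, n_customers + 1):
        # take each existing dish with probability m_k / i
        dishes = [k for k, m in enumerate(dish_counts)
                  if rng.random() < m / i]
        # sample new dishes for this customer
        for _ in range(rng.poisson(alpha / i)):
            dishes.append(len(dish_counts))
            dish_counts.append(0)
        for k in dishes:
            dish_counts[k] += 1
        allocations.append(dishes)
    return allocations, dish_counts

rng = np.random.default_rng(1)
alloc, counts = indian_buffet_process(50, 2.0, rng)
```

The expected total number of dishes after n customers is α times the n-th harmonic number, one of the exchangeability properties the NB-IBP generalizes to counts.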
Mathematical Structures in Computer Science | 2017
Nathanael Leedom Ackerman; Cameron E. Freer; Daniel M. Roy
We show that the disintegration operator on a complete separable metric space along a projection map, restricted to measures for which there is a unique continuous disintegration, is strongly Weihrauch equivalent to the limit operator Lim. When a measure does not have a unique continuous disintegration, we may still obtain a disintegration when some basis of continuity sets has the Vitali covering property with respect to the measure; the disintegration, however, may depend on the choice of sets. We show that, when the basis is computable, the resulting disintegration is strongly Weihrauch reducible to Lim, and further exhibit a single distribution realizing this upper bound.
Uncertainty in Artificial Intelligence | 2016
Matej Balog; Balaji Lakshminarayanan; Zoubin Ghahramani; Daniel M. Roy; Yee Whye Teh
We introduce the Mondrian kernel, a fast random feature approximation to the Laplace kernel. It is suitable for both batch and online learning, and admits a fast kernel-width-selection procedure as the random features can be re-used efficiently for all kernel widths. The features are constructed by sampling trees via a Mondrian process [Roy and Teh, 2009], and we highlight the connection to Mondrian forests [Lakshminarayanan et al., 2014], where trees are also sampled via a Mondrian process, but fit independently. This link provides a new insight into the relationship between kernel methods and random forests.
Annals of Applied Probability | 2018
Marco Battiston; Stefano Favaro; Daniel M. Roy; Yee Whye Teh
We characterize the class of exchangeable feature allocations assigning probability $V_{n,k}\prod_{l=1}^{k}W_{m_{l}}U_{n-m_{l}}$ to a feature allocation of $n$ individuals displaying $k$ features with counts $m_{1},\dots,m_{k}$.
arXiv: Artificial Intelligence | 2012
Cameron E. Freer; Daniel M. Roy; Joshua B. Tenenbaum
International Colloquium on Automata, Languages and Programming | 2018
Sam Staton; Dario Stein; Hongseok Yang; Nathanael Leedom Ackerman; Cameron E. Freer; Daniel M. Roy
Uncertainty in Artificial Intelligence | 2015
Gintare Karolina Dziugaite; Daniel M. Roy; Zoubin Ghahramani
Presented at: UNSPECIFIED. (2008) | 2008
Daniel M. Roy; Yee Whye Teh
Uncertainty in Artificial Intelligence | 2017
Gintare Karolina Dziugaite; Daniel M. Roy
arXiv: Computer Vision and Pattern Recognition | 2016
Gintare Karolina Dziugaite; Zoubin Ghahramani; Daniel M. Roy