Stephen DeSalvo
University of California, Los Angeles
Publications
Featured research published by Stephen DeSalvo.
Combinatorics, Probability & Computing | 2016
Richard Arratia; Stephen DeSalvo
We propose a new method, probabilistic divide-and-conquer, for improving the success probability in rejection sampling. For the example of integer partitions, there is an ideal recursive scheme which improves the rejection cost from asymptotically order $n^{3/4}$ to a constant. We show other examples for which a non-recursive, one-time application of probabilistic divide-and-conquer removes a substantial fraction of the rejection sampling cost. We also present a variation of probabilistic divide-and-conquer for generating i.i.d. samples that exploits features of the coupon collector's problem, in order to obtain a cost that is sublinear in the number of samples.
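The paper's ideal recursive scheme is more involved, but the basic idea admits a short illustration. Below is a minimal Python sketch, not taken from the paper, of a one-time probabilistic divide-and-conquer split for sampling a uniform partition of n: it assumes Fristedt's representation (the multiplicity of part size i is geometric with parameter 1 - x^i, conditioned on the parts summing to n) and uses the number of parts of size 1 as a deterministic "second half". The function name and tilting parameter are illustrative choices.

```python
import math
import random

def pdc_partition(n, rng=random.Random()):
    """Sample a uniform random partition of n as multiplicities m[i]
    (= number of parts of size i) via a one-time PDC split: parts of
    size >= 2 form the first half; the count of 1's is deterministic."""
    x = math.exp(-math.pi / math.sqrt(6 * n))  # Fristedt's tilting parameter
    while True:
        m = [0] * (n + 1)
        total = 0
        # First half: independent geometrics, P(Z_i = k) = (1 - x**i) * x**(i*k).
        for i in range(2, n + 1):
            k = 0
            while rng.random() < x ** i:
                k += 1
            m[i] = k
            total += i * k
        r = n - total  # parts of size 1 needed to reach n exactly
        # Second half: accept with probability P(Z_1 = r) / max_k P(Z_1 = k) = x**r,
        # then fill in the 1's deterministically; otherwise restart.
        if r >= 0 and rng.random() < x ** r:
            m[1] = r
            return m

# Example: one partition of 30, listed as (part size, multiplicity) pairs.
print([(i, k) for i, k in enumerate(pdc_partition(30)) if k > 0])
```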
Proceedings of the Royal Society of London A: Mathematical, Physical and Engineering Sciences | 2010
Paul K. Newton; Stephen DeSalvo
We study properties of an ensemble of Sudoku matrices (a special type of doubly stochastic matrix when normalized) using their statistically averaged singular values. The determinants are very nearly Cauchy distributed about the origin. The largest singular value is 45 (the common row and column sum), while the others decrease approximately linearly. The normalized singular values (obtained by dividing each singular value by the sum of all nine singular values) are then used to calculate the average Shannon entropy of the ensemble, a measure of the distribution of ‘energy’ among the singular modes, interpreted as a measure of the disorder of a typical matrix. We show the Shannon entropy of the ensemble to be 1.7331±0.0002, which is slightly lower than for an ensemble of 9×9 Latin squares, but higher than for a certain collection of 9×9 random matrices used for comparison. Using the notion of relative entropy or Kullback–Leibler divergence, which gives a measure of how one distribution differs from another, we show that the relative entropy between the ensemble of Sudoku matrices and Latin squares is of the order of $10^{-5}$. By contrast, the relative entropy between Sudoku matrices and the collection of random matrices is much higher, of the order of $10^{-3}$, with the Shannon entropy of the Sudoku matrices having a better distribution among the modes. We finish by ‘reconstituting’ the ‘average’ Sudoku matrix from its averaged singular components.
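As a rough companion to the quantities above, here is a small NumPy sketch that computes the singular values and the Shannon entropy of the normalized singular values for a single, deterministically constructed Sudoku matrix. The paper's figures are averages over an ensemble of random Sudoku matrices, which this sketch does not attempt to reproduce; the grid construction and the use of natural logarithms are assumptions.

```python
import numpy as np

# One valid completed Sudoku grid, built from a standard cyclic pattern.
S = np.array([[(3 * (r % 3) + r // 3 + c) % 9 + 1 for c in range(9)]
              for r in range(9)], dtype=float)

sv = np.linalg.svd(S, compute_uv=False)   # singular values, largest first
# The largest singular value is 45, the common row and column sum.
p = sv / sv.sum()                         # normalized singular values
shannon = -np.sum(p * np.log(p))          # Shannon entropy of this one matrix
print(sv.round(3), shannon)
```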
Advances in Applied Mathematics | 2018
Stephen DeSalvo
We present a probabilistic divide-and-conquer (PDC) method for exact sampling of conditional distributions of the form $\mathcal{L}( {\bf X}\, |\, {\bf X} \in E)$, where ${\bf X}$ is a random variable on $\mathcal{X}$ …
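A toy illustration, not from the paper, of the exact-sampling idea: split X into two independent pieces, sample the first piece from its conditional marginal given the event, then the second piece from its conditional law given the first piece and the event. Here E = {A + B = 7} for two independent fair dice, so both stages are easy to write down.

```python
import random

def pdc_sample_sum_seven(rng=random.Random()):
    """Exact sample from L((A, B) | A + B = 7) for independent fair dice.
    Stage 1: A ~ L(A | A + B = 7), which is uniform on {1, ..., 6}.
    Stage 2: B ~ L(B | A = a, A + B = 7), a point mass at 7 - a."""
    a = rng.randint(1, 6)
    return a, 7 - a

print(pdc_sample_sum_seven())
```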
Random Structures and Algorithms | 2018
Harry Crane; Stephen DeSalvo; Sergi Elizalde
Electronic Notes in Discrete Mathematics | 2017
Stephen DeSalvo
Algorithmica | 2017
Stephen DeSalvo
Annals of Combinatorics | 2017
Richard Arratia; Stephen DeSalvo
Advances in Applied Probability | 2015
Richard Arratia; Stephen DeSalvo
Ramanujan Journal | 2015
Stephen DeSalvo; Igor Pak
Annals of Combinatorics | 2013
Richard Arratia; Stephen DeSalvo