Stephen G. Walker
University of Texas at Austin
Publications
Featured research published by Stephen G. Walker.
Communications in Statistics - Simulation and Computation | 2007
Stephen G. Walker
We provide a new approach to sampling from the well-known mixture of Dirichlet process model. Recent attention has focused on retaining the random distribution function in the model, but sampling algorithms have then suffered from the countably infinite representation of these distributions. The key to the algorithm detailed in this paper, which also keeps the random distribution function, is the introduction of a latent variable that allows a known, finite number of objects to be sampled within each iteration of a Gibbs sampler.
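To make the latent-variable idea concrete, the sketch below codes one allocation sweep of a slice sampler for a Dirichlet process mixture of normals. The normal kernel, the fixed truncation of the stick-breaking weights, and all names are illustrative assumptions for exposition, not the paper's implementation; the point is that, given the slice variable u_i, only finitely many components can be sampled.

```python
import numpy as np

rng = np.random.default_rng(0)

def slice_allocation_step(y, d, w, theta, sigma=1.0):
    """One allocation sweep of a slice sampler for a DP mixture of normals.

    y: data, d: current allocations, w: stick-breaking weights (a finite
    truncation assumed long enough to cover the slice levels), theta:
    component means.  Each observation introduces a latent slice variable
    u_i ~ U(0, w_{d_i}); only the finitely many components with w_j > u_i
    can then be sampled, so no truncation error is incurred.
    """
    d_new = d.copy()
    for i in range(len(y)):
        u = rng.uniform(0.0, w[d[i]])            # latent slice variable
        active = np.flatnonzero(w > u)           # finite candidate set
        logk = -0.5 * ((y[i] - theta[active]) / sigma) ** 2
        p = np.exp(logk - logk.max())
        d_new[i] = active[rng.choice(len(active), p=p / p.sum())]
    return d_new
```

In a full sampler this sweep would alternate with updates of the weights, atoms, and slice variables; here it only demonstrates how the latent u reduces an infinite mixture to a finite sampling problem.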
Annals of Statistics | 2004
Stephen G. Walker
We use martingales to study Bayesian consistency. We derive sufficient conditions for both Hellinger and Kullback-Leibler consistency, which do not rely on the use of a sieve. Alternative sufficient conditions for Hellinger consistency are also found and demonstrated on examples.
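For readers unfamiliar with the two notions of consistency, the standard definitions can be fixed as follows (assuming densities with respect to a common dominating measure and a true density f_0; this notation is standard rather than taken from the paper):

```latex
H(f, f_0) \;=\; \Bigl( \int \bigl(\sqrt{f} - \sqrt{f_0}\,\bigr)^2 \Bigr)^{1/2},
\qquad
K(f_0, f) \;=\; \int f_0 \log \frac{f_0}{f}.
```

Hellinger consistency then means that the posterior mass \(\Pi_n(\{f : H(f, f_0) > \varepsilon\})\) converges to zero almost surely for every \(\varepsilon > 0\), where \(\Pi_n\) denotes the posterior after \(n\) observations.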
Journal of Computational and Graphical Statistics | 2001
Paul Damien; Stephen G. Walker
We consider the Bayesian analysis of constrained parameter and truncated data problems within a Gibbs sampling framework and concentrate on sampling truncated densities that arise as full conditional densities within the context of the Gibbs sampler. In particular, we restrict attention to the normal, beta, and gamma densities. We demonstrate that, in many instances, it is possible to introduce a latent variable which facilitates an easy solution to the problem. We also discuss a novel approach to sampling truncated densities via a “black-box” algorithm, based on the latent variable idea, valid outside of the context of a Gibbs sampler.
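A minimal sketch of the latent-variable trick for a truncated standard normal, one of the three densities the paper restricts attention to. The joint density f(x, u) ∝ 1(0 < u < exp(-x²/2)) 1(a ≤ x ≤ b) has uniform full conditionals in both directions, so the Gibbs sampler needs no rejection step. The function name and defaults are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def truncated_normal_gibbs(a, b, n_iter=5000, x0=None):
    """Gibbs sampler for a standard normal truncated to [a, b], using the
    latent-variable joint  f(x, u) ∝ 1(0 < u < exp(-x²/2)) 1(a ≤ x ≤ b).
    Both full conditionals are uniform distributions."""
    x = np.clip(0.0 if x0 is None else x0, a, b)
    out = np.empty(n_iter)
    for t in range(n_iter):
        u = rng.uniform(0.0, np.exp(-0.5 * x * x))   # u | x is uniform
        r = np.sqrt(-2.0 * np.log(u))                # x | u is uniform on
        x = rng.uniform(max(a, -r), min(b, r))       # [a, b] ∩ [-r, r]
        out[t] = x
    return out
```

Because u < exp(-x²/2) forces r > |x|, the interval [a, b] ∩ [-r, r] always contains the current x and is never empty.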
IEEE Transactions on Visualization and Computer Graphics | 2011
Jonathan M. Stott; Peter Rodgers; Juan Carlos Martínez-Ovando; Stephen G. Walker
This paper describes an automatic mechanism for drawing metro maps. We apply multicriteria optimization to find effective placement of stations with a good line layout and to label the map unambiguously. A number of metrics are defined, which are used in a weighted sum to find a fitness value for a layout of the map. A hill climbing optimizer is used to reduce the fitness value, and find improved map layouts. To avoid local minima, we apply clustering techniques to the map: the hill climber moves both stations and clusters when finding improved layouts. We show the method applied to a number of metro maps, and describe an empirical study that provides some quantitative evidence that automatically drawn metro maps can help users to find routes more efficiently than either published maps or undistorted maps. Moreover, we have found that, in these cases, study subjects indicate a preference for automatically drawn maps over the alternatives.
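The weighted-sum-plus-hill-climbing loop can be sketched as below. The two metrics (total edge length and deviation from octilinear, 45-degree directions) are toy stand-ins assumed for illustration; the paper defines its own, larger metric set, and this sketch omits the clustering moves.

```python
import numpy as np

rng = np.random.default_rng(2)

def fitness(pos, edges, weights=(1.0, 0.5)):
    """Weighted sum of two toy layout metrics: total edge length and
    deviation of each edge from the nearest multiple of 45 degrees."""
    w_len, w_oct = weights
    total = 0.0
    for i, j in edges:
        d = pos[j] - pos[i]
        length = np.hypot(d[0], d[1])
        angle = np.arctan2(d[1], d[0])
        oct_dev = abs(((angle + np.pi / 8) % (np.pi / 4)) - np.pi / 8)
        total += w_len * length + w_oct * oct_dev
    return total

def hill_climb(pos, edges, n_sweeps=50, step=1.0):
    """Move one station at a time to the best of its 8 grid neighbours,
    accepting a move only if the weighted fitness decreases."""
    moves = [(dx, dy) for dx in (-step, 0, step)
                      for dy in (-step, 0, step) if (dx, dy) != (0, 0)]
    best = fitness(pos, edges)
    for _ in range(n_sweeps):
        improved = False
        for s in range(len(pos)):
            for dx, dy in moves:
                trial = pos.copy()
                trial[s] = trial[s] + (dx, dy)
                f = fitness(trial, edges)
                if f < best:
                    pos, best, improved = trial, f, True
        if not improved:
            break
    return pos, best
```

The greedy per-station moves are exactly where local minima arise, which motivates the paper's addition of cluster-level moves.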
Journal of The Royal Statistical Society Series B-statistical Methodology | 2001
Stephen G. Walker; Nils Lid Hjort
We consider a sequence of posterior distributions based on a data-dependent prior (which we shall refer to as pseudo-posterior distributions) and establish simple conditions under which the sequence is Hellinger consistent. It is shown how investigations into these pseudo-posteriors assist with the understanding of some true posterior distributions, including Pólya trees, the infinite-dimensional exponential family and mixture models.
Scandinavian Journal of Statistics | 2002
Luis E. Nieto-Barajas; Stephen G. Walker
This paper generalizes the discrete time independent increment beta process of Hjort (1990), for modelling discrete failure times, and also generalizes the independent gamma process for modelling piecewise constant hazard rates (Walker and Mallick, 1997). The generalizations are from independent increment to Markov increment prior processes allowing the modelling of smoothness. We derive posterior distributions and undertake a full Bayesian analysis.
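One standard way to build a Markov chain of hazard levels with Gamma(α, β) marginals, in the spirit of the Markov gamma process described here, uses a latent Poisson count between increments. The specific chain below is an assumption chosen because its stationarity is easy to verify, not necessarily the paper's exact specification.

```python
import numpy as np

rng = np.random.default_rng(3)

def markov_gamma_hazard(K, alpha=2.0, beta=1.0, c=5.0):
    """Simulate a Markov prior path for K piecewise-constant hazards via
    the latent-count chain (an assumed construction):
        u_k | λ_k ~ Poisson(c λ_k),  λ_{k+1} | u_k ~ Gamma(α + u_k, β + c).
    The chain has Gamma(α, β) marginals; c controls the dependence, and
    c = 0 recovers independent Gamma(α, β) hazards."""
    lam = np.empty(K)
    lam[0] = rng.gamma(alpha, 1.0 / beta)
    for k in range(K - 1):
        u = rng.poisson(c * lam[k])
        lam[k + 1] = rng.gamma(alpha + u, 1.0 / (beta + c))
    return lam
```

Larger c makes neighbouring hazard levels more similar, which is the smoothness that moving from independent to Markov increments buys.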
Journal of The Royal Statistical Society Series B-statistical Methodology | 2016
Pier Giovanni Bissiri; Christopher Holmes; Stephen G. Walker
We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data-generating mechanism. For instance, when the object of interest is low dimensional, such as a mean or median, it is cumbersome to have to achieve this via a complete model for the whole data distribution. More importantly, there are settings where the parameter of interest does not directly index a family of density functions and thus the Bayesian approach to learning about such parameters is currently regarded as problematic. Our framework uses loss functions to connect information in the data to functionals of interest. The updating of beliefs then follows from a decision theoretic approach involving cumulative loss functions. Importantly, the procedure coincides with Bayesian updating when a true likelihood is known yet provides coherent subjective inference in much more general settings. Connections to other inference frameworks are highlighted.
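The loss-based update takes the form π(θ | x) ∝ π(θ) exp(-w Σᵢ ℓ(θ, xᵢ)), with ordinary Bayesian updating recovered when ℓ is the negative log-likelihood and w = 1. The grid-based sketch below applies it to a median via the absolute-value loss; the function name and the choice of flat prior are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

def general_bayes_posterior(x, grid, log_prior, loss, w=1.0):
    """Loss-based belief update  π(θ | x) ∝ π(θ) exp(-w Σ_i ℓ(θ, x_i)),
    evaluated on a grid and normalized by the trapezoid rule."""
    cum_loss = np.array([loss(t, x).sum() for t in grid])
    log_post = log_prior(grid) - w * cum_loss
    post = np.exp(log_post - log_post.max())
    return post / np.trapz(post, grid)

# Beliefs about a median via the absolute-value (check) loss with a flat
# prior; note that no likelihood for the data is ever specified.
x = rng.standard_normal(200) + 1.0
grid = np.linspace(-2.0, 4.0, 601)
post = general_bayes_posterior(
    x, grid, log_prior=lambda t: np.zeros_like(t),
    loss=lambda t, xs: np.abs(xs - t), w=1.0)
```

The posterior concentrates around the sample median, directly targeting the functional of interest without modelling the whole data distribution.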
Journal of Computational and Graphical Statistics | 2011
Jim E. Griffin; Stephen G. Walker
This article describes posterior simulation methods for mixture models whose mixing distribution has a Normalized Random Measure prior. The methods use slice sampling ideas and introduce no truncation error. The approach can be easily applied to both homogeneous and nonhomogeneous Normalized Random Measures and allows the updating of the parameters of the random measure. The methods are illustrated on data examples using both Dirichlet and Normalized Generalized Gamma process priors. In particular, the methods are shown to be computationally competitive with previously developed samplers for Dirichlet process mixture models. Matlab code to implement these methods is available as supplemental material.
Annals of Applied Probability | 2008
Antonio Lijoi; Igor Prünster; Stephen G. Walker
We consider discrete nonparametric priors which induce Gibbs-type exchangeable random partitions and investigate their posterior behavior in detail. In particular, we deduce conditional distributions and the corresponding Bayesian nonparametric estimators, which can be readily exploited for predicting various features of additional samples. The results provide useful tools for genomic applications where prediction of future outcomes is required.
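The two-parameter Poisson-Dirichlet (Pitman-Yor) process is a canonical Gibbs-type prior, and its predictive rule illustrates the kind of species prediction the abstract refers to: given counts for k observed species out of n samples, the probability of discovering a new species and the probabilities of re-observing each known one are in closed form. The function below is an illustrative sketch of that standard rule, not code from the paper.

```python
def pitman_yor_predictive(counts, theta=1.0, sigma=0.5):
    """Predictive rule of the two-parameter Poisson-Dirichlet process.

    counts: sizes of the k observed species (n = sum(counts) samples).
    Returns (probability the next sample is a new species,
             list of probabilities of each observed species).
    Requires sigma in [0, 1) and theta > -sigma; sigma = 0 recovers
    the Dirichlet process rule theta / (theta + n)."""
    n, k = sum(counts), len(counts)
    p_new = (theta + k * sigma) / (theta + n)
    p_old = [(c - sigma) / (theta + n) for c in counts]
    return p_new, p_old
```

In genomic applications, p_new is the estimated probability that an additional sequence reveals a gene or species not yet seen.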
Annals of Statistics | 2004
Luis E. Nieto-Barajas; Igor Prünster; Stephen G. Walker
This paper introduces and studies a new class of nonparametric prior distributions. Random probability distribution functions are constructed via normalization of random measures driven by increasing additive processes. In particular, we present results for the distribution of means under both prior and posterior conditions and, via the use of strategic latent variables, undertake a full Bayesian analysis. Our class of priors includes the well-known and widely used mixture of a Dirichlet process.
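The normalization construction is easiest to see in the gamma case, which yields the Dirichlet process mentioned in the abstract: independent gamma increments over a finite partition, once normalized, are Dirichlet distributed. The sketch below assumes this special case; general increasing additive processes would replace the gamma increments.

```python
import numpy as np

rng = np.random.default_rng(5)

def normalized_gamma_measure(shape_params, size=10000):
    """Normalize independent gamma increments over a finite partition.

    For a gamma process with base measure cH, the increments over cells
    A_1..A_k are Gamma(cH(A_j), 1); dividing by their total turns the
    random measure into a random probability, and the resulting vector
    is Dirichlet(cH(A_1), ..., cH(A_k)) distributed."""
    g = rng.gamma(shape_params, 1.0, size=(size, len(shape_params)))
    return g / g.sum(axis=1, keepdims=True)
```

Replacing the gamma increments with those of another increasing additive process gives other members of the class, which is why the mixture of Dirichlet process sits inside it as a special case.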