
Publication


Featured research published by Steven N. MacEachern.


Journal of Computational and Graphical Statistics | 1998

Estimating Mixture of Dirichlet Process Models

Steven N. MacEachern; Peter Müller

Current Gibbs sampling schemes in mixture of Dirichlet process (MDP) models are restricted to using “conjugate” base measures that allow analytic evaluation of the transition probabilities when resampling configurations, or alternatively need to rely on approximate numeric evaluations of some transition probabilities. Implementation of Gibbs sampling in more general MDP models is an open and important problem because most applications call for the use of nonconjugate base measures. In this article we propose a conceptual framework for computational strategies. This framework provides a perspective on current methods, facilitates comparisons between them, and leads to several new methods that expand the scope of MDP models to nonconjugate situations. We discuss one in detail. The basic strategy is based on expanding the parameter vector, and is applicable for MDP models with arbitrary base measure and likelihood. Strategies are also presented for the important class of normal-normal MDP models and...
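The clustering behavior induced by a Dirichlet process prior, which the Gibbs samplers above repeatedly resample, can be illustrated with the Chinese restaurant process. The sketch below is illustrative only (it is not the algorithm from the paper; the function name, seed, and concentration value are our choices):

```python
import random

def crp_partition(n, alpha, seed=0):
    """Simulate a partition of n items from the Chinese restaurant
    process with concentration parameter alpha -- the clustering
    structure implied by a Dirichlet process prior."""
    rng = random.Random(seed)
    assignments = []          # cluster label for each item
    counts = []               # current size of each cluster
    for i in range(n):
        # Existing cluster k is chosen w.p. counts[k] / (i + alpha);
        # a new cluster is opened w.p. alpha / (i + alpha).
        r = rng.uniform(0.0, i + alpha)
        acc = 0.0
        for k, c in enumerate(counts):
            acc += c
            if r < acc:
                assignments.append(k)
                counts[k] += 1
                break
        else:
            assignments.append(len(counts))
            counts.append(1)
    return assignments

print(crp_partition(10, alpha=1.0))
```

Small `alpha` concentrates the items into a few clusters; large `alpha` opens new clusters more often, which is the trade-off the base measure and concentration parameter control in an MDP model.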


Communications in Statistics - Simulation and Computation | 1994

Estimating Normal Means with a Conjugate Style Dirichlet Process Prior

Steven N. MacEachern

The problem of estimating many normal means is approached by means of a hierarchical model. The hierarchical model is the standard conjugate model with one exception: the normal distribution at the middle stage is replaced by a Dirichlet process with a normal shape. Estimation for this model is accomplished through the implementation of the Gibbs sampler (see Escobar and West, 1991). This article describes a new Gibbs sampler algorithm that is implemented on a collapsed state space. Results that apply to a general setting are obtained, suggesting that a collapse of the state space will improve the rate of convergence of the Gibbs sampler. An example shows that the proposed collapse of the state space may result in a dramatically improved algorithm.


Journal of the American Statistical Association | 2004

An ANOVA Model for Dependent Random Measures

Maria De Iorio; Peter Müller; Gary L. Rosner; Steven N. MacEachern

We consider dependent nonparametric models for related random probability distributions. For example, the random distributions might be indexed by a categorical covariate indicating the treatment levels in a clinical trial and might represent random effects distributions under the respective treatment combinations. We propose a model that describes dependence across random distributions in an analysis of variance (ANOVA)-type fashion. We define a probability model in such a way that marginally each random measure follows a Dirichlet process (DP) and use the dependent Dirichlet process to define the desired dependence across the related random measures. The resulting probability model can alternatively be described as a mixture of ANOVA models with a DP prior on the unknown mixing measure. The main features of the proposed approach are ease of interpretation and computational simplicity. Because the model follows the standard ANOVA structure, interpretation and inference parallel conventions for ANOVA models. This includes the notion of main effects, interactions, contrasts, and the like. Of course, the analogies are limited to structure and interpretation. The actual objects of the inference are random distributions instead of the unknown normal means in standard ANOVA models. Besides interpretation and model structure, another important feature of the proposed approach is ease of posterior simulation. Because the model can be rewritten as a DP mixture of ANOVA models, it inherits all computational advantages of standard DP mixture models. This includes availability of efficient Gibbs sampling schemes for posterior simulation and ease of implementation of even high-dimensional applications. Complexity of implementing posterior simulation is—at least conceptually—dimension independent.


Journal of the American Statistical Association | 2005

Bayesian Nonparametric Spatial Modeling With Dirichlet Process Mixing

Alan E. Gelfand; Athanasios Kottas; Steven N. MacEachern

Customary modeling for continuous point-referenced data assumes a Gaussian process that is often taken to be stationary. When such models are fitted within a Bayesian framework, the unknown parameters of the process are assumed to be random, so a random Gaussian process results. Here we propose a novel spatial Dirichlet process mixture model to produce a random spatial process that is neither Gaussian nor stationary. We first develop a spatial Dirichlet process model for spatial data and discuss its properties. Because of familiar limitations associated with direct use of Dirichlet process models, we introduce mixing by convolving this process with a pure error process. We then examine properties of models created through such Dirichlet process mixing. In the Bayesian framework, we implement posterior inference using Gibbs sampling. Spatial prediction raises interesting questions, but these can be handled. Finally, we illustrate the approach using simulated data, as well as a dataset involving precipitation measurements over the Languedoc-Roussillon region in southern France.


The American Statistician | 1994

Subsampling the Gibbs Sampler

Steven N. MacEachern; L. Mark Berliner

This article provides a justification of the ban against subsampling the output of a stationary Markov chain that is suitable for presentation in undergraduate and beginning graduate-level courses. The justification does not rely on reversibility of the chain, as does Geyer's (1992) argument, and so applies to the usual implementation of the Gibbs sampler.
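The point can be demonstrated numerically on a toy chain. Below is a hypothetical sketch using an AR(1) process as a stand-in for Gibbs sampler output (the chain parameters, thinning interval of 10, and function name are our illustrative assumptions, not from the paper):

```python
import random

def ar1_chain(n, rho=0.9, seed=1):
    """Stationary AR(1) chain: a stand-in for positively correlated
    MCMC output with stationary mean 0 and variance 1."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = rho * x + rng.gauss(0.0, (1.0 - rho**2) ** 0.5)
        out.append(x)
    return out

chain = ar1_chain(100_000)
thinned = chain[::10]                  # subsample: keep every 10th draw

full_mean = sum(chain) / len(chain)
thin_mean = sum(thinned) / len(thinned)
# Both estimate the stationary mean (0), but the thinned estimator
# discards 90% of the draws and cannot be more efficient for
# estimating the mean -- the content of the ban on subsampling.
```

Thinning reduces the autocorrelation of the retained draws, but the variance of the resulting ergodic average never falls below that of the average over the full chain.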


Archive | 1998

Computational Methods for Mixture of Dirichlet Process Models

Steven N. MacEachern

This chapter lays out the basic computational strategies for models based on mixtures of Dirichlet processes. I describe the basic algorithm and give advice on how to improve this algorithm through a collapse of the state space of the Markov chain and through blocking of variates for generation. The computational methods are illustrated with a beta-binomial example and with the bioassay problem. Some advice is given for dealing with models that have little or no conjugacy present.


Journal of the Royal Statistical Society: Series B (Statistical Methodology) | 2002

A new ranked set sample estimator of variance

Steven N. MacEachern; Omer Ozturk; Douglas A. Wolfe; Gregory V. Stark

We develop an unbiased estimator of the variance of a population based on a ranked set sample. We show that this new estimator is better than estimating the variance based on a simple random sample and more efficient than the estimator based on a ranked set sample proposed by Stokes. Also, a test to determine the effectiveness of the judgment ordering process is proposed.
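For intuition, a balanced ranked-set sample can be simulated as follows. This is an illustrative sketch assuming perfect rankings (units are ranked by their true values); the function names and parameter values are ours, not the paper's:

```python
import random

def ranked_set_sample(draw, k, cycles, seed=2):
    """One balanced ranked-set sample: in each cycle, draw k sets of
    k units, rank each set (here perfectly, by the true values), and
    measure only the i-th smallest unit of the i-th set."""
    rng = random.Random(seed)
    rss = []
    for _ in range(cycles):
        for i in range(k):
            units = sorted(draw(rng) for _ in range(k))
            rss.append(units[i])   # keep the i-th order statistic
    return rss

draw = lambda rng: rng.gauss(0.0, 1.0)   # standard normal population
rss = ranked_set_sample(draw, k=3, cycles=2000)
mean_hat = sum(rss) / len(rss)           # unbiased for the population mean
```

In a balanced design each rank appears equally often, so the sample mean remains unbiased while typically having smaller variance than a simple random sample of the same measured size; estimating the variance unbiasedly from such data is the harder problem the paper addresses.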


Journal of the American Statistical Association | 2006

Nonparametric Two-Sample Methods for Ranked-Set Sample Data

Michael A. Fligner; Steven N. MacEachern

A new collection of procedures is developed for the analysis of two-sample, ranked-set samples, providing an alternative to the Bohn–Wolfe procedure. These procedures split the data based on the ranks in the ranked-set sample and lead to tests for the centers of distributions, confidence intervals, and point estimators. The advantages of the new tests are that they require essentially no assumptions about the mechanism by which rankings are imperfect, that they maintain their level whether rankings are perfect or imperfect, that they lead to generalizations of the Bohn–Wolfe procedure that can be used to increase power in the case of perfect rankings, and that they allow one to analyze both balanced and unbalanced ranked-set samples. A new class of imperfect ranking models is proposed, and the performance of the procedure is investigated under these models. When rankings are random, a theorem is presented which characterizes efficient data splits. Because random rankings are equivalent to iid samples, this theorem applies to a wide class of statistics and has implications for a variety of computationally intensive methods.


Archive | 2000

Efficient MCMC Schemes for Robust Model Extensions Using Encompassing Dirichlet Process Mixture Models

Steven N. MacEachern; Peter Müller

We propose that one consider sensitivity analysis by embedding standard parametric models in model extensions defined by replacing a parametric probability model with a nonparametric extension. The nonparametric model could replace the entire probability model, or some level of a hierarchical model. Specifically, we define nonparametric extensions of a parametric probability model using Dirichlet process (DP) priors. Similar approaches have been used in the literature to implement formal model fit diagnostics (Carota et al., 1996).


Journal of Time Series Analysis | 1997

Bayesian Models for Non-linear Autoregressions

Peter Müller; Mike West; Steven N. MacEachern

We discuss classes of Bayesian mixture models for nonlinear autoregressive time series, based on developments in semiparametric Bayesian density estimation in recent years. The development involves formal classes of multivariate discrete mixture distributions, providing flexibility in modeling arbitrary nonlinearities in time series structure and a formal inferential framework within which to address the problems of inference and prediction. The models relate naturally to existing kernel and related methods, threshold models and others, although they offer major advances in terms of parameter estimation and predictive calculations. Theoretical and computational aspects are developed here, the latter involving efficient simulation of posterior and predictive distributions. Various examples illustrate our perspectives on identification and inference using this mixture approach.

Collaboration


Dive into Steven N. MacEachern's collaborations.

Top Co-Authors

Peter Müller

University of Texas at Austin

Juhee Lee

Ohio State University
