Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Nicholas G. Polson is active.

Publication


Featured research published by Nicholas G. Polson.


Journal of Business & Economic Statistics | 1994

Bayesian Analysis of Stochastic Volatility Models

Eric Jacquier; Nicholas G. Polson; Peter E. Rossi

New techniques for the analysis of stochastic volatility models in which the logarithm of conditional variance follows an autoregressive model are developed. A cyclic Metropolis algorithm is used to construct a Markov-chain simulation tool. Simulations from this Markov chain converge in distribution to draws from the posterior distribution, enabling exact finite-sample inference. The exact solution to the filtering/smoothing problem of inference about the unobserved variance states is a by-product of our Markov-chain method. In addition, multistep-ahead predictive densities can be constructed that reflect both inherent model variability and parameter uncertainty. We illustrate our method by analyzing both daily and weekly data on stock returns and exchange rates. Sampling experiments are conducted to compare the performance of Bayes estimators to method of moments and quasi-maximum likelihood estimators proposed in the literature. In both parameter estimation and filtering, the Bayes estimators outperform these other approaches.
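
For reference, a standard formulation of this class of models (illustrative notation, not necessarily the paper's) writes the observed return y_t and the latent log-variance h_t as

\[
y_t = \exp(h_t/2)\,\epsilon_t, \qquad h_t = \alpha + \phi\, h_{t-1} + \sigma_v\, \eta_t, \qquad \epsilon_t,\ \eta_t \overset{iid}{\sim} \mathcal{N}(0,1),
\]

so that the logarithm of the conditional variance follows a first-order autoregression, as described in the abstract.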


Journal of the American Statistical Association | 1992

A Monte Carlo approach to nonnormal and nonlinear state–space modeling

Bradley P. Carlin; Nicholas G. Polson; David S. Stoffer

A solution to multivariate state-space modeling, forecasting, and smoothing is discussed. We allow for the possibilities of nonnormal errors and nonlinear functionals in the state equation, the observational equation, or both. An adaptive Monte Carlo integration technique known as the Gibbs sampler is proposed as a mechanism for implementing a conceptually and computationally simple solution in such a framework. The methodology is a general strategy for obtaining marginal posterior densities of coefficients in the model or of any of the unknown elements of the state space. Missing data problems (including the k-step ahead prediction problem) also are easily incorporated into this framework. We illustrate the broad applicability of our approach with two examples: a problem involving nonnormal error distributions in a linear model setting and a one-step ahead prediction problem in a situation where both the state and observational equations are nonlinear and involve unknown parameters.
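
In generic notation (illustrative only), the setup is

\[
y_t = f_t(x_t) + v_t, \qquad x_t = g_t(x_{t-1}) + w_t,
\]

where f_t and g_t may be nonlinear and the errors v_t, w_t need not be Gaussian; the Gibbs sampler then cycles through the full conditional distribution of each unknown (each state x_t, the model coefficients, and any other parameters) given the current values of all the others.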


Journal of the American Statistical Association | 2013

Bayesian Inference for Logistic Models Using Pólya–Gamma Latent Variables

Nicholas G. Polson; James G. Scott; Jesse Windle

We propose a new data-augmentation strategy for fully Bayesian inference in models with binomial likelihoods. The approach appeals to a new class of Pólya–Gamma distributions, which are constructed in detail. A variety of examples are presented to show the versatility of the method, including logistic regression, negative binomial regression, nonlinear mixed-effect models, and spatial models for count data. In each case, our data-augmentation strategy leads to simple, effective methods for posterior inference that (1) circumvent the need for analytic approximations, numerical integration, or Metropolis–Hastings; and (2) outperform other known data-augmentation strategies, both in ease of use and in computational efficiency. All methods, including an efficient sampler for the Pólya–Gamma distribution, are implemented in the R package BayesLogit. Supplementary materials for this article are available online.
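
A minimal sketch of the resulting two-block Gibbs sampler for binary logistic regression follows, assuming y_i in {0, 1}, a Gaussian prior beta ~ N(b0, B0), and an external Pólya-Gamma sampler draw_pg(h, z) supplied by the caller (a placeholder here, not the BayesLogit package mentioned in the abstract):

```python
import numpy as np

def pg_gibbs_logistic(X, y, draw_pg, n_iter=1000, b0=None, B0=None, rng=None):
    """Gibbs sampler for Bayesian logistic regression via Polya-Gamma
    augmentation (sketch). `draw_pg(h, z)` must return one PG(h, z) draw;
    it is assumed to exist and is NOT defined here."""
    rng = np.random.default_rng() if rng is None else rng
    n, p = X.shape
    b0 = np.zeros(p) if b0 is None else b0        # prior mean
    B0 = 100.0 * np.eye(p) if B0 is None else B0  # prior covariance
    B0_inv = np.linalg.inv(B0)
    kappa = y - 0.5                               # kappa_i = y_i - 1/2
    beta = np.zeros(p)
    draws = np.empty((n_iter, p))
    for it in range(n_iter):
        # 1) omega_i | beta ~ PG(1, x_i' beta)
        psi = X @ beta
        omega = np.array([draw_pg(1.0, z) for z in psi])
        # 2) beta | omega, y is Gaussian with precision X' Omega X + B0^{-1}
        V = np.linalg.inv(X.T @ (omega[:, None] * X) + B0_inv)
        m = V @ (X.T @ kappa + B0_inv @ b0)
        beta = rng.multivariate_normal(m, V)
        draws[it] = beta
    return draws
```

The key conjugacy is that, given the latent omega draws, beta has a Gaussian full conditional, which is what removes the need for Metropolis-Hastings steps.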


Statistical Science | 2010

Particle Learning and Smoothing

Carlos M. Carvalho; Michael Johannes; Hedibert F. Lopes; Nicholas G. Polson

Particle learning (PL) provides state filtering, sequential parameter learning and smoothing in a general class of state space models. Our approach extends existing particle methods by incorporating the estimation of static parameters via a fully-adapted filter that utilizes conditional sufficient statistics for parameters and/or states as particles. State smoothing in the presence of parameter uncertainty is also solved as a by-product of PL. In a number of examples, we show that PL outperforms existing particle filtering alternatives and proves to be a competitor to MCMC.
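
A bare-bones skeleton of the resample-propagate loop with conditional sufficient statistics might look like the following. All model-specific pieces are passed in as callbacks and particles are assumed to be stored as NumPy arrays; this illustrates the general pattern only, not the authors' implementation:

```python
import numpy as np

def particle_learning(y, n_particles, init, predictive_weight,
                      propagate_state, update_suffstats, sample_theta,
                      rng=None):
    """Skeleton of a resample-propagate particle-learning loop.
    States and sufficient statistics are arrays indexed by particle."""
    rng = np.random.default_rng() if rng is None else rng
    states, suffstats = init(n_particles, rng)
    theta = None
    for y_t in y:
        # 1) resample with weights proportional to the predictive p(y_t | particle)
        w = predictive_weight(y_t, states, suffstats)
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        states, suffstats = states[idx], suffstats[idx]
        # 2) propagate states from the fully adapted proposal p(x_t | x_{t-1}, y_t)
        states = propagate_state(y_t, states, suffstats, rng)
        # 3) deterministically update the conditional sufficient statistics
        suffstats = update_suffstats(y_t, states, suffstats)
        # 4) refresh parameter draws from p(theta | sufficient statistics)
        theta = sample_theta(suffstats, rng)
    return states, suffstats, theta
```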


Journal of Econometrics | 2000

A Bayesian analysis of the multinomial probit model with fully identified parameters

Robert E. McCulloch; Nicholas G. Polson; Peter E. Rossi

We present a new prior and corresponding algorithm for Bayesian analysis of the multinomial probit model. Our new approach places a prior directly on the identified parameter space. The key is the specification of a prior on the covariance matrix so that the (1,1) element is fixed at 1 and it is possible to draw from the posterior using standard distributions. Analytical results are derived which can be used to aid in the assessment of the prior.
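
In illustrative notation, the multinomial probit posits latent utilities

\[
w_i = X_i \beta + \varepsilon_i, \qquad \varepsilon_i \sim \mathcal{N}(0, \Sigma),
\]

with the observed choice being the alternative whose latent utility is largest; the identification constraint referenced above fixes the (1,1) element of \(\Sigma\) at 1, and the prior is placed directly on the remaining free parameters.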


Bayesian Analysis | 2012

On the Half-Cauchy Prior for a Global Scale Parameter

Nicholas G. Polson; James G. Scott

This paper argues that the half-Cauchy distribution should replace the inverse-Gamma distribution as a default prior for a top-level scale parameter in Bayesian hierarchical models, at least for cases where a proper prior is necessary. Our arguments involve a blend of Bayesian and frequentist reasoning, and are intended to complement the original case made by Gelman (2006) in support of the folded-t family of priors. First, we generalize the half-Cauchy prior to the wider class of hypergeometric inverted-beta priors. We derive expressions for posterior moments and marginal densities when these priors are used for a top-level normal variance in a Bayesian hierarchical model. We go on to prove a proposition that, together with the results for moments and marginals, allows us to characterize the frequentist risk of the Bayes estimators under all global-shrinkage priors in the class. These theoretical results, in turn, allow us to study the frequentist properties of the half-Cauchy prior versus a wide class of alternatives. The half-Cauchy occupies a sensible “middle ground” within this class: it performs very well near the origin, but does not lead to drastic compromises in other parts of the parameter space. This provides an alternative, classical justification for the repeated, routine use of this prior. We also consider situations where the underlying mean vector is sparse, where we argue that the usual conjugate choice of an inverse-gamma prior is particularly inappropriate, and can lead to highly distorted posterior inferences. Finally, we briefly summarize some open issues in the specification of default priors for scale terms in hierarchical models.
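
For a positive scale \(\tau\), the half-Cauchy prior with scale \(A\) has density

\[
p(\tau) = \frac{2}{\pi A \left(1 + (\tau/A)^2\right)}, \qquad \tau > 0,
\]

so in a typical hierarchical setting it sits at the top level, e.g. \(\theta_i \mid \tau \sim \mathcal{N}(0, \tau^2)\) with \(\tau \sim C^{+}(0, A)\).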


Bayesian Analysis | 2011

Data augmentation for support vector machines

Nicholas G. Polson; Steven L. Scott

This paper presents a latent variable representation of regularized support vector machines (SVMs) that enables EM, ECME, or MCMC algorithms to provide parameter estimates. We verify our representation by demonstrating that minimizing the SVM optimality criterion together with the parameter regularization penalty is equivalent to finding the mode of a mean-variance mixture of normals pseudo-posterior distribution. The latent variables in the mixture representation lead to EM and ECME point estimates of SVM parameters, as well as MCMC algorithms based on Gibbs sampling that can bring Bayesian tools for Gaussian linear models to bear on SVMs. We show how to implement SVMs with spike-and-slab priors and run them against data from a standard spam-filtering data set.
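
In symbols (illustrative notation), the regularized SVM criterion referred to above is

\[
\min_{\beta}\; \sum_{i=1}^{n} \max\!\big(1 - y_i x_i^{\top}\beta,\, 0\big) \;+\; \lambda\,\phi(\beta), \qquad y_i \in \{-1, +1\},
\]

and the paper's representation shows that this minimizer is the mode of a pseudo-posterior built from a mean-variance mixture of normals, which is what makes EM/ECME and Gibbs sampling applicable.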


Review of Financial Studies | 2009

Optimal Filtering of Jump Diffusions: Extracting Latent States from Asset Prices

Michael Johannes; Nicholas G. Polson; Jonathan R. Stroud

This paper provides an optimal filtering methodology in discretely observed continuous-time jump-diffusion models. Although the filtering problem has received little attention, it is useful for estimating latent states, forecasting volatility and returns, computing model diagnostics such as likelihood ratios, and parameter estimation. Our approach combines time-discretization schemes with Monte Carlo methods. It is quite general, applying in nonlinear and multivariate jump-diffusion models and models with nonanalytic observation equations. We provide a detailed analysis of the filter's performance, and analyze four applications: disentangling jumps from stochastic volatility, forecasting volatility, comparing models via likelihood ratios, and filtering using option prices and returns.
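
The optimal filtering problem being solved is the usual recursive one (generic notation):

\[
p(x_t \mid y_{1:t}) \;\propto\; p(y_t \mid x_t) \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid y_{1:t-1})\, dx_{t-1},
\]

with the latent state \(x_t\) collecting volatilities and jump variables, and the time-discretization plus Monte Carlo steps approximating the integral.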


Canadian Journal of Statistics-revue Canadienne De Statistique | 1991

Inference for nonconjugate Bayesian models using the Gibbs sampler

Bradley P. Carlin; Nicholas G. Polson

A Bayesian approach to modeling a rich class of nonconjugate problems is presented. An adaptive Monte Carlo integration technique known as the Gibbs sampler is proposed as a mechanism for implementing a conceptually and computationally simple solution in such a framework. The result is a general strategy for obtaining marginal posterior densities under changing specification of the model error densities and related prior densities. We illustrate the approach in a nonlinear regression setting, comparing the merits of three candidate error distributions.
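
As a generic illustration (not the paper's specific models), a Gibbs-sampling loop of the kind described can be skeletonized as follows:

```python
import numpy as np

def gibbs_sampler(full_conditionals, init, n_iter=5000, rng=None):
    """Generic Gibbs-sampling loop: repeatedly draw each block of unknowns
    from its full conditional distribution given the current values of the
    rest, and record the resulting trajectory."""
    rng = np.random.default_rng() if rng is None else rng
    state = dict(init)
    trace = []
    for _ in range(n_iter):
        for name, draw in full_conditionals.items():
            state[name] = draw(state, rng)  # block `name` | everything else
        trace.append(dict(state))
    return trace
```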


Social Science Research Network | 2003

MCMC Methods for Continuous-Time Financial Econometrics

Michael Johannes; Nicholas G. Polson

This chapter develops Markov Chain Monte Carlo (MCMC) methods for Bayesian inference in continuous-time asset pricing models. The Bayesian solution to the inference problem is the distribution of parameters and latent variables conditional on observed data, and MCMC methods provide a tool for exploring these high-dimensional, complex distributions. We first provide a description of the foundations and mechanics of MCMC algorithms. This includes a discussion of the Clifford-Hammersley theorem, the Gibbs sampler, the Metropolis-Hastings algorithm, and theoretical convergence properties of MCMC algorithms. We next provide a tutorial on building MCMC algorithms for a range of continuous-time asset pricing models. We include detailed examples for equity price models, option pricing models, term structure models, and regime-switching models. Finally, we discuss the issue of sequential Bayesian inference, both for parameters and state variables.
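
For concreteness, a textbook random-walk Metropolis-Hastings step of the kind covered in the chapter's tutorial can be sketched as follows (an illustrative sketch, not code from the chapter):

```python
import numpy as np

def random_walk_metropolis(log_post, theta0, step, n_iter=10000, rng=None):
    """Random-walk Metropolis-Hastings: propose a Gaussian perturbation of
    the current parameters and accept it with probability
    min(1, posterior ratio)."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    lp = log_post(theta)
    draws = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        proposal = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(proposal)
        # symmetric proposal, so the acceptance ratio is a ratio of posteriors
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = proposal, lp_prop
        draws[i] = theta
    return draws
```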

Collaboration


Dive into Nicholas G. Polson's collaborations.

Top Co-Authors


James G. Scott

University of Texas at Austin


Eric Jacquier

Massachusetts Institute of Technology


Peter E. Rossi

University of California
