Network


Latest external collaboration at the country level.

Hotspot


Dive into the research topics where Larry Wasserman is active.

Publication


Featured research published by Larry Wasserman.


Journal of the American Statistical Association | 1996

The Selection of Prior Distributions by Formal Rules

Robert E. Kass; Larry Wasserman

Abstract Subjectivism has become the dominant philosophical foundation for Bayesian inference. Yet in practice, most Bayesian analyses are performed with so-called “noninformative” priors, that is, priors constructed by some formal rule. We review the plethora of techniques for constructing such priors and discuss some of the practical and philosophical issues that arise when they are used. We give special emphasis to Jeffreys's rules and discuss the evolution of his viewpoint about the interpretation of priors, away from unique representation of ignorance toward the notion that they should be chosen by convention. We conclude that the problems raised by the research on priors chosen by formal rules are serious and may not be dismissed lightly: When sample sizes are small (relative to the number of parameters being estimated), it is dangerous to put faith in any “default” solution; but when asymptotics take over, Jeffreys's rules and their variants remain reasonable choices. We also provide an annotated b...
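For context (a standard definition, not quoted from the paper): the general rule most associated with Jeffreys takes the prior density proportional to the square root of the determinant of the Fisher information,

```latex
\pi_J(\theta) \;\propto\; \sqrt{\det I(\theta)},
\qquad
I(\theta)_{jk} \;=\; -\,\mathbb{E}_{\theta}\!\left[\frac{\partial^{2}\log f(X \mid \theta)}{\partial\theta_{j}\,\partial\theta_{k}}\right].
```

For a single Bernoulli(θ) observation, for example, this rule yields the Beta(1/2, 1/2) prior π(θ) ∝ θ^{-1/2}(1−θ)^{-1/2}.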


Journal of the American Statistical Association | 1995

A Reference Bayesian Test for Nested Hypotheses and its Relationship to the Schwarz Criterion

Robert E. Kass; Larry Wasserman

Abstract To compute a Bayes factor for testing H0: ψ = ψ0 in the presence of a nuisance parameter β, priors under the null and alternative hypotheses must be chosen. As in Bayesian estimation, an important problem has been to define automatic, or “reference,” methods for determining priors based only on the structure of the model. In this article we apply the heuristic device of taking the amount of information in the prior on ψ equal to the amount of information in a single observation. Then, after transforming β to be “null orthogonal” to ψ, we take the marginal priors on β to be equal under the null and alternative hypotheses. Doing so, and taking the prior on ψ to be Normal, we find that the log of the Bayes factor may be approximated by the Schwarz criterion with an error of order O_p(n^{-1/2}), rather than the usual error of order O_p(1). This result suggests the Schwarz criterion should provide sensible approximate solutions to Bayesian testing problems, at least when the hypotheses are nested. When...
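For reference (the standard form of the criterion, not quoted from the paper), the Schwarz criterion for comparing the alternative (d1 free parameters, MLE θ̂1) against the null (d0 free parameters, MLE θ̂0) on n observations is

```latex
S \;=\; \log L(\hat{\theta}_{1}\mid y) \;-\; \log L(\hat{\theta}_{0}\mid y) \;-\; \tfrac{1}{2}\,(d_{1}-d_{0})\,\log n,
```

and the result described above says that, with the information-matched prior, log B10 = S + O_p(n^{-1/2}) rather than the generic S + O_p(1).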


Archive | 2004

All of Statistics

Larry Wasserman



Test | 1994

An overview of robust Bayesian analysis

James O. Berger; Elías Moreno; Luis R. Pericchi; M. Jesús Bayarri; José M. Bernardo; Juan Antonio Cano; Julián de la Horra; Jacinto Martín; David Ríos-Insúa; Bruno Betrò; Anirban DasGupta; Paul Gustafson; Larry Wasserman; Joseph B. Kadane; Cid Srinivasan; Michael Lavine; Anthony O’Hagan; Wolfgang Polasek; Christian P. Robert; Constantinos Goutis; Fabrizio Ruggeri; Gabriella Salinetti; Siva Sivaganesan

Summary: Robust Bayesian analysis is the study of the sensitivity of Bayesian answers to uncertain inputs. This paper seeks to provide an overview of the subject, one that is accessible to statisticians outside the field. Recent developments in the area are also reviewed, though with very uneven emphasis.


Journal of the American Statistical Association | 1997

Practical Bayesian Density Estimation Using Mixtures of Normals

Kathryn Roeder; Larry Wasserman

Abstract Mixtures of normals provide a flexible model for estimating densities in a Bayesian framework. There are some difficulties with this model, however. First, standard reference priors yield improper posteriors. Second, the posterior for the number of components in the mixture is not well defined (if the reference prior is used). Third, posterior simulation does not provide a direct estimate of the posterior for the number of components. We present some practical methods for coping with these problems. Finally, we give some results on the consistency of the method when the maximum number of components is allowed to grow with the sample size.
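As a minimal illustration of the general idea (Bayesian density estimation with a normal mixture), the sketch below uses scikit-learn's variational BayesianGaussianMixture on made-up data; it is not the estimation procedure developed in the paper.

```python
# Illustrative sketch only: Bayesian density estimation with a mixture of normals.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic 1-D data drawn from two normal components (made up for the example).
x = np.concatenate([rng.normal(-2.0, 0.5, 300), rng.normal(1.5, 1.0, 700)]).reshape(-1, 1)

# Cap the number of components; the variational Dirichlet-process prior
# effectively switches off the components that are not needed.
model = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500,
    random_state=0,
).fit(x)

grid = np.linspace(-5.0, 6.0, 200).reshape(-1, 1)
density = np.exp(model.score_samples(grid))            # estimated density on a grid
print("components with non-negligible weight:", int(np.sum(model.weights_ > 1e-2)))
```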


Journal of the American Statistical Association | 1995

Computing Bayes Factors Using a Generalization of the Savage-Dickey Density Ratio

Isabella Verdinelli; Larry Wasserman

Abstract We present a simple method for computing Bayes factors. The method derives from observing that in general, a Bayes factor can be written as the product of a quantity called the Savage-Dickey density ratio and a correction factor; both terms are easily estimated from posterior simulation. In some cases it is possible to do these computations without ever evaluating the likelihood.
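For context (the classical identity as usually stated, not quoted from the paper): when the conditional prior on the nuisance parameter β at ψ = ψ0 under the alternative matches the prior on β under the null, the Bayes factor reduces to the Savage-Dickey density ratio,

```latex
B_{01} \;=\; \frac{p(\psi = \psi_{0} \mid y,\, H_{1})}{p(\psi = \psi_{0} \mid H_{1})},
```

that is, the posterior density of ψ under the alternative evaluated at ψ0, divided by the corresponding prior density; the generalization described above multiplies this ratio by a correction factor when that prior condition does not hold.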


Journal of the American Statistical Association | 1997

Computing Bayes Factors by Combining Simulation and Asymptotic Approximations

Thomas J. DiCiccio; Robert E. Kass; Adrian E. Raftery; Larry Wasserman

Abstract The Bayes factor is a ratio of two posterior normalizing constants, which may be difficult to compute. We compare several methods of estimating Bayes factors when it is possible to simulate observations from the posterior distributions, via Markov chain Monte Carlo or other techniques. The methods that we study are all easily applied without consideration of special features of the problem, provided that each posterior distribution is well behaved in the sense of having a single dominant mode. We consider a simulated version of Laplace's method, a simulated version of Bartlett correction, importance sampling, and a reciprocal importance sampling technique. We also introduce local volume corrections for each of these. In addition, we apply the bridge sampling method of Meng and Wong. We find that a simulated version of Laplace's method, with local volume correction, furnishes an accurate approximation that is especially useful when likelihood function evaluations are costly. A simple bridge sampli...
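As a rough sketch of one ingredient named above, the snippet below estimates the log marginal likelihood from posterior draws with a "simulated" Laplace approximation, using the posterior sample covariance and the best sampled point in place of a Hessian and mode. It illustrates the generic device, not the paper's volume-corrected estimators, and the function name is made up.

```python
# Illustrative sketch only: "simulated" Laplace estimate of the log marginal likelihood.
import numpy as np

def simulated_laplace_log_marginal(draws, log_unnorm_post):
    """draws: array of posterior samples, shape (N,) or (N, d);
    log_unnorm_post: function theta -> log[ likelihood(theta) * prior(theta) ]."""
    draws = np.asarray(draws, dtype=float)
    if draws.ndim == 1:
        draws = draws[:, None]
    n_draws, d = draws.shape
    cov = np.atleast_2d(np.cov(draws, rowvar=False))      # posterior covariance from the draws
    log_h = np.array([log_unnorm_post(t) for t in draws]) # unnormalized log posterior at each draw
    _, logdet = np.linalg.slogdet(cov)
    # log m(y) ~ (d/2) log(2*pi) + (1/2) log|Sigma| + max over sampled theta of log h(theta)
    return 0.5 * d * np.log(2.0 * np.pi) + 0.5 * logdet + log_h.max()
```

Estimating the covariance directly from the simulation output is what makes the approximation "simulated": no analytic Hessian or separate optimization is required.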


Annals of Statistics | 2004

A stochastic process approach to false discovery control

Christopher R. Genovese; Larry Wasserman

This paper extends the theory of false discovery rates (FDR) pioneered by Benjamini and Hochberg [J. Roy. Statist. Soc. Ser. B 57 (1995) 289-300]. We develop a framework in which the False Discovery Proportion (FDP), the number of false rejections divided by the number of rejections, is treated as a stochastic process. After obtaining the limiting distribution of the process, we demonstrate the validity of a class of procedures for controlling the False Discovery Rate (the expected FDP). We construct a confidence envelope for the whole FDP process. From these envelopes we derive confidence thresholds for controlling the quantiles of the distribution of the FDP as well as controlling the number of false discoveries. We also investigate methods for estimating the p-value distribution.
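For concreteness, the sketch below implements the standard Benjamini-Hochberg step-up rule and the FDP as defined above (false rejections divided by rejections); it illustrates the quantities being studied, not the paper's confidence-envelope procedures, and the function names are illustrative.

```python
# Illustrative helpers (names made up): Benjamini-Hochberg step-up rule and the FDP.
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Boolean rejection mask controlling the FDR at level alpha (BH step-up rule)."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    passes = p[order] <= alpha * np.arange(1, m + 1) / m
    reject = np.zeros(m, dtype=bool)
    if passes.any():
        k = int(np.max(np.nonzero(passes)[0]))     # largest rank satisfying the criterion
        reject[order[: k + 1]] = True
    return reject

def false_discovery_proportion(reject, is_null):
    """FDP = (number of false rejections) / (number of rejections); 0 if nothing is rejected."""
    r = int(np.sum(reject))
    v = int(np.sum(np.asarray(reject) & np.asarray(is_null)))
    return v / r if r > 0 else 0.0
```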


Annals of Statistics | 2012

High-dimensional semiparametric Gaussian copula graphical models

Han Liu; Fang Han; Ming Yuan; John D. Lafferty; Larry Wasserman

…including the Spearman's rho and Kendall's tau. We prove that the nonparanormal skeptic achieves the optimal parametric rates of convergence for both graph recovery and parameter estimation. This result suggests that the nonparanormal graphical models can be used as a safe replacement of the popular Gaussian graphical models, even when the data are truly Gaussian. Besides theoretical analysis, we also conduct thorough numerical simulations to compare the graph recovery performance of different estimators under both ideal and noisy settings. The proposed methods are then applied on a large-scale genomic dataset to illustrate their empirical usefulness. The R package huge implementing the proposed methods is available on the Comprehensive R Archive Network: http://cran.r-project.org/.
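A small sketch of the rank-based plug-in idea described above, assuming the usual sin(πτ/2) transform of Kendall's tau fed into a graphical lasso; this is an illustration on synthetic data, not a reimplementation of the huge package.

```python
# Illustrative sketch only: rank-based correlation estimate plugged into a graphical lasso.
import numpy as np
from scipy.stats import kendalltau
from sklearn.covariance import graphical_lasso

rng = np.random.default_rng(1)
X = rng.multivariate_normal(np.zeros(4), np.eye(4), size=300)   # placeholder data

d = X.shape[1]
S = np.eye(d)
for j in range(d):
    for k in range(j + 1, d):
        tau, _ = kendalltau(X[:, j], X[:, k])
        # Kendall's tau mapped to a correlation; invariant to monotone marginal transforms.
        S[j, k] = S[k, j] = np.sin(0.5 * np.pi * tau)

# Sparse precision (graph) estimate from the rank-based correlation matrix.
# In practice S may need projecting to the nearest positive semidefinite matrix first.
covariance, precision = graphical_lasso(S, alpha=0.1)
print(np.round(precision, 2))
```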


American Journal of Human Genetics | 2006

Using Linkage Genome Scans to Improve Power of Association in Genome Scans

Kathryn Roeder; Silvi-Alin Bacanu; Larry Wasserman; Bernie Devlin


Collaboration


Dive into Larry Wasserman's collaborations.

Top Co-Authors

John D. Lafferty, Carnegie Mellon University
Aarti Singh, Carnegie Mellon University
Han Liu, Princeton University
Kathryn Roeder, Carnegie Mellon University
Yen-Chi Chen, University of Washington
Barnabás Póczos, Carnegie Mellon University