Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Mark J. Schervish is active.

Publication


Featured research published by Mark J. Schervish.


Journal of the American Statistical Association | 2008

Covariance Tapering for Likelihood-Based Estimation in Large Spatial Data Sets

Cari G. Kaufman; Mark J. Schervish; Douglas W. Nychka

Maximum likelihood is an attractive method of estimating covariance parameters in spatial models based on Gaussian processes. But calculating the likelihood can be computationally infeasible for large data sets, requiring O(n³) calculations for a data set with n observations. This article proposes the method of covariance tapering to approximate the likelihood in this setting. In this approach, covariance matrices are “tapered,” or multiplied elementwise by a sparse correlation matrix. The resulting matrices can then be manipulated using efficient sparse matrix algorithms. We propose two approximations to the Gaussian likelihood using tapering. One of these approximations simply replaces the model covariance with a tapered version, whereas the other is motivated by the theory of unbiased estimating equations. Focusing on the particular case of the Matérn class of covariance functions, we give conditions under which estimators maximizing the tapering approximations are, like the maximum likelihood estimator, strongly consistent. Moreover, we show in a simulation study that the tapering estimators can have sampling densities quite similar to that of the maximum likelihood estimator, even when the degree of tapering is severe. We illustrate the accuracy and computational gains of the tapering methods in an analysis of yearly total precipitation anomalies at weather stations in the United States.
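The tapering idea is easy to sketch numerically. Below is an illustrative sketch, not the authors' code: the exponential covariance, the spherical taper, and all parameter values (sigma2, phi, gamma, n) are assumptions chosen for the demonstration.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import splu
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
n = 400
locs = rng.uniform(0.0, 1.0, size=(n, 2))   # hypothetical spatial locations
d = cdist(locs, locs)                       # pairwise distances

# Model covariance: exponential (Matern with nu = 1/2); parameters invented.
sigma2, phi = 1.0, 0.2
K = sigma2 * np.exp(-d / phi)

# Spherical taper with compact support gamma: exactly zero beyond gamma, so
# the elementwise (Schur) product K * T is sparse yet still positive definite.
gamma = 0.15
T = np.where(d < gamma, (1.0 - d / gamma) ** 2 * (1.0 + d / (2.0 * gamma)), 0.0)
K_tap = sparse.csc_matrix(K * T)
print(f"nonzero fraction of tapered covariance: {K_tap.nnz / n**2:.3f}")

# Simulate data from the untapered model, then evaluate the "one-taper"
# approximate Gaussian log-likelihood via a sparse factorization.
y = np.linalg.cholesky(K + 1e-10 * np.eye(n)) @ rng.standard_normal(n)
lu = splu(K_tap)
logdet = np.sum(np.log(np.abs(lu.U.diagonal())))  # K_tap is SPD, so det > 0
loglik = -0.5 * (n * np.log(2.0 * np.pi) + logdet + y @ lu.solve(y))
print(f"approximate log-likelihood: {loglik:.2f}")
```

With these settings, only a small fraction of the tapered matrix is nonzero, which is what lets sparse algorithms replace the O(n³) dense computation.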


The American Statistician | 1996

P Values: What They Are and What They Are Not

Mark J. Schervish

Abstract P values (or significance probabilities) have been used in place of hypothesis tests as a means of giving more information about the relationship between the data and the hypothesis than does a simple reject/do not reject decision. Virtually all elementary statistics texts cover the calculation of P values for one-sided and point-null hypotheses concerning the mean of a sample from a normal distribution. There is, however, a third case that is intermediate to the one-sided and point-null cases, namely the interval hypothesis, that receives no coverage in elementary texts. We show that P values are continuous functions of the hypothesis for fixed data. This allows a unified treatment of all three types of hypothesis testing problems. It also leads to the discovery that a common informal use of P values as measures of support or evidence for hypotheses has serious logical flaws.
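The "logical flaw" can be shown with a one-observation toy calculation (my own invented numbers, not an example from the paper): a hypothesis can be a subset of another yet receive the larger P value, which no coherent measure of support can allow.

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

x = 1.0  # observed value of X ~ N(theta, 1); number invented for illustration

# H1: theta <= 0 (one-sided).  p = P(X >= x | theta = 0).
p_one_sided = 1.0 - Phi(x)

# H2: theta = 0 (point null).  Usual two-sided p-value.
p_point = 2.0 * (1.0 - Phi(abs(x)))

print(f"p for H: theta <= 0 is {p_one_sided:.4f}")  # 0.1587
print(f"p for H: theta  = 0 is {p_point:.4f}")      # 0.3173
# {theta = 0} is a subset of {theta <= 0}, yet the smaller hypothesis gets
# the larger p-value, so p-values cannot coherently measure support.
```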


The International Journal of Robotics Research | 2003

Path Planning for Robotic Demining: Robust Sensor-Based Coverage of Unstructured Environments and Probabilistic Methods

Ercan U. Acar; Howie Choset; Yangang Zhang; Mark J. Schervish

Demining and unexploded ordnance (UXO) clearance are extremely tedious and dangerous tasks. The use of robots bypasses the hazards and potentially increases the efficiency of both tasks. A first crucial step towards robotic mine/UXO clearance is to locate all the targets. This requires a path planner that generates a path to pass a detector over all points of a mine/UXO field, i.e., a planner that is complete. The current state of the art in path planning for mine/UXO clearance is to move a robot randomly or use simple heuristics. These methods do not possess completeness guarantees, which are vital for locating all of the mines/UXOs. Using such random approaches is akin to intentionally using imperfect detectors. In this paper, we first review our prior complete coverage algorithm and compare it with randomized approaches. In addition to the provable guarantees, we demonstrate that complete coverage achieves coverage in shorter time than random coverage. We also show that the use of complete approaches enables the creation of a filter to reject bad sensor readings, which is necessary for successful deployment of robots. We propose a new approach to handle sensor uncertainty that uses geometrical and topological features rather than sensor uncertainty models. We have verified our results by performing experiments in unstructured indoor environments. Finally, for scenarios where some a priori information about a minefield is available, we expedite the demining process by introducing a probabilistic method so that a demining robot does not have to perform exhaustive coverage.
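The completeness claim can be illustrated on a toy grid. This is only a lawnmower sweep of an obstacle-free grid compared against a uniform random walk; it is not the authors' sensor-based coverage algorithm, and the grid size and seed are arbitrary.

```python
import random

# Toy contrast between complete and random coverage of an obstacle-free
# 10 x 10 grid of cells.
W = H = 10

def boustrophedon_path():
    """Lawnmower sweep: visits every cell exactly once, so it is complete."""
    path = []
    for r in range(H):
        cols = range(W) if r % 2 == 0 else range(W - 1, -1, -1)
        path.extend((r, c) for c in cols)
    return path

def random_walk_steps(seed=1):
    """Steps until a uniform random walk has happened to visit every cell."""
    random.seed(seed)
    r = c = 0
    seen = {(r, c)}
    steps = 0
    while len(seen) < W * H:
        dr, dc = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        if 0 <= r + dr < H and 0 <= c + dc < W:
            r, c = r + dr, c + dc
            seen.add((r, c))
        steps += 1
    return steps

path = boustrophedon_path()
assert len(set(path)) == W * H          # provably covers every cell
print(len(path), "steps for complete coverage")
print(random_walk_steps(), "steps for (eventual) random coverage")
```

The sweep covers all 100 cells in 100 steps by construction; the random walk only covers them eventually, typically after far more steps, which mirrors the abstract's comparison.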


The American Statistician | 1999

Bayes Factors: What They Are and What They Are Not

Michael Lavine; Mark J. Schervish

Abstract Bayes factors have been offered by Bayesians as alternatives to P values (or significance probabilities) for testing hypotheses and for quantifying the degree to which observed data support or conflict with a hypothesis. In an earlier article, Schervish showed how the interpretation of P values as measures of support suffers a certain logical flaw. In this article, we show how Bayes factors suffer that same flaw. We investigate the source of that problem and consider what are the appropriate interpretations of Bayes factors.
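The shared flaw can be reproduced in a three-point toy model (invented likelihood values, not an example from the article): a hypothesis that entails another can nonetheless receive the larger Bayes factor.

```python
# Three candidate parameter values with invented likelihoods and a uniform
# prior within each side of the test.
like = {1: 1.0, 2: 10.0, 3: 0.1}

def bayes_factor(H):
    """Bayes factor for H versus its complement, uniform prior within each."""
    Hc = [t for t in like if t not in H]
    marg_H = sum(like[t] for t in H) / len(H)
    marg_Hc = sum(like[t] for t in Hc) / len(Hc)
    return marg_H / marg_Hc

bf_small = bayes_factor({2})        # H1: theta = 2
bf_large = bayes_factor({2, 3})     # H2: theta in {2, 3}, so H1 entails H2
print(f"BF(H1) = {bf_small:.2f}, BF(H2) = {bf_large:.2f}")
# H1 entails H2, yet H1 receives the larger Bayes factor: the same logical
# flaw that afflicts p-values as measures of support.
```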


Journal of the American Statistical Association | 1990

State-Dependent Utilities

Mark J. Schervish; Teddy Seidenfeld; Joseph B. Kadane

Several axiom systems for preference among acts lead to the existence of a unique probability and a state-independent utility such that acts are ranked according to their expected utilities. These axioms have been used as a foundation for Bayesian decision theory and the subjective probability calculus. In this paper, we note that the uniqueness of the probability is relative to the choice of what counts as a constant outcome. Although it is sometimes clear what should be considered constant, there are many cases in which there are several possible choices. Each choice can lead to a different “unique” probability and utility. By focusing attention on state-dependent utilities, we determine conditions under which a truly unique probability and utility can be determined from an agent’s expressed preferences among acts. Suppose that an agent’s preference can be represented in terms of a probability P and a utility U. That is, the agent prefers one act to another if and only if the expected utility of the one act is higher than that of the other. There are many other equivalent representations in terms of probabilities Q, which are mutually absolutely continuous with P, and state-dependent utilities V, which differ from U by possibly different positive affine transformations in each state of nature. An example is described in which two different but equivalent state-independent utility representations exist for the same preference structure. What differs between the two representations is which acts count as constants. The acts involve receiving different amounts of one or the other of two currencies and the states are different exchange rates between the currencies. It is easy to see how it would not be possible for constant amounts of both currencies to simultaneously have constant values across the different states. Savage (Foundations of statistics. John Wiley, New York, 1954, sec. 5.5) discovered a situation in which two seemingly equivalent preference structures are represented by different pairs of probability and utility. Savage attributed the phenomenon to the construction of a “small world”. We show that the small world problem is just another example of two different, but equivalent, representations treating different acts as constants. Finally, we prove a theorem (similar to one of Karni, Decision making under uncertainty. Harvard University Press, Cambridge, 1985) that shows how to elicit a unique state-dependent utility and does not assume that there are prizes with constant value. To do this, we define a new hypothetical kind of act in which both the prize to be awarded and the state of nature are determined by an auxiliary experiment.
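A toy numeric version of the two-currencies phenomenon can be checked directly (the exchange rates and prior are made up for illustration): two different probability-utility pairs represent exactly the same preferences among acts.

```python
import random

# Two states of nature = two exchange rates (dollars per pound); all numbers
# are invented for illustration.  An act assigns a dollar amount to each state.
e = {"s1": 2.0, "s2": 4.0}
P = {"s1": 0.5, "s2": 0.5}       # representation 1: prior P with

def U(dollars):                  # state-independent dollar utility
    return dollars

C = sum(P[s] * e[s] for s in e)
Q = {s: P[s] * e[s] / C for s in e}  # representation 2: a different prior with

def V(s, dollars):                   # state-dependent (pound-value) utility
    return dollars / e[s] * C

def eu1(act):
    return sum(P[s] * U(act[s]) for s in act)

def eu2(act):
    return sum(Q[s] * V(s, act[s]) for s in act)

random.seed(0)
for _ in range(1000):
    f = {s: random.uniform(0.0, 100.0) for s in e}
    g = {s: random.uniform(0.0, 100.0) for s in e}
    assert (eu1(f) > eu1(g)) == (eu2(f) > eu2(g))  # identical rankings

print("P =", P)
print("Q =", Q)  # a different "unique" probability for the same preferences
```

Here Q absorbs the exchange rates into the probability while V rescales utility state by state, so the two representations are behaviorally indistinguishable, which is exactly why the "unique" probability depends on what counts as a constant.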


Journal of Computational and Graphical Statistics | 1992

On the Convergence of Successive Substitution Sampling

Mark J. Schervish; Bradley P. Carlin

Abstract The problem of finding marginal distributions of multidimensional random quantities has many applications in probability and statistics. Many of the solutions currently in use are very computationally intensive. For example, in a Bayesian inference problem with a hierarchical prior distribution, one is often driven to multidimensional numerical integration to obtain marginal posterior distributions of the model parameters of interest. Recently, however, a group of Monte Carlo integration techniques that fall under the general banner of successive substitution sampling (SSS) have proven to be powerful tools for obtaining approximate answers in a very wide variety of Bayesian modeling situations. Answers may also be obtained at low cost, both in terms of computer power and user sophistication. Important special cases of SSS include the “Gibbs sampler” described by Gelfand and Smith and the “IP algorithm” described by Tanner and Wong. The major problem plaguing users of SSS is the difficulty in ascertaining convergence.
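The Gibbs-sampler special case of SSS is easy to sketch for a bivariate normal target. This is an illustration of the general idea, not code from the article; the correlation, chain length, and burn-in are arbitrary choices.

```python
import math
import random

# Gibbs sampling (a special case of SSS) for a bivariate normal target with
# zero means, unit variances, and correlation rho: each full conditional is
# X | Y = y ~ N(rho * y, 1 - rho^2), and symmetrically for Y | X = x.
random.seed(0)
rho = 0.8
sd = math.sqrt(1.0 - rho ** 2)

x, y = 0.0, 0.0                     # arbitrary starting point
draws = []
for i in range(20000):
    x = random.gauss(rho * y, sd)   # substitute a new x given the current y
    y = random.gauss(rho * x, sd)   # substitute a new y given the new x
    if i >= 2000:                   # discard burn-in
        draws.append((x, y))

m = len(draws)
mean_x = sum(a for a, _ in draws) / m
corr = sum(a * b for a, b in draws) / m   # estimates E[XY] = rho
print(f"mean of X is about {mean_x:.3f}; correlation about {corr:.3f}")
```

The sample moments settle near the target values (mean 0, correlation 0.8), but deciding how long "settling" takes is precisely the convergence-assessment difficulty the abstract raises.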


Probability Theory and Related Fields | 1984

The Extent of Non-Conglomerability of Finitely Additive Probabilities

Mark J. Schervish; Teddy Seidenfeld; Joseph B. Kadane

Summary An arbitrary finitely additive probability can be decomposed uniquely into a convex combination of a countably additive probability and a purely finitely additive (PFA) one. The coefficient of the PFA probability is an upper bound on the extent to which conglomerability may fail in a finitely additive probability with that decomposition. If the probability is defined on a σ-field, the bound is sharp. Hence, non-conglomerability (or equivalently non-disintegrability) characterizes finitely as opposed to countably additive probability. Nonetheless, there exists a PFA probability which is simultaneously conglomerable over an arbitrary finite set of partitions. Neither conglomerability nor non-conglomerability in a given partition is closed under convex combinations. But the convex combination of PFA ultrafilter probabilities, each of which cannot be made conglomerable in a common margin, is singular with respect to any finitely additive probability that is conglomerable in that margin.


Journal of the American Statistical Association | 1996

Reasoning to a Foregone Conclusion

Joseph B. Kadane; Mark J. Schervish; Teddy Seidenfeld

Abstract When can a Bayesian select an hypothesis H and design an experiment (or a sequence of experiments) to make certain that, given the experimental outcome(s), the posterior probability of H will be greater than its prior probability? We discuss an elementary result that establishes sufficient conditions under which this reasoning to a foregone conclusion cannot occur. We illustrate how when the sufficient conditions fail, because probability is finitely but not countably additive, it may be that a Bayesian can design an experiment to lead his/her posterior probability into a foregone conclusion. The problem has a decision theoretic version in which a Bayesian might rationally pay not to see the outcome of certain cost-free experiments, which we discuss from several perspectives. Also, we relate this issue in Bayesian hypothesis testing to various concerns about “optional stopping.”


Archive | 1990

Decisions Without Ordering

Teddy Seidenfeld; Mark J. Schervish; Joseph B. Kadane

We review the axiomatic foundations of subjective utility theory with a view toward understanding the implications of each axiom. We consider three different approaches, namely, the construction of utilities in the presence of canonical probabilities, the construction of probabilities in the presence of utilities, and the simultaneous construction of both probabilities and utilities. We focus attention on the axioms of independence and weak ordering. The independence axiom is seen to be necessary in order to prevent a form of Dutch Book in sequential problems.


Annals of Probability | 2001

Improper regular conditional distributions

Teddy Seidenfeld; Mark J. Schervish; Joseph B. Kadane

Improper regular conditional distributions (rcds) given a σ-field A have the following anomalous property. For sets A ∈ A, Pr(A | A) is not always equal to the indicator of A. Such a property makes the conditional probability puzzling as a representation of uncertainty. When rcds exist and the σ-field A is countably generated, then almost surely the rcd is proper. We give sufficient conditions for an rcd to be improper in a maximal sense, and show that these conditions apply to the tail σ-field and the σ-field of symmetric events.

Collaboration


Dive into Mark J. Schervish's collaborations.

Top Co-Authors

Teddy Seidenfeld (Carnegie Mellon University)
Joseph B. Kadane (Carnegie Mellon University)
Mitchell J. Small (Carnegie Mellon University)
Daniel J. McDonald (Indiana University Bloomington)
Howie Choset (Carnegie Mellon University)
Ercan U. Acar (Carnegie Mellon University)
William F. Eddy (Carnegie Mellon University)