Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Peter Hall is active.

Publication


Featured research published by Peter Hall.


Annals of Statistics | 2006

Properties of principal component methods for functional and longitudinal data analysis

Peter Hall; Hans-Georg Müller; Jane-Ling Wang

The use of principal component methods to analyze functional data is appropriate in a wide range of different settings. In studies of functional data analysis, it has often been assumed that a sample of random functions is observed precisely, in the continuum and without noise. While this has been the traditional setting for functional data analysis, in the context of longitudinal data analysis a random function typically represents a patient, or subject, who is observed at only a small number of randomly distributed points, with nonnegligible measurement error. Nevertheless, essentially the same methods can be used in both these cases, as well as in the vast number of settings that lie between them. How is performance affected by the sampling plan? In this paper we answer that question. We show that if there is a sample of n functions, or subjects, then estimation of eigenvalues is a semiparametric problem, with root-n consistent estimators, even if only a few observations are made of each function, and if each observation is encumbered by noise. However, estimation of eigenfunctions becomes a nonparametric problem when observations are sparse. The optimal convergence rates in this case are those which pertain to more familiar function-estimation settings. We also describe the effects of sampling at regularly spaced points, as opposed to random points. In particular, it is shown that there are often advantages in sampling randomly. However, even in the case of noisy data there is a threshold sampling rate (depending on the number of functions treated) above which the rate of sampling (either randomly or regularly) has negligible impact on estimator performance, no matter whether eigenfunctions or eigenvalues are being estimated.
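
The pooling idea at the heart of this setting can be sketched in a few lines. Below is a minimal illustration, not the authors' procedure: sparse, noisy observations from many subjects are pooled to estimate the mean and covariance surface on a grid, diagonal pairs are excluded because they carry the measurement-error variance, and eigenvalues and eigenfunctions come from an eigendecomposition. The simulated model, bandwidth, and grid size are illustrative assumptions.

```python
# A minimal sketch (not the authors' procedure) of sparse functional PCA:
# pool the few noisy observations from every subject to estimate the mean
# and covariance on a grid, skip diagonal pairs (they carry the measurement-
# error variance), then eigendecompose.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, grid = 200, np.linspace(0.0, 1.0, 51)

# Simulate: X_i(t) = xi_i1*phi1(t) + xi_i2*phi2(t), seen at 4-6 random points.
phi1 = lambda t: np.sqrt(2) * np.sin(2 * np.pi * t)
phi2 = lambda t: np.sqrt(2) * np.cos(2 * np.pi * t)
subjects = []
for _ in range(n_subjects):
    t = np.sort(rng.uniform(0.0, 1.0, rng.integers(4, 7)))
    x = 2.0 * rng.normal() * phi1(t) + 0.5 * rng.normal() * phi2(t)
    subjects.append((t, x + 0.2 * rng.normal(size=t.size)))  # add noise

h = 0.1  # Gaussian smoothing bandwidth (illustrative)

def local_mean(t0):
    """Pooled kernel-weighted mean of all observations near t0."""
    num = den = 0.0
    for t, y in subjects:
        w = np.exp(-0.5 * ((t - t0) / h) ** 2)
        num += np.sum(w * y)
        den += np.sum(w)
    return num / den

mu = np.array([local_mean(t0) for t0 in grid])

# Covariance surface from within-subject cross-products, diagonal excluded.
C = np.zeros((grid.size, grid.size))
W = np.zeros_like(C)
for t, y in subjects:
    r = y - np.interp(t, grid, mu)
    for j in range(t.size):
        wj = np.exp(-0.5 * ((grid - t[j]) / h) ** 2)
        for k in range(t.size):
            if j == k:
                continue
            wk = np.exp(-0.5 * ((grid - t[k]) / h) ** 2)
            w = np.outer(wj, wk)
            C += w * r[j] * r[k]
            W += w
C /= W

# Eigendecomposition on the grid; operator eigenvalues ~ matrix ones * spacing.
evals = np.linalg.eigh(C)[0][::-1]
print((evals[:3] * (grid[1] - grid[0])).round(2))  # near (4.0, 0.25, 0) up to bias
```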


Annals of Statistics | 2010

INNOVATED HIGHER CRITICISM FOR DETECTING SPARSE SIGNALS IN CORRELATED NOISE

Peter Hall; Jiashun Jin

Higher Criticism is a method for detecting signals that are both sparse and weak. Although first proposed in cases where the noise variables are independent, Higher Criticism also has reasonable performance in settings where those variables are correlated. In this paper we show that, by exploiting the nature of the correlation, performance can be improved by using a modified approach which exploits the potential advantages that correlation has to offer. Indeed, it turns out that the case of independent noise is the most difficult of all, from a statistical viewpoint, and that more accurate signal detection (for a given level of signal sparsity and strength) can be obtained when correlation is present. We characterize the advantages of correlation by showing how to incorporate them into the definition of an optimal detection boundary. The boundary has particularly attractive properties when correlation decays at a polynomial rate or the correlation matrix is Toeplitz.
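
For concreteness, here is a minimal sketch of the standard Higher Criticism statistic that the paper builds on; the "innovated" step, which transforms the data using the correlation structure, is not reproduced. The sample size, sparsity level, signal strength, and alpha0 cutoff below are illustrative choices.

```python
# A minimal sketch of the standard Higher Criticism statistic; the
# "innovated" transform that exploits correlation is omitted.
import numpy as np
from scipy.stats import norm

def higher_criticism(z, alpha0=0.5):
    """HC* = max over the smallest alpha0*n p-values of
    sqrt(n) * (i/n - p_(i)) / sqrt(p_(i) * (1 - p_(i)))."""
    n = z.size
    p = np.sort(2.0 * norm.sf(np.abs(z)))        # two-sided p-values, ascending
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1.0 - p))
    return hc[i <= alpha0 * n].max()

rng = np.random.default_rng(1)
n = 10_000
null = rng.normal(size=n)                          # pure noise
alt = null.copy()
alt[rng.choice(n, size=50, replace=False)] += 2.5  # sparse, weak signals
print(f"HC null: {higher_criticism(null):.2f}  "
      f"HC with signal: {higher_criticism(alt):.2f}")
```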


Proceedings of SPIE | 2004

The infrared spectrograph on the Spitzer Space Telescope

James R. Houck; Thomas L. Roellig; Jeff Van Cleve; William J. Forrest; Terry L. Herter; C. R. Lawrence; Keith Matthews; Harold J. Reitsema; B. T. Soifer; Dan M. Watson; D. Weedman; Marty Huisjen; John R. Troeltzsch; D. J. Barry; J. Bernard-Salas; Craig Blacken; Bernhard R. Brandl; V. Charmandaris; D. Devost; G. E. Gull; Peter Hall; Charles P. Henderson; S. James U. Higdon; Bruce Pirger; Justin Schoenwald; Greg C. Sloan; Keven Isao Uchida; Philip N. Appleton; Lee Armus; M. J. Burgdorf

The Infrared Spectrograph (IRS) is one of three science instruments on the Spitzer Space Telescope. The IRS comprises four separate spectrograph modules covering the wavelength range from 5.3 to 38 μm with spectral resolutions, R~90 and 650, and it was optimized to take full advantage of the very low background in the space environment. The IRS is performing at or better than the pre-launch predictions. An autonomous target acquisition capability enables the IRS to locate the mid-infrared centroid of a source, providing the information so that the spacecraft can accurately offset that centroid to a selected slit. This feature is particularly useful when taking spectra of sources with poorly known coordinates. An automated data reduction pipeline has been developed at the Spitzer Science Center.
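
As a rough illustration of the centroiding step in target acquisition, and emphatically not the IRS flight software, a flux-weighted centroid above a background threshold can be computed as follows; the 3-sigma cut and the synthetic frame are assumptions made for the sketch.

```python
# A rough illustration of flux-weighted centroiding: find a source's center
# on a frame so the pointing system can offset it into a slit.
import numpy as np

def centroid(image, sigma_cut=3.0):
    """Flux-weighted (x, y) centroid of pixels above a background threshold."""
    bg = np.median(image)
    mask = image > bg + sigma_cut * image.std()
    flux = np.where(mask, image - bg, 0.0)
    ys, xs = np.indices(image.shape)
    return (xs * flux).sum() / flux.sum(), (ys * flux).sum() / flux.sum()

# Synthetic point source at pixel (12.3, 20.7) on a noisy 32x32 frame.
ys, xs = np.indices((32, 32))
img = 100.0 * np.exp(-((xs - 12.3) ** 2 + (ys - 20.7) ** 2) / 8.0)
img += np.random.default_rng(2).normal(5.0, 1.0, img.shape)
print(centroid(img))  # should land close to (12.3, 20.7)
```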


Annals of Statistics | 2008

PROPERTIES OF HIGHER CRITICISM UNDER STRONG DEPENDENCE

Peter Hall; Jiashun Jin

The problem of signal detection using sparse, faint information is closely related to a variety of contemporary statistical problems, including the control of false-discovery rate, and classification using very high-dimensional data. Each problem can be solved by conducting a large number of simultaneous hypothesis tests, the properties of which are readily accessed under the assumption of independence. In this paper we address the case of dependent data, in the context of higher criticism methods for signal detection. Short-range dependence has no first-order impact on performance, but the situation changes dramatically under strong dependence. There, although higher criticism can continue to perform well, it can be bettered using methods based on differences of signal values or on the maximum of the data. The relatively inferior performance of higher criticism in such cases can be explained in terms of the fact that, under strong dependence, the higher criticism statistic behaves as though the data were partitioned into very large blocks, with all but a single representative of each block being eliminated from the dataset.
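
The blocking intuition in the last sentence can be seen in a toy simulation: with a single shared noise factor, null quantiles of the maximum shrink, so a max-based test has an easier time flagging a genuine spike. The equicorrelated model below is an illustrative stand-in, not one of the dependence structures analyzed in the paper.

```python
# Toy simulation of the "one big block" intuition: a single shared noise
# factor makes the null maximum much smaller than under independence.
import numpy as np

rng = np.random.default_rng(3)
n, reps, rho = 2_000, 500, 0.9

# Z_i = sqrt(rho)*W + sqrt(1-rho)*e_i with one shared factor W per replicate.
w = rng.normal(size=(reps, 1))
e = rng.normal(size=(reps, n))
corr_max = np.abs(np.sqrt(rho) * w + np.sqrt(1 - rho) * e).max(axis=1)
indep_max = np.abs(rng.normal(size=(reps, n))).max(axis=1)

print(f"95% null quantile of max|Z|: {np.quantile(corr_max, 0.95):.2f} "
      f"(rho={rho}) vs {np.quantile(indep_max, 0.95):.2f} (independent)")
```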


Annals of Statistics | 2013

A simple bootstrap method for constructing nonparametric confidence bands for functions

Peter Hall; Joel L. Horowitz

Standard approaches to constructing nonparametric confidence bands for functions are frustrated by the impact of bias, which generally is not estimated consistently when using the bootstrap and conventionally smoothed function estimators. To overcome this problem it is common practice to either undersmooth, so as to reduce the impact of bias, or oversmooth, and thereby introduce an explicit or implicit bias estimator. However, these approaches, and others based on nonstandard smoothing methods, complicate the process of inference, for example by requiring the choice of new, unconventional smoothing parameters and, in the case of undersmoothing, producing relatively wide bands. In this paper we suggest a new approach, which exploits to our advantage one of the difficulties that, in the past, have prevented an attractive solution to this problem: the fact that the standard bootstrap bias estimator suffers from relatively high-frequency stochastic error. The high frequency, together with a technique based on quantiles, can be exploited to dampen down the stochastic error term, leading to relatively narrow, simple-to-construct confidence bands.
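
For context, here is a minimal sketch of the conventional paired-bootstrap percentile band that such constructions start from; it does not reproduce the paper's quantile-based damping of the bootstrap bias error. The Nadaraya-Watson estimator, bandwidth, grid, and simulated model are illustrative choices.

```python
# A minimal sketch of a conventional paired-bootstrap percentile band for a
# kernel regression estimate; the paper's refinement is not reproduced here.
import numpy as np

rng = np.random.default_rng(4)
n, h = 300, 0.08
x = rng.uniform(0.0, 1.0, n)
y = np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=n)
grid = np.linspace(0.05, 0.95, 60)

def nw(xq, xs, ys):
    """Nadaraya-Watson estimate of E(Y|X=xq) with a Gaussian kernel."""
    w = np.exp(-0.5 * ((xs[None, :] - xq[:, None]) / h) ** 2)
    return (w * ys).sum(axis=1) / w.sum(axis=1)

fhat = nw(grid, x, y)

# Resample (X_i, Y_i) pairs and re-estimate on the grid.
boot = np.empty((500, grid.size))
for b in range(boot.shape[0]):
    idx = rng.integers(0, n, n)
    boot[b] = nw(grid, x[idx], y[idx])
lo, hi = np.quantile(boot, [0.025, 0.975], axis=0)
print("pointwise band width at x = 0.5:", round(float((hi - lo)[30]), 3))
```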


Journal of the American Statistical Association | 2009

Nonparametric Prediction in Measurement Error Models

Raymond J. Carroll; Aurore Delaigle; Peter Hall

Predicting the value of a variable Y corresponding to a future value of an explanatory variable X, based on a sample of previously observed independent data pairs (X1, Y1), …, (Xn, Yn) distributed like (X, Y), is very important in statistics. In the error-free case, where X is observed accurately, this problem is strongly related to that of standard regression estimation, since prediction of Y can be achieved via estimation of the regression curve E(Y|X). When the observed Xi's and the future observation of X are measured with error, prediction is of a quite different nature. Here, if T denotes the future (contaminated) available version of X, prediction of Y can be achieved via estimation of E(Y|T). In practice, estimating E(Y|T) can be quite challenging, as data may be collected under different conditions, making the measurement errors on Xi and X nonidentically distributed. We take up this problem in the nonparametric setting and introduce estimators which allow a highly adaptive approach to smoothing. Reflecting the complexity of the problem, optimal rates of convergence of estimators can vary from the semiparametric n−1/2 rate to much slower rates that are characteristic of nonparametric problems. Nevertheless, we are able to develop highly adaptive, data-driven methods that achieve very good performance in practice. Supplementary materials for this article are available online. Acknowledgments: Carroll's research was supported by grants from the National Cancer Institute (CA57030, CA104620). Delaigle's work was partially supported by a fellowship from the Maurice Belz Foundation. Hall's work was partially supported by the Australian Research Council and by a grant from the National Science Foundation (DMS 0604698).
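
The central distinction, regressing Y on the contaminated T rather than plugging T into an estimate of E(Y|X), can be seen in a small simulation. Everything below (the model, error scale, kernel smoother, and bandwidth) is an illustrative assumption; the smoother is a plain kernel estimator, not the authors' adaptive method.

```python
# A small simulation of the central point: when only a contaminated T = X + U
# is available at prediction time, estimate E(Y|T) directly from the (T_i, Y_i)
# pairs instead of plugging T into a fit of E(Y|X).
import numpy as np

rng = np.random.default_rng(5)
n = 2_000
x = rng.normal(size=n)
t = x + 0.8 * rng.normal(size=n)     # classical measurement error
y = x ** 2 + 0.3 * rng.normal(size=n)

def nw(xq, xs, ys, h=0.25):
    """Nadaraya-Watson regression of ys on xs, evaluated at xq."""
    w = np.exp(-0.5 * ((xs[None, :] - xq[:, None]) / h) ** 2)
    return (w * ys).sum(axis=1) / w.sum(axis=1)

# Future data: only t_new is observed, never x_new.
x_new = rng.normal(size=1_000)
t_new = x_new + 0.8 * rng.normal(size=1_000)
y_new = x_new ** 2 + 0.3 * rng.normal(size=1_000)

pred_T = nw(t_new, t, y)         # regress Y on T: targets E(Y|T)
pred_naive = nw(t_new, x, y)     # oracle fit of E(Y|X), evaluated at T
for name, p in [("regress on T", pred_T), ("plug T into E(Y|X)", pred_naive)]:
    print(f"{name}: prediction MSE = {np.mean((y_new - p) ** 2):.3f}")
```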


Journal of the American Statistical Association | 2012

Comment: Robustness to Assumption of Normally Distributed Errors

Aurore Delaigle; Peter Hall


Journal of The Royal Statistical Society Series B-statistical Methodology | 2004

Low order approximations in deconvolution and regression with errors in variables

Raymond J. Carroll; Peter Hall


Journal of The Royal Statistical Society Series B-statistical Methodology | 2011

Robustness and accuracy of methods for high dimensional data analysis based on Student's t‐statistic

Aurore Delaigle; Peter Hall; Jiashun Jin


Journal of The Royal Statistical Society Series B-statistical Methodology | 2007

Non-parametric regression estimation from data contaminated by a mixture of Berkson and classical errors

Raymond J. Carroll; Aurore Delaigle; Peter Hall

Collaboration


Dive into Peter Hall's collaboration.

Top Co-Authors

Jiashun Jin

Carnegie Mellon University

B. T. Soifer

California Institute of Technology

C. R. Lawrence

California Institute of Technology
