Publication


Featured research published by H. Joseph Newton.


The Economic Journal | 1988

Timeslab: A Time Series Analysis Laboratory

H. Joseph Newton


Technometrics | 1982

Using Periodic Autoregressions for Multiple Spectral Estimation

H. Joseph Newton

A new method of estimating the spectral density of a multiple time series based on the concept of periodically stationary autoregressive processes is described and illustrated. It is shown that the method can often overcome some difficulties inherent in the traditional smoothed periodogram and autoregressive spectral-estimation methods and that additional insights into the structure of a multiple time series can be obtained by using periodic autoregressions.
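
As a point of reference for the method described above, here is a minimal sketch (in Python with NumPy, not from the paper) of ordinary autoregressive spectral estimation for a single series: fit an AR(p) by solving the Yule-Walker equations and evaluate the implied spectral density. The function name ar_spectrum and the order and frequency-grid choices are illustrative assumptions; the paper's method generalizes this idea by replacing the single AR fit with a periodically stationary autoregression for a multiple series.

    import numpy as np

    def ar_spectrum(x, p, freqs):
        # Illustrative sketch, not the paper's algorithm: AR(p) spectral
        # density estimate obtained from the Yule-Walker equations.
        x = np.asarray(x, float) - np.mean(x)
        n = len(x)
        r = np.array([x[:n - k] @ x[k:] / n for k in range(p + 1)])   # autocovariances
        R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
        phi = np.linalg.solve(R, r[1:])          # AR coefficients
        sigma2 = r[0] - phi @ r[1:]              # innovation variance
        k = np.arange(1, p + 1)
        denom = np.abs(1 - np.exp(-1j * np.outer(freqs, k)) @ phi) ** 2
        return sigma2 / (2 * np.pi) / denom      # f(w) on the given grid

    # e.g. f_hat = ar_spectrum(x, 4, np.linspace(0.01, np.pi, 512))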


Journal of the American Statistical Association | 1983

A Method for Determining Periods in Time Series

H. Joseph Newton; Marcello Pagano

The periods corresponding to peaks in the autoregressive and window spectral density estimators are shown to be consistent and asymptotically normal estimators of the peak periods in the true spectral density, under the assumption of known autoregressive order and under slight modifications of the assumptions usually made to show the asymptotic normality of window estimators, respectively. The asymptotic variances are obtained, and the use of the theory is illustrated by computing an asymptotic confidence interval for a peak period of a set of hormone data.
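
To make the estimand concrete: a peak period is just 2*pi divided by the frequency at which the spectral density estimate peaks. The sketch below (an illustration, not the authors' procedure, which additionally supplies asymptotic variances and hence confidence intervals) locates the dominant peak of a spectral estimate evaluated on a uniform grid, refining it by parabolic interpolation; peak_period is an assumed name.

    import numpy as np

    def peak_period(f_hat, freqs):
        # Point estimate of the dominant peak period from a spectral density
        # estimate (AR or window based) evaluated on a uniform grid.
        i = int(np.argmax(f_hat))
        if 0 < i < len(freqs) - 1:
            # parabolic interpolation around the grid maximum
            y0, y1, y2 = f_hat[i - 1], f_hat[i], f_hat[i + 1]
            omega = freqs[i] + 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2) * (freqs[1] - freqs[0])
        else:
            omega = freqs[i]
        return 2 * np.pi / omega    # period in sampling-interval units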


Siam Journal on Scientific and Statistical Computing | 1983

The Finite Memory Prediction of Covariance Stationary Time Series

H. Joseph Newton; Marcello Pagano

An algorithm is presented for conveniently calculating h-step-ahead minimum mean square error linear predictors and prediction variances given a finite number of observations from a covariance stationary time series Y. It is shown that elements of the modified Cholesky decomposition of the covariance matrix of the observations play the role in finite memory prediction that the coefficients of the infinite-order moving average representation of Y play in infinite memory prediction. A by-product of the algorithm is the extension of Pagano's result (J. Assoc. Comput. Mach., 23 (1976), pp. 310–316) on the convergence down the diagonal of the Cholesky factors of a banded Toeplitz matrix to a similar result for a general Toeplitz matrix. This result is applied to autoregressive moving average time series. A numerical example illustrating the results of the paper is presented.
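
The quantity the algorithm computes can be written down directly: the h-step-ahead predictor is g'Gn^{-1}y with variance gamma(0) - g'Gn^{-1}g, where Gn is the Toeplitz covariance matrix of the observations and g collects the cross-covariances with Y_{n+h}. The sketch below (Python with NumPy/SciPy; a brute-force O(n^3) definition of the target, not the paper's efficient Cholesky-based recursion) makes that concrete; the function name is an assumption.

    import numpy as np
    from scipy.linalg import toeplitz, cho_factor, cho_solve

    def finite_memory_predict(gamma, y, h):
        # Brute-force h-step-ahead best linear predictor; gamma must hold
        # the autocovariances gamma[0], ..., gamma[n + h - 1].
        gamma = np.asarray(gamma, float)
        n = len(y)
        Gn = toeplitz(gamma[:n])                # covariance of (Y_1, ..., Y_n)
        g = gamma[h:h + n][::-1]                # Cov(Y_{n+h}, (Y_1, ..., Y_n))
        c = cho_solve(cho_factor(Gn), g)        # prediction coefficients
        return c @ y, gamma[0] - c @ g          # predictor and its variance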


Communications in Statistics - Simulation and Computation | 1995

Using symbolic math to evaluate saddlepoint approximations for the difference of order statistics

Jane L. Harvill; H. Joseph Newton

We show how symbolic math can be used to calculate saddlepoint approximations for the difference of order statistics from any continuous parent distribution. The difference of order statistics is commonly used as a test statistic in nonparametric tests for constancy of a process. The usefulness of the program is shown by examining the steps required to obtain saddlepoint approximations for a single distribution. A sample session of the program is presented for the interquartile range of a sample of size 100 from an exponential distribution with mean 2. The results are graphically compared to the true density.
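
The core symbolic steps behind any such saddlepoint approximation are differentiating the cumulant generating function K, solving K'(s) = x for the saddlepoint, and evaluating exp(K(s_hat) - s_hat*x) / sqrt(2*pi*K''(s_hat)). A minimal sketch with SymPy (standing in for the symbolic system the authors used, and applied to a single exponential with mean 2 rather than to the difference of order statistics treated by their program):

    import sympy as sp

    s, x = sp.symbols('s x', real=True)
    lam = sp.Rational(1, 2)                 # exponential rate 1/2, i.e. mean 2
    K = -sp.log(1 - s / lam)                # cumulant generating function, s < lam
    shat = sp.solve(sp.Eq(sp.diff(K, s), x), s)[0]   # saddlepoint: K'(s) = x
    K2 = sp.diff(K, s, 2)                   # K''(s)
    f_hat = sp.exp(K - s * x) / sp.sqrt(2 * sp.pi * K2)
    print(sp.simplify(f_hat.subs(s, shat))) # saddlepoint density approximation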


The American Statistician | 2000

StatConcepts: A Visual Tour of Statistical Ideas

Marcello Pagano; H. Joseph Newton; Jane L. Harvill

StatConcepts: A Visual Tour of Statistical Ideas is a collection of programs written in the language of StataQuest. Its twenty-eight labs give students a powerful tool for graphical, interactive exploration of statistical concepts. StatConcepts is not intended as a text, but as a supplement. Its focus is on the correct interpretation and understanding of statistical concepts, terminology, and results, not on computation. Although the labs are intended primarily for introductory statistics courses, they can be valuable in courses at all levels.


Archive | 1981

On Some Numerical Properties of Arma Parameter Estimation Procedures

H. Joseph Newton

This paper reviews the algorithms used by statisticians for obtaining efficient estimators of the parameters of a univariate autoregressive moving average (ARMA) time series. The connection of the estimation problem with the problem of prediction is investigated, with particular emphasis on the Kalman filter and modified Cholesky decomposition algorithms. A result from prediction theory is given which provides a significant reduction in the computations needed in Ansley's (1979) estimation procedure. Finally, it is pointed out that there are many useful facts in the literature of control theory that need to be investigated by statisticians interested in estimation and prediction problems in linear time series models.
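
For a sense of what these algorithms deliver in practice, the following sketch (assuming the statsmodels Python package, which is not mentioned in the paper) simulates an ARMA(1,1) series and obtains efficient estimates by exact Gaussian maximum likelihood, evaluated internally with a Kalman filter in the state-space form the review discusses:

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    n = 500
    e = rng.standard_normal(n)
    y = np.zeros(n)
    for t in range(1, n):                   # y_t = 0.7 y_{t-1} + e_t + 0.4 e_{t-1}
        y[t] = 0.7 * y[t - 1] + e[t] + 0.4 * e[t - 1]

    res = ARIMA(y, order=(1, 0, 1), trend='n').fit()   # exact ML via Kalman filter
    print(res.params)                       # AR and MA coefficients, innovation variance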


Chemometrics and Intelligent Laboratory Systems | 1991

S-PLUS for Unix and DOS

H. Joseph Newton

The New S language is a remarkably effective tool for performing scientific data analysis. S-PLUS, an expanded version of New S, has significant features missing from New S, including nonlinear optimization and several multivariate analysis and time series techniques, to name but a few. The reader should be aware, however, that S-PLUS does require some effort to use: it is command-driven rather than menu-driven, and it requires learning the arguments of some functions. That small amount of effort is greatly rewarded every time the system is used, not only to analyze data but to create remarkably good graphics for presentations and publication.


Journal of Statistical Computation and Simulation | 1990

A recursive in order algorithm for least squares estimates of an autoregressive process

Katherine B. Ensor; H. Joseph Newton

An algorithm which is recursive in the order is presented to compute the least squares estimates for autoregressive models of orders 1, ..., M. The algorithm requires 12nM + 28M² multiplications and additions, where n is the length of the observed series, to obtain the estimates for all M models. While other similar recursive algorithms have recently been developed, the algorithm proposed in this paper can be easily described in terms of adding and deleting rows and columns of a sequence of regression design matrices. The proof of the algorithm is based on three general regression theorems, one of which we call "Cybenko's Theorem", which provides the tools for obtaining a new set of estimates from the previous estimates. The remaining two theorems provide the tools for deleting observations from various regressions. The algorithm is computationally efficient, and the proof is straightforward and understandable.
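
The order-recursive structure can be conveyed with the classical Levinson-Durbin recursion, which computes Yule-Walker (rather than exact least squares) fits of every order 1, ..., M in a single pass. The sketch below is that standard recursion, shown only to illustrate the kind of recursion in order that the paper develops for the least squares problem; the function name is an assumption.

    import numpy as np

    def ar_fits_all_orders(x, M):
        # Levinson-Durbin: Yule-Walker AR fits of orders 1..M in one pass.
        x = np.asarray(x, float) - np.mean(x)
        n = len(x)
        r = np.array([x[:n - k] @ x[k:] / n for k in range(M + 1)])
        phi = np.zeros((M + 1, M + 1))      # phi[m, 1:m+1] is the order-m fit
        v = np.zeros(M + 1)                 # v[m] is the order-m residual variance
        v[0] = r[0]
        for m in range(1, M + 1):
            kappa = (r[m] - phi[m - 1, 1:m] @ r[1:m][::-1]) / v[m - 1]
            phi[m, m] = kappa
            phi[m, 1:m] = phi[m - 1, 1:m] - kappa * phi[m - 1, 1:m][::-1]
            v[m] = v[m - 1] * (1 - kappa ** 2)
        return phi, v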


Journal of Statistical Computation and Simulation | 1981

On the stationarity of multiple autoregressive approximants: theory and algorithms

H. Joseph Newton; Marcello Pagano

Numerical methods are presented for determining the stability of a multiple autoregressive scheme. The question of stability is important in prediction theory, and the methods for determining stability also yield information that is very useful for fitting time series models to data.
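
A standard way to state the stability condition in question: write the multiple autoregression in companion form and require every eigenvalue of the companion matrix to lie strictly inside the unit circle. The sketch below applies that textbook criterion, not the specialized algorithms the paper develops; the function name is an assumption.

    import numpy as np

    def var_is_stable(A):
        # A is the list [A_1, ..., A_p] of d x d coefficient matrices in
        # Y_t = A_1 Y_{t-1} + ... + A_p Y_{t-p} + e_t.
        p, d = len(A), A[0].shape[0]
        comp = np.zeros((p * d, p * d))
        comp[:d, :] = np.hstack(A)              # top block row: A_1 ... A_p
        comp[d:, :-d] = np.eye((p - 1) * d)     # shifted identity blocks
        lam = np.linalg.eigvals(comp)
        return bool(np.all(np.abs(lam) < 1)), lam

    # e.g. var_is_stable([np.array([[0.5, 0.1], [0.0, 0.4]])])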

Collaboration


Dive into H. Joseph Newton's collaboration.

Top Co-Authors

Edward I. George
University of Pennsylvania

Tuanfeng Li
Beijing University of Chemical Technology