Publication


Featured research published by Alan J. Miller.


Journal of the Royal Statistical Society. Series A (General) | 1984

Selection of Subsets of Regression Variables

Alan J. Miller

Various computational algorithms are studied, considering only linear models and the least-squares criterion.


Applied Statistics | 1994

A Fedorov Exchange Algorithm for D-optimal Design

Alan J. Miller; Nam-Ky Nguyen

Optimal design algorithms are particularly useful for creating designs in difficult situations, such as when experimental conditions enforce inconvenient block sizes. Several packages for computer-aided design of experiments are now available on personal computers (PCs), either as Fortran source code or as easy-to-use commercial packages. Most of these packages could not find a good design for, say, a 2³ × 3² × 14 experiment in blocks of size 10. This kind of requirement is not unusual when there are, say, 14 species of timber or types of cheese, and the block size is dictated by the number of experimental runs that can be carried out in one day or from one batch of material. Our algorithm allows blocking, including the use of unequal block sizes. Another important application is in augmenting an experiment which has already been carried out: the algorithm allows some of the candidate points, those in the completed experiment, to be forced into the design. Let X be an N × k matrix containing N possible or candidate design points. We want to find a subset of n out of the N points which maximizes the determinant D = |XₙᵀXₙ|.
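The exchange idea can be sketched in a few lines of Python. This is a minimal, unblocked illustration (the function name and structure are mine, not the authors' published code, which also handles blocking and forced-in points): repeatedly swap a design point for a candidate point whenever the swap increases D = |XₙᵀXₙ|.

```python
import numpy as np

def fedorov_exchange(X, n, iters=100, seed=0):
    """Minimal unblocked exchange sketch: choose n of the N candidate rows
    of X to maximise D = |Xn' Xn|. Hypothetical simplification of a
    Fedorov-style exchange; not the published blocked algorithm."""
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    idx = list(rng.choice(N, size=n, replace=False))

    def det_of(rows):
        Xn = X[rows]
        return np.linalg.det(Xn.T @ Xn)

    best = det_of(idx)
    for _ in range(iters):
        improved = False
        for i in range(n):               # design point to consider dropping
            for j in range(N):           # candidate point to bring in
                if j in idx:
                    continue
                trial = list(idx)
                trial[i] = j
                d = det_of(trial)
                if d > best:             # accept any determinant-increasing swap
                    best, idx, improved = d, trial, True
        if not improved:                 # local optimum reached
            break
    return sorted(int(i) for i in idx), float(best)
```

For a one-factor design with an intercept and candidate levels 0..4, the exchange pushes the two chosen points to the extremes, as D-optimality predicts.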


Applied Statistics | 1992

Algorithm AS 274: Least Squares Routines to Supplement Those of Gentleman

Alan J. Miller

Gentleman (1974) gave some routines (algorithm AS 75) in Algol for linear least-squares calculations; Griffiths and Hill (1985) contains a translation into Fortran. They do not include routines for changing the order of variables, for handling singularities, for calculating correlations or partial correlations from the Cholesky factorization, for calculating regression coefficients for a subset of the variables, or for calculating the estimated covariances of such regression coefficients. Clarke (1981) provided an algorithm for changing the order of variables, but it requires the elements of the Cholesky factorization to be stored in a different order from Gentleman's, and it overwrites the implicit 1s on the diagonal of the upper triangular factor with the row multipliers. The Cholesky factor is represented here as D^{1/2}R, where D is a diagonal matrix of row multipliers and R is an upper triangular matrix with 1s on its diagonal. In the Fortran code, the diagonal of D is stored in array D, and R is stored by rows, excluding its diagonal elements, in the one-dimensional array RBAR. This is exactly the same as in algorithm AS 75. This set of routines can be used with orthogonal reductions produced by Gentleman's routines, or on their own. The basic algorithm used for updating or changing the order of variables is that used by Gentleman, though the present author has for many years used the so-called fast planar rotation method described by Hammarling (1974). The Hammarling algorithm is faster for large numbers of variables, but its diagonal multipliers need fairly frequent rescaling, which produces longer code and partially destroys the speed advantage. For compatibility with algorithm AS 75, the Hammarling algorithm has not been used here. The algorithm used by Gentleman is not well suited when there are more variables than observations. For instance, in near-infrared spectroscopy it is common to have say 500 variables (wavelengths) but only say 30 cases. Algorithm AS 75 requires the storage of the whole upper triangle of 124 750 elements, rather than just the first 30 rows of non-zero values, which requires about 15 000 storage locations. If the user is confident that there are no linear dependencies among any 30 of the variables, then
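Gentleman's square-root-free planar-rotation update, which these routines build on, can be sketched as follows. This is an illustrative Python transcription, not the published Fortran: RBAR is stored here as a two-dimensional array for readability rather than packed row-by-row into a one-dimensional array, and the names includ and regcf only echo the published routine names.

```python
import numpy as np

def includ(d, rbar, thetab, w, xrow, y):
    """Fold one observation (weight w, predictors xrow, response y) into a
    square-root-free Givens factorisation in the style of Gentleman's AS 75.
    d: diagonal multipliers of D; rbar: strict upper triangle of the unit
    upper-triangular factor R; thetab: transformed responses."""
    p = len(d)
    x = np.array(xrow, dtype=float)
    for i in range(p):
        if w == 0.0:
            return
        xi = x[i]
        if xi == 0.0:
            continue
        dpi = d[i] + w * xi * xi
        cbar, sbar = d[i] / dpi, w * xi / dpi
        w, d[i] = cbar * w, dpi
        for k in range(i + 1, p):        # rotate the rest of the row into R
            xk = x[k]
            x[k] = xk - xi * rbar[i, k]
            rbar[i, k] = cbar * rbar[i, k] + sbar * xk
        yk = y                            # same rotation applied to the response
        y = yk - xi * thetab[i]
        thetab[i] = cbar * thetab[i] + sbar * yk

def regcf(rbar, thetab):
    """Back-substitute R beta = thetab (R unit upper triangular)."""
    p = len(thetab)
    beta = np.zeros(p)
    for i in range(p - 1, -1, -1):
        beta[i] = thetab[i] - rbar[i, i + 1:] @ beta[i + 1:]
    return beta
```

Feeding in rows (1, x) of the exact line y = 2 + 3x recovers the coefficients by back-substitution, without ever forming the normal equations.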


The American Statistician | 1996

The Convergence of Efroymson's Stepwise Regression Algorithm

Alan J. Miller

Abstract The stepwise regression algorithm that is widely used is due to Efroymson. He stated that the F-to-remove value had to be not greater than the F-to-enter value, but did not show that the algorithm could not cycle. Until now, nobody appears to have shown this. To prove that the algorithm does converge, an objective function is introduced. It is shown that this objective function decreases, or can occasionally remain constant, at each step in the algorithm, and hence the algorithm cannot cycle provided that Efroymson's condition is satisfied.
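The algorithm being analysed can be sketched as follows. This is a simplified illustration with hypothetical names and thresholds, refitting each candidate model from scratch with least squares (production code updates a factorization instead); the assert records Efroymson's condition.

```python
import numpy as np

def rss(X, y, cols):
    """Residual sum of squares of the least-squares fit on the given columns."""
    beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
    r = y - X[:, cols] @ beta
    return float(r @ r)

def efroymson_stepwise(X, y, f_enter=10.0, f_remove=9.0):
    """Sketch of Efroymson-style stepwise selection (hypothetical defaults)."""
    assert f_remove <= f_enter           # Efroymson's convergence condition
    n, k = X.shape
    S = [0]                              # column 0 is the intercept, always kept
    while True:
        changed = False
        # F-to-remove: drop the weakest included variable if its F is too small
        if len(S) > 1:
            fs = [(rss(X, y, [c for c in S if c != j]) - rss(X, y, S))
                  / (rss(X, y, S) / (n - len(S))) for j in S[1:]]
            j_min = int(np.argmin(fs))
            if fs[j_min] < f_remove:
                S.remove(S[1:][j_min]); changed = True
        # F-to-enter: add the strongest excluded variable if its F is large enough
        out = [j for j in range(k) if j not in S]
        if out:
            fs = [(rss(X, y, S) - rss(X, y, S + [j]))
                  / (rss(X, y, S + [j]) / (n - len(S) - 1)) for j in out]
            j_max = int(np.argmax(fs))
            if fs[j_max] > f_enter:
                S.append(out[j_max]); changed = True
        if not changed:
            return sorted(S)
```

On simulated data where only two of four predictors matter, the procedure brings both true variables into the model.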


Journal of Statistical Planning and Inference | 1997

2ᵐ fractional factorial designs of resolution V with high A-efficiency, 7 ⩽ m ⩽ 10

Nam-Ky Nguyen; Alan J. Miller

Abstract We present 111 2ᵐ fractional factorial designs of resolution V for 7 ⩽ m ⩽ 10. These designs are the best known to the authors with respect to the A-optimality criterion (as of October 1995).


Archive | 1996

Estimation After Model Building: A First Step

Alan J. Miller

Suppose that a set of data is used to build a model and the same data are then used to estimate parameters in the model. If classical methods such as least squares or maximum likelihood are used to estimate the parameters as if the model had been decided a priori, then the parameter estimates will be biased. Only regression subset selection procedures are considered here, but the same problems of over-fitting exist with most model-building procedures. Chatfield (1995) has reviewed some of the problems of inference after model building, while Bancroft & Han (1977) give an extensive bibliography.
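The bias is easy to demonstrate by simulation; a minimal sketch (all settings arbitrary): when the data choose which coefficient to report, the reported estimate is biased away from zero even though every true coefficient is zero.

```python
import numpy as np

# True model: y is pure noise, so every regression coefficient is zero.
rng = np.random.default_rng(42)
n, k, sims = 30, 10, 2000
selected, prechosen = [], []
for _ in range(sims):
    X = rng.standard_normal((n, k))
    y = rng.standard_normal(n)
    # univariate least-squares slope for each candidate variable
    b = (X.T @ y) / (X * X).sum(axis=0)
    selected.append(np.abs(b).max())   # report the "best-looking" variable
    prechosen.append(abs(b[0]))        # report a variable fixed a priori
print(f"mean |estimate|, selected:  {np.mean(selected):.3f}")
print(f"mean |estimate|, prechosen: {np.mean(prechosen):.3f}")
```

The average reported effect for the data-selected variable is markedly larger than for the variable chosen in advance, even though both have true coefficient zero: the selection step itself inflates the estimate.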


Journal of the Royal Statistical Society, Series A (Statistics in Society) | 1992

Subset Selection in Regression.

Anthony B. Atkinson; Alan J. Miller

8. Subset Selection in Regression (Monographs on Statistics and Applied Probability, no. 40). By A. J. Miller. ISBN 0 412 35380 6. Chapman and Hall, London, 1990. 240 pp. £25.00.


Archive | 1990

Subset Selection in Regression

Alan J. Miller



Applied Statistics | 1974

A Note on the Analysis of Gap-Acceptance in Traffic

Alan J. Miller

Collaboration


Top co-authors of Alan J. Miller:

Nam-Ky Nguyen (Commonwealth Scientific and Industrial Research Organisation)
Anthony B. Atkinson (London School of Economics and Political Science)