
Publication


Featured research published by Maurice G. Cox.


Metrologia | 2002

The evaluation of key comparison data

Maurice G. Cox

Guidelines containing two procedures are proposed for the evaluation of key comparison data. They apply to the simple circulation of a single travelling standard around all the participants. The application of the procedures to a specific set of key comparison data provides a key comparison reference value (KCRV) and the associated uncertainty, the degree of equivalence of the measurement made by each participating national institute and the degrees of equivalence between measurements made by all pairs of participating institutes. Procedure A is based on the use of the weighted mean, together with consistency checks based on classical statistics regarding its applicability. Should the checks fail, action to remedy the situation is suggested. If the remedy is inappropriate, Procedure B can be applied instead. It is based on the use of the median (or another informed choice) as a more robust estimator in the circumstances.
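The core of Procedure A, with the median fallback of Procedure B, can be sketched in a few lines of Python. This is only an illustration of the idea, not the guideline text: the function names are invented here, and the chi-squared consistency threshold is assumed to be supplied by the caller (e.g. the 95th percentile of the chi-squared distribution with n - 1 degrees of freedom).

```python
import math
from statistics import median

def kcrv_weighted_mean(values, uncertainties):
    """Procedure A sketch: inverse-variance weighted mean, its standard
    uncertainty, and the observed chi-squared statistic used in the
    consistency check."""
    weights = [1.0 / u**2 for u in uncertainties]
    y = sum(w * x for w, x in zip(weights, values)) / sum(weights)
    u_y = math.sqrt(1.0 / sum(weights))
    chi_sq = sum(((x - y) / u)**2 for x, u in zip(values, uncertainties))
    return y, u_y, chi_sq

def kcrv(values, uncertainties, chi_sq_threshold):
    """Use the weighted mean as the KCRV if the consistency check passes;
    otherwise fall back to the median (Procedure B sketch)."""
    y, u_y, chi_sq = kcrv_weighted_mean(values, uncertainties)
    if chi_sq <= chi_sq_threshold:
        return y
    return median(values)
```

With mutually consistent results the weighted mean is returned; a single discrepant result inflates the observed chi-squared and triggers the robust median estimate instead.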


Metrologia | 2006

The use of a Monte Carlo method for evaluating uncertainty and expanded uncertainty

Maurice G. Cox; Bernd R. L. Siebert

The Guide to the Expression of Uncertainty in Measurement (GUM) is the internationally accepted master document for the evaluation of uncertainty. It contains a procedure that is suitable for many, but not all, uncertainty evaluation problems met in practice. This procedure constitutes an approximation to the general solution of the Markov formula, which infers the probability density function (PDF) for the output quantities (measurands) from the model of the measurement and the PDFs for the input quantities. This paper shows that a Monte Carlo method is an effective and versatile tool for determining the PDF for the measurands. This method provides a consistent Bayesian approach to the evaluation of uncertainty. Although in principle straightforward, some care is required in representing and validating the results obtained using the method. The paper provides guidance on optimizing the approach, identifies some pitfalls and indicates means for validating the results.
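The Monte Carlo idea itself is straightforward: draw inputs from their PDFs, evaluate the model, and summarize the resulting output sample. The following minimal Python sketch illustrates the principle only, under the assumption of independent inputs; the function name, trial count and fixed seed are choices made here, and the paper's guidance on adaptively choosing the number of trials and validating results is not reproduced.

```python
import math
import random

def monte_carlo_uncertainty(model, input_samplers, n_trials=100_000, seed=1):
    """Propagate distributions through a measurement model by Monte Carlo:
    sample the inputs, evaluate the model, and report the sample mean,
    standard deviation and a 95% coverage interval for the output."""
    rng = random.Random(seed)
    outputs = [model(*(s(rng) for s in input_samplers)) for _ in range(n_trials)]
    mean = sum(outputs) / n_trials
    var = sum((y - mean)**2 for y in outputs) / (n_trials - 1)
    outputs.sort()
    lo = outputs[int(0.025 * n_trials)]
    hi = outputs[int(0.975 * n_trials)]
    return mean, math.sqrt(var), (lo, hi)

# Example: Y = X1 + X2 with independent Gaussian inputs.
mean, u, interval = monte_carlo_uncertainty(
    lambda x1, x2: x1 + x2,
    [lambda r: r.gauss(1.0, 0.1), lambda r: r.gauss(2.0, 0.2)],
)
```

For this linear model the sample standard deviation approaches sqrt(0.1**2 + 0.2**2), matching the usual law of propagation of uncertainty.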


Metrologia | 2007

The evaluation of key comparison data: determining the largest consistent subset

Maurice G. Cox

Suppose a single stable travelling standard is circulated around the national metrology institutes (NMIs) participating in a key comparison. Consider the set of data consisting of a measurement result, comprising a measured value and the associated standard uncertainty, provided independently by each such NMI. Each measured value is the corresponding NMI's best estimate of a single stipulated property of the standard. The weighted mean (WM) of the measured values can be formed, the weights being proportional to the reciprocals of the squared standard uncertainties. If this WM is consistent with the measured values according to a statistical test, it can be accepted as a key comparison reference value for the comparison. Otherwise, the WM of a largest consistent subset (LCS) can be determined. The LCS contains as many as possible of those results of participating NMIs that are consistent with the WM of that subset. An efficient approach for determining the LCS having smallest chi-squared value is described, and applied to length, temperature and ionizing radiation comparisons.
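The LCS idea can be illustrated with a simple greedy loop in Python: while the subset fails the consistency check, drop the result contributing most to chi-squared. Note this greedy loop is only a sketch of the concept under a fixed caller-supplied threshold; the paper describes an efficient search for the LCS with smallest chi-squared value, and a proper implementation would also update the threshold as the subset size changes.

```python
def weighted_mean_chi_sq(results):
    """Inverse-variance weighted mean and observed chi-squared of a list
    of (measured value, standard uncertainty) pairs."""
    weights = [(x, 1.0 / u**2) for x, u in results]
    sw = sum(w for _, w in weights)
    y = sum(x * w for x, w in weights) / sw
    chi_sq = sum(w * (x - y)**2 for x, w in weights)
    return y, chi_sq

def largest_consistent_subset(results, chi_sq_limit):
    """Greedy sketch: repeatedly remove the result with the largest
    chi-squared contribution until the subset passes the check."""
    subset = list(results)
    while len(subset) > 1:
        y, chi_sq = weighted_mean_chi_sq(subset)
        if chi_sq <= chi_sq_limit:
            break
        subset.remove(max(subset, key=lambda r: ((r[0] - y) / r[1])**2))
    return subset
```

A single discrepant result is removed first, after which the remaining results are mutually consistent and their weighted mean can serve as the reference value.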


Measurement Science and Technology | 2006

Measurement uncertainty and traceability

Maurice G. Cox; Peter M. Harris

Obtaining confidence in a measured value requires a quantitative statement of its quality, which in turn necessitates the evaluation of the uncertainty associated with the value. The basis for the value and the associated uncertainty is traceability of measurement, involving the relationship of relevant quantities to national or international standards through an unbroken chain of measurement comparisons. Each comparison involves calibration of a standard at one level in the chain using a standard at a higher level. Global economy considerations mean that this basis also requires the national measurement institutes to carry out comparative assessment of the degree of equivalence of national standards through their participation in key comparisons. The evaluation of uncertainty of measurement is founded on the use of models of measurement for each stage of the chain and at the highest level to interrelate national standards. Basic aspects of uncertainty evaluation are covered in this paper, and forms for the above types of model considered, with attention given to least squares as a basis for calibration curves (and certain other types of calibration) and also for key comparison data evaluation.
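The simplest instance of least squares as a basis for a calibration curve is an ordinary least-squares straight line fitted to calibration data. The following Python sketch shows only that elementary case, under the assumption of equal-weight, uncorrelated observations; the paper's treatment is more general.

```python
def fit_line(x, y):
    """Ordinary least-squares straight line y ~ a + b*x: the simplest
    calibration-curve case, with equal weights."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx)**2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx          # slope
    a = my - b * mx        # intercept
    return a, b
```

In a traceability chain, a fit of this kind relates an instrument's readings at one level to reference values realized by a standard at the level above.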


Metrologia | 2007

The area under a curve specified by measured values

Maurice G. Cox

The problem addressed is that of determining numerical approximations to the area under a curve specified by arbitrarily spaced data. A formulation of this problem is given in which the data are used to model the curve as a piecewise polynomial, each piece having the same degree. That piecewise function is integrated to provide an approximation to the area. A corresponding compound quadrature rule is derived. Different degrees of polynomial give rise to different orders of quadrature rule. The widely used trapezoidal rule is a special case, as is the Gill–Miller rule. The remainder of the paper is concerned with evaluating the measurement uncertainties associated with the approximations to the area obtained by the use of these rules in the case where the data ordinates correspond to measured values having stated associated uncertainties. The case in which there is correlation associated with the measured values, frequently arising when the measured values are obtained using the same measuring instrument, is also treated. A statistical test is used to select a suitable polynomial degree.
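For the degree-1 (trapezoidal) special case with uncorrelated ordinate uncertainties, both the compound rule and the propagated uncertainty take a simple closed form, since the area is a linear combination of the ordinates. The Python sketch below covers only that special case; the paper's higher-degree rules and treatment of correlated ordinates are not reproduced.

```python
import math

def trapezoid_area_with_uncertainty(x, y, u):
    """Compound trapezoidal rule for arbitrarily spaced abscissae x,
    ordinates y and uncorrelated standard uncertainties u.
    The area is sum(w_i * y_i), so its variance is sum((w_i * u_i)**2)."""
    n = len(x)
    w = [0.0] * n
    for i in range(n - 1):
        h = x[i + 1] - x[i]          # width of panel i
        w[i] += h / 2.0
        w[i + 1] += h / 2.0
    area = sum(wi * yi for wi, yi in zip(w, y))
    u_area = math.sqrt(sum((wi * ui)**2 for wi, ui in zip(w, u)))
    return area, u_area
```

Correlation between ordinates, e.g. from a common instrument, would add covariance cross-terms to the variance sum.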


Metrologia | 2006

The generalized weighted mean of correlated quantities

Maurice G. Cox; Christopher Eio; Giovanni Mana; Francesca Pennecchi

An analysis is presented of the generalized weighted mean of a set of quantities, some or all of which may be correlated, and the variance associated with that mean. The cases covered specifically are (a) two correlated quantities, (b) a general number of quantities, two of which are correlated and (c) a general number of quantities, all of which are correlated. For (c), two instances are treated, namely where the quantity variances are equal and unequal. Consistency of the data representing realizations of the quantities concerned is shown to be important in terms of appreciating the results obtained.
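Case (a), two correlated quantities, has a well-known closed form: the minimum-variance linear combination of the two values given their uncertainties and correlation coefficient. The Python sketch below shows that standard formula as an illustration; the function name is invented here, and the paper's cases (b) and (c) generalize to more quantities.

```python
import math

def gwm_two_correlated(x1, u1, x2, u2, r):
    """Generalized weighted mean of two correlated quantities:
    the minimum-variance unbiased combination of x1 and x2 with
    standard uncertainties u1, u2 and correlation coefficient r."""
    c = r * u1 * u2                      # covariance of x1 and x2
    d = u1**2 + u2**2 - 2.0 * c          # must be nonzero (r != 1 with u1 == u2)
    w1 = (u2**2 - c) / d
    w2 = (u1**2 - c) / d
    y = w1 * x1 + w2 * x2
    u_y = math.sqrt((u1**2 * u2**2 - c**2) / d)
    return y, u_y
```

With r = 0 this reduces to the ordinary inverse-variance weighted mean; positive correlation increases the variance associated with the mean relative to the uncorrelated case.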


Metrologia | 2011

Spectrometer bandwidth correction for generalized bandpass functions

Emma Woolliams; Réjean Baribeau; Agnieszka Bialek; Maurice G. Cox

The bandpass of spectrometers can cause appreciable errors when making radiometric measurements. This paper describes a practical method for correcting a set of equispaced measured values provided by a spectrometer with a finite bandwidth, an arbitrary bandpass function and at an arbitrary wavelength step. The paper reviews the limits of the approach for real spectra in the presence of measurement noise and suggests ways of reducing the effect of noise.
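A classical special case gives the flavour of such a correction: for a triangular bandpass whose half-width equals the sampling step, a second-difference correction removes the bandwidth error exactly for locally quadratic spectra. The Python sketch below shows only that textbook special case, not the generalized method for arbitrary bandpass functions developed in the paper.

```python
def bandwidth_correct(measured):
    """Second-difference bandwidth correction for equispaced measured
    values, assuming a triangular bandpass whose half-width equals the
    sampling step. Endpoint values are left uncorrected."""
    corrected = list(measured)
    for i in range(1, len(measured) - 1):
        corrected[i] = measured[i] - (
            measured[i - 1] - 2.0 * measured[i] + measured[i + 1]
        ) / 12.0
    return corrected
```

Because the correction differentiates the data twice, it amplifies measurement noise, which is why the paper examines the limits of correction for real, noisy spectra.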


Metrologia | 2004

The use of a mixture of probability distributions in temperature interlaboratory comparisons

Patrizia Ciarlini; Maurice G. Cox; Franco Pavese; Giuseppe Regoliosi

Several studies highlight the need for appropriate statistical and probabilistic tools to analyse the data provided by the participants in an interlaboratory comparison. In some temperature comparisons, where the measurand is a physical state, independent realizations of the same physical state are acquired in each participating institute, which should be considered as belonging to a single super-population. This paper introduces the use of a probabilistic tool, a mixture of probability distributions, to represent the overall population in such a temperature comparison. This super-population is defined by combining the local populations in given proportions. The mixture density function identifies the total data variability, and the key comparison reference value has a natural definition as the expectation value of this probability density.
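The two ingredients, the mixture density and its expectation, are simple to state. The Python sketch below assumes normal component populations purely for illustration; the mixture construction itself does not require normality.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution with mean mu and standard
    deviation sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma)**2) / (sigma * math.sqrt(2 * math.pi))

def mixture_pdf(x, proportions, means, sigmas):
    """Density of the super-population: local populations combined
    in given mixing proportions (normal components assumed here)."""
    return sum(p * normal_pdf(x, m, s)
               for p, m, s in zip(proportions, means, sigmas))

def mixture_mean(proportions, means):
    """Expectation of the mixture: the proportion-weighted average of
    the component means. In this approach it defines the KCRV."""
    return sum(p * m for p, m in zip(proportions, means))
```

The mixture mean depends only on the proportions and component means, while the mixture density also captures the total variability of the comparison data.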


Metrologia | 1994

Uncertainty Modelling in Mass Comparisons

Walter Bich; Maurice G. Cox; Peter M. Harris

To evaluate uncertainty in mass measurements with accuracy and in accordance with recent international documents, a general model is developed, which takes account of the various contributing quantities in a multivariate context. On this basis, the variance-covariance matrix of the in-vacuo mass differences is constructed in its general form and tailored for application to some of the most commonly adopted weighing methods. The usual assumption of equal-variance, uncorrelated observations is shown to be inappropriate for mass comparisons.


Metrologia | 2008

A probabilistic approach to the analysis of measurement processes

Maurice G. Cox; Giovanni Battista Rossi; Peter M. Harris; Alistair Forbes

We consider a probabilistic model of the measurement process, based on identifying two main sub-processes, named observation and restitution. Observation constitutes the transformations involved in producing the observable output. Restitution constitutes the determination of the measurand (the quantity measured) from the observable output, and includes data processing. After providing a probabilistic representation of the observation sub-process, we derive appropriate formulae for addressing restitution and describing the overall measurement process. The model allows the treatment in probabilistic terms of both the random and systematic effects that influence the measurement process, and may prove particularly useful in the formulation phase of uncertainty evaluation. We also discuss the different ways in which the measurand can be characterized by a probability distribution, and demonstrate the application of the approach to the analysis of risk in conformance testing.
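Once the measurand is characterized by a probability distribution, the risk analysis in conformance testing reduces to computing the probability that the measurand lies inside a tolerance interval. The Python sketch below assumes a normal characterization purely as a common special case; the paper treats the distributional characterization more generally.

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution, computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def conformance_probability(y, u, lower, upper):
    """Probability that the measurand lies within [lower, upper],
    assuming it is characterized by a normal distribution with mean y
    and standard uncertainty u."""
    return normal_cdf(upper, y, u) - normal_cdf(lower, y, u)
```

Comparing this probability against a decision threshold then quantifies the risk of wrongly accepting or rejecting an item.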

Collaboration


Dive into Maurice G. Cox's collaborations.

Top Co-Authors

Peter M. Harris, National Physical Laboratory
Alistair Forbes, National Physical Laboratory
Emma Woolliams, National Physical Laboratory
Lena Johansson, National Physical Laboratory
Martin J. T. Milton, National Physical Laboratory
T J Esward, National Physical Laboratory
I M Smith, National Physical Laboratory
Neil J. Harrison, National Physical Laboratory
Nigel P. Fox, National Physical Laboratory