Publication


Featured research published by Peter S. Craig.


Journal of the American Statistical Association | 2001

Bayesian Forecasting for Complex Systems Using Computer Simulators

Peter S. Craig; Michael Goldstein; Jonathan Rougier; Allan Seheult

Although computer models are often used for forecasting future outcomes of complex systems, the uncertainties in such forecasts are not usually treated formally. We describe a general Bayesian approach for using a computer model or simulator of a complex system to forecast system outcomes. The approach is based on constructing beliefs derived from a combination of expert judgments and experiments on the computer model. These beliefs, which are systematically updated as we make runs of the computer model, are used for either Bayesian or Bayes linear forecasting for the system. Issues of design and diagnostics are described in the context of forecasting. The methodology is applied to forecasting for an active hydrocarbon reservoir.
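The Bayes linear adjustment underlying this approach can be sketched in the scalar case. This is a minimal illustration using the standard Bayes linear formulas; the function name and the numbers are ours, not taken from the paper:

```python
# Scalar Bayes linear adjustment: update beliefs about a system outcome B
# after observing a simulator output D = d_obs. Illustrative sketch only.

def bayes_linear_adjust(e_b, var_b, e_d, var_d, cov_bd, d_obs):
    """Return the adjusted expectation and variance of B given D = d_obs."""
    gain = cov_bd / var_d                  # regression coefficient of B on D
    adj_mean = e_b + gain * (d_obs - e_d)  # adjusted expectation E_D(B)
    adj_var = var_b - cov_bd * gain        # adjusted variance Var_D(B)
    return adj_mean, adj_var

# Hypothetical prior: E(B)=10, Var(B)=4; simulator: E(D)=10, Var(D)=5, Cov(B,D)=2
mean, var = bayes_linear_adjust(10.0, 4.0, 10.0, 5.0, 2.0, 12.0)
# mean = 10 + (2/5) * 2 = 10.8, var = 4 - 2 * (2/5) = 3.2
```

Observing a simulator run above its prior expectation shifts the forecast upward and reduces its variance, which is the qualitative behaviour the abstract describes.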


The American Statistician | 1991

Dynamic Graphics for Exploring Spatial Data with Application to Locating Global and Local Anomalies

John Haslett; Ronan Bradley; Peter S. Craig; Antony Unwin; Graham Wills

Abstract We explore the application of dynamic graphics to the exploratory analysis of spatial data. We introduce a number of new tools and illustrate their use with prototype software, developed at Trinity College, Dublin. These tools are used to examine local variability—anomalies—through plots of the data that display its marginal and multivariate distributions, through interactive smoothers, and through plots motivated by the spatial auto-covariance ideas implicit in the variogram. We regard these as alternative and linked views of the data. We conclude that the most important single view of the data is the Map View: All other views must be cross-referred to this, and the software must encourage this. The view can be enriched by overlaying other pertinent spatial information. We draw attention to the possibilities of one-many linking, and to the use of line-objects to link pairs of data points. We draw attention to the parallels with work on Geographical Information Systems.


The Statistician | 1998

Constructing partial prior specifications for models of complex physical systems

Peter S. Craig; Michael Goldstein; Allan Seheult; James A. Smith

Summary. Many large scale problems, particularly in the physical sciences, are solved using complex, high dimensional models whose outputs, for a given set of inputs, are expensive and time consuming to evaluate. The complexity of such problems forces us to focus attention on those limited aspects of uncertainty which are directly relevant to the tasks for which the model will be used. We discuss methods for constructing the relevant partial prior specifications for these uncertainties, based on the prior covariance structure. Our approach combines two sources of prior knowledge. First, we elicit both qualitative and quantitative prior information based on expert prior judgments, using computer-based elicitation tools for organizing the complex collection of assessments in a systematic way. Secondly, we test and refine these judgments using detailed experiments based on versions of the model which are cheaper to evaluate. Although the approach is quite general, we illustrate it in the context of matching hydrocarbon reservoir history.


Proceedings of the Institution of Mechanical Engineers, Part H: Journal of Engineering in Medicine | 1995

Measurement of wear in retrieved acetabular sockets.

R M Hall; A Unsworth; Peter S. Craig; C Hardaker; P Siney; B M Wroblewski

Twenty-eight standard Charnley sockets were retrieved at revision surgery. The penetration angle with respect to the cup coordinate system, β, and penetration depth, d, of the sockets were measured using both the traditional shadowgraph technique and by analysing data obtained from a coordinate measuring machine (CMM). In addition, d was deduced radiographically from pre-revision X-rays. Limits of agreement between the three methods of measuring d were of the order of ±0.5 mm. Using the data obtained from the CMM it was possible to deduce the wear volume Vmeas directly. It was found that, in general, values of the wear volume calculated from d and β using equations cited elsewhere (1, 2) were both imprecise and inaccurate. The direct measurement of the wear volume using the CMM depends on the location of reference points external to the wear surface. If such surfaces were damaged, then it was concluded that the shadowgraph technique provided the most suitable method for measuring the dimensional changes in the retrieved socket, due to its relative ease of use.


Environmental Toxicology and Chemistry | 2008

Making species salinity sensitivity distributions reflective of naturally occurring communities: Using rapid testing and Bayesian statistics

Graeme L. Hickey; Ben J. Kefford; Jason E. Dunlop; Peter S. Craig

Species sensitivity distributions (SSDs) may accurately predict the proportion of species in a community that are at hazard from environmental contaminants only if they contain sensitivity data from a large sample of species representative of the mix of species present in the locality or habitat of interest. With current widely accepted ecotoxicological methods, however, this rarely occurs. Two recent suggestions address this problem. First, use rapid toxicity tests, which are less rigorous than conventional tests, to approximate experimentally the sensitivity of many species quickly and in approximate proportion to naturally occurring communities. Second, use expert judgements regarding the sensitivity of higher taxonomic groups (e.g., orders) and Bayesian statistical methods to construct SSDs that reflect the richness (or perceived importance) of these groups. Here, we describe and analyze several models from a Bayesian perspective to construct SSDs from data derived using rapid toxicity testing, combining both rapid test data and expert opinion. We compare these new models with two frequentist approaches, Kaplan-Meier and a log-normal distribution, using a large data set on the salinity sensitivity of freshwater macroinvertebrates from Victoria (Australia). The frequentist log-normal analysis produced an SSD that overestimated the hazard to species relative to the Kaplan-Meier and Bayesian analyses. Of the Bayesian analyses investigated, the introduction of a weighting factor to account for the richness (or importance) of taxonomic groups influenced the calculated hazard to species. Furthermore, Bayesian methods allowed us to determine credible intervals representing SSD uncertainty. We recommend that rapid tests, expert judgements, and novel Bayesian statistical methods be used so that SSDs reflect communities of organisms found in nature.
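The frequentist log-normal SSD used as a comparison baseline above can be sketched with the standard approach of fitting a normal distribution to log10-transformed toxicity values. The helper names and the salinity values below are hypothetical, not taken from the Victorian data set:

```python
import math
import statistics

# Minimal log-normal SSD sketch: fit a normal distribution to log10 toxicity
# values, then read off the fraction of species affected at a concentration.
# All values below are invented for illustration.

def fit_lognormal_ssd(toxicity_values):
    """Return mean and sd of log10-transformed toxicity values."""
    logs = [math.log10(v) for v in toxicity_values]
    return statistics.mean(logs), statistics.stdev(logs)

def fraction_affected(concentration, mu, sigma):
    """Fraction of species affected: normal CDF evaluated on the log10 scale."""
    z = (math.log10(concentration) - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical salinity tolerances (e.g., LC50s in mS/cm) for eight species
lc50s = [8.0, 12.0, 15.0, 20.0, 25.0, 33.0, 41.0, 55.0]
mu, sigma = fit_lognormal_ssd(lc50s)
```

By construction, exactly half the species are affected at the geometric mean of the fitted values, and the affected fraction increases monotonically with concentration.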


EFSA Journal | 2017

Guidance on dermal absorption

Harrie Buist; Peter S. Craig; Ian Dewhurst; Susanne Hougaard Bennekou; Carsten Kneuer; Kyriaki Machera; Christina Pieper; Daniele Court Marques; Gilles Guillot; Federica Ruffo; Arianna Chiusolo

Abstract This guidance on the assessment of dermal absorption has been developed to assist notifiers, users of test facilities and Member State authorities on critical aspects related to the setting of dermal absorption values to be used in risk assessments of active substances in Plant Protection Products (PPPs). It is based on the ‘scientific opinion on the science behind the revision of the guidance document on dermal absorption’ issued in 2011 by the EFSA Panel on Plant Protection Products and their Residues (PPR). The guidance refers to the EFSA PPR opinion in many instances. In addition, the first version of this guidance, issued in 2012 by the EFSA PPR Panel, has been revised in 2017 on the basis of new available data on human in vitro dermal absorption for PPPs and wherever clarifications were needed. Basic details of experimental design, available in the respective test guidelines and accompanying guidance for the conduct of studies, have not been addressed but recommendations specific to performing and interpreting dermal absorption studies with PPPs are given. Issues discussed include a brief description of the skin and its properties affecting dermal absorption. To facilitate use of the guidance, flow charts are included. Guidance is also provided, for example, when there are no data on dermal absorption for the product under evaluation. Elements for a tiered approach are presented including use of default values, data on closely related products, in vitro studies with human skin (regarded to provide the best estimate), data from experimental animals (rats) in vitro and in vivo, and the so called ‘triple pack’ approach. Various elements of study design and reporting that reduce experimental variation and aid consistent interpretation are presented. A proposal for reporting data for assessment reports is also provided. The issue of nanoparticles in PPPs is not addressed. 
Data from volunteer studies have not been discussed since their use is not allowed in the EU for risk assessment of PPPs.


Science of The Total Environment | 2015

Towards a landscape scale management of pesticides: ERA using changes in modelled occupancy and abundance to assess long-term population impacts of pesticides.

Chris J. Topping; Peter S. Craig; Frank de Jong; Michael Klein; Ryszard Laskowski; Barbara Manachini; Silvia Pieper; Robert Smith; José Paulo Sousa; Franz Streissl; Klaus Swarowsky; A. Tiktak; Ton Van Der Linden

Pesticides are regulated in Europe and this process includes an environmental risk assessment (ERA) for non-target arthropods (NTA). Traditionally a non-spatial or field trial assessment is used. In this study we exemplify the introduction of a spatial context to the ERA as well as suggest a way in which the results of complex models, necessary for proper inclusion of spatial aspects in the ERA, can be presented and evaluated easily using abundance and occupancy ratios (AOR). We used an agent-based simulation system and an existing model for a widespread carabid beetle (Bembidion lampros), to evaluate the impact of a fictitious highly-toxic pesticide on population density and the distribution of beetles in time and space. Landscape structure and field margin management were evaluated by comparing scenario-based ERAs for the beetle. Source-sink dynamics led to an off-crop impact even when no pesticide was present off-crop. In addition, the impacts increased with multi-year application of the pesticide whereas the current ERA considers at most one year. These results further indicated a complex interaction between landscape structure and pesticide effect in time, both in-crop and off-crop, indicating the need for NTA ERA to be conducted at landscape- and multi-season temporal-scales. Use of AOR indices to compare ERA outputs facilitated easy comparison of scenarios, allowing simultaneous evaluation of impacts and planning of mitigation measures. The landscape and population ERA approach also demonstrates that there is a potential to change from regulation of a pesticide in isolation, towards the consideration of pesticide management at landscape scales and provision of biodiversity benefits via inclusion and testing of mitigation measures in authorisation procedures.
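The AOR indices used to compare scenarios can be illustrated with a small sketch. The function and the beetle numbers below are ours, invented for illustration, not outputs of the paper's agent-based model:

```python
# Illustrative abundance and occupancy ratios (AOR): a pesticide scenario is
# compared against a baseline scenario. Values below 1 indicate a reduction
# in population density (abundance) or in the fraction of the landscape
# occupied (occupancy). All numbers are hypothetical.

def aor(abundance_scenario, occupancy_scenario,
        abundance_baseline, occupancy_baseline):
    """Return (abundance ratio, occupancy ratio) of scenario vs. baseline."""
    return (abundance_scenario / abundance_baseline,
            occupancy_scenario / occupancy_baseline)

# Hypothetical beetle outputs: multi-year application vs. no-pesticide baseline
abundance_ratio, occupancy_ratio = aor(3200.0, 0.41, 5000.0, 0.62)
# abundance ratio = 3200 / 5000 = 0.64; both ratios below 1 signal an impact
```

Plotting the two ratios against each other for many scenarios gives the kind of compact scenario comparison the abstract describes.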


Environmental Toxicology and Chemistry | 2012

On the quantification of intertest variability in ecotoxicity data with application to species sensitivity distributions

Graeme L. Hickey; Peter S. Craig; Robert Luttik; Dick de Zwart

Ecotoxicological hazard assessment relies on species effect data to estimate quantities such as the predicted no-effect concentration. While there is a concerted effort to quantify uncertainty in risk assessments, the uncertainty due to intertest variability in species effect measurements is an overlooked component. The European Union Registration, Evaluation, Authorisation, and Restriction of Chemicals (REACH) guidance document suggests that multiple toxicity records for a given chemical-species combination should be aggregated by the geometric mean. Ignoring this issue or applying unjustified so-called harmonization methods weakens the defensibility of uncertainty quantification and interpretation about properties of ecological models, for example, the predicted no-effect concentration. In the present study, the authors propose a simple and broadly theoretically justifiable model to quantify intertest variability and analyze it using Bayesian methods. The value of data in ecotoxicity databases is maximized by using (interval-)censored data. An exploratory analysis is provided to support the model. The authors conclude, based on a large ecotoxicity database of acute effects to aquatic species, that the standard deviation of intertest variability is approximately a factor (or fold-difference) of 3. The consequences for decision makers of (not) adjusting for intertest variability are demonstrated.
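The geometric-mean aggregation of repeated toxicity records suggested by the REACH guidance can be sketched as follows. The records below are invented for illustration, not taken from the database analysed in the paper:

```python
import math
from collections import defaultdict

# Aggregate multiple toxicity records per chemical-species combination with a
# geometric mean, as the REACH guidance suggests. Records are hypothetical.

def aggregate_geometric_mean(records):
    """records: iterable of (chemical, species, toxicity_value) tuples.
    Returns one geometric-mean value per chemical-species combination."""
    groups = defaultdict(list)
    for chemical, species, value in records:
        groups[(chemical, species)].append(value)
    return {key: math.exp(sum(math.log(v) for v in values) / len(values))
            for key, values in groups.items()}

records = [
    ("chem_A", "Daphnia magna", 10.0),
    ("chem_A", "Daphnia magna", 1000.0),  # intertest variability: 100-fold spread
    ("chem_A", "Oncorhynchus mykiss", 50.0),
]
aggregated = aggregate_geometric_mean(records)
# geometric mean of 10 and 1000 is 100
```

The 100-fold spread between the two Daphnia records is exactly the kind of intertest variability whose magnitude (roughly a factor of 3 standard deviation) the paper quantifies.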


Ecotoxicology and Environmental Safety | 2009

On the application of loss functions in determining assessment factors for ecological risk.

Graeme L. Hickey; Peter S. Craig; Andy Hart

Assessment factors have been proposed as a means to extrapolate from data on the concentrations hazardous to a small sample of species to the concentration hazardous to p% of the species in a given community (HCp). Aldenberg and Jaworska [2000. Uncertainty of the hazardous concentration and fraction affected for normal species sensitivity distributions. Ecotoxicol. Environ. Saf. 46, 1-18] proposed estimators that prescribed universal assessment factors which made use of distributional assumptions associated with species sensitivity distributions. In this paper we maintain those assumptions but introduce loss functions which punish over- and under-estimation. Furthermore, the final loss function is parameterised such that conservatism can be asymmetrically and non-linearly controlled, which enables one to better represent the reality of risk assessment scenarios. We describe the loss functions and derive Bayes rules for each. We demonstrate the method by producing a table of universal factors that are independent of the substance being assessed and which can be combined with the toxicity data in order to estimate the HC5. Finally, through an example we illustrate the potential strength of the newly proposed estimators, which rationally account for the costs of under- and over-estimation, as opposed to arbitrarily choosing a one-sided lower confidence limit.
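The idea of choosing an estimator under an asymmetric loss can be sketched with a simple linear loss minimised over posterior samples. The cost values and the stand-in posterior draws below are hypothetical, and this is a generic asymmetric-loss illustration, not the specific loss functions derived in the paper:

```python
import statistics

# Asymmetric linear loss: overestimating the true log10 HC5 (under-protective)
# costs more than underestimating it (over-conservative). Illustrative only.

def expected_loss(estimate, samples, cost_over=3.0, cost_under=1.0):
    """Mean loss of a candidate estimate over posterior samples of the truth."""
    total = 0.0
    for theta in samples:
        diff = estimate - theta
        total += cost_over * diff if diff > 0 else -cost_under * diff
    return total / len(samples)

def bayes_estimate(samples, **costs):
    """Pick the candidate with the smallest expected loss (grid = samples)."""
    return min(sorted(samples), key=lambda a: expected_loss(a, samples, **costs))

samples = [i / 10.0 for i in range(100)]  # stand-in posterior draws
estimate = bayes_estimate(samples)
```

With a 3:1 cost ratio the minimiser sits near the 25% quantile of the samples, below the posterior median, which is the asymmetric conservatism the abstract contrasts with an arbitrary one-sided confidence limit.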


EFSA Journal | 2017

Guidance on the use of the weight of evidence approach in scientific assessments

Anthony Hardy; Diane Benford; Thorhallur Halldorsson; Michael Jeger; Helle Katrine Knutsen; Simon J. More; Hanspeter Naegeli; Hubert Noteborn; Colin Ockleford; Antonia Ricci; Guido Rychen; Josef Schlatter; Vittorio Silano; Roland Solecki; Dominique Turck; Emilio Benfenati; Qasim Chaudhry; Peter S. Craig; Geoff K Frampton; Matthias Greiner; Andrew Hart; Christer Hogstrand; Claude Lambré; Robert Luttik; David Makowski; Alfonso Siani; Helene Wahlstroem; Jaime Aguilera; J.L.C.M Dorne; Antonio Fernandez Dumont

Abstract EFSA requested the Scientific Committee to develop a guidance document on the use of the weight of evidence approach in scientific assessments for use in all areas under EFSA's remit. The guidance document addresses the use of weight of evidence approaches in scientific assessments using both qualitative and quantitative approaches. Several case studies covering the various areas under EFSA's remit are annexed to the guidance document to illustrate the applicability of the proposed approach. Weight of evidence assessment is defined in this guidance as a process in which evidence is integrated to determine the relative support for possible answers to a question. This document considers the weight of evidence assessment as comprising three basic steps: (1) assembling the evidence into lines of evidence of similar type, (2) weighing the evidence, and (3) integrating the evidence. The present document identifies reliability, relevance and consistency as three basic considerations for weighing evidence.

Collaboration


Dive into Peter S. Craig's collaborations.

Top co-authors:

Andy Hart, Food and Environment Research Agency
Robert Luttik, European Food Safety Authority
Anthony Hardy, European Food Safety Authority
Susanne Hougaard Bennekou, United States Environmental Protection Agency