John J. Peterson
GlaxoSmithKline
Publications
Featured research published by John J. Peterson.
Journal of Quality Technology | 2004
John J. Peterson
In this paper we present an approach to multiple response surface optimization that not only provides optimal operating conditions, but also measures the reliability of an acceptable quality result for any set of operating conditions. The most widely used multiple response optimization approaches, “overlapping mean responses” and the desirability function, take into account neither the variance-covariance structure of the data nor the model parameter uncertainty. Some of the quadratic loss function approaches take into account the variance-covariance structure of the predicted means, but they do not take into account the model parameter uncertainty associated with the variance-covariance matrix of the error terms. For the optimal conditions obtained by these approaches, the probability that they provide a good multivariate response, as measured by that optimization criterion, can be unacceptably low. Furthermore, it is shown that ignoring the model parameter uncertainty can lead to reliability estimates that are too large. The proposed approach can be used with most of the current multiresponse optimization procedures to assess the reliability of a good future response. This approach takes into account the correlation structure of the data, the variability of the process distribution, and the model parameter uncertainty. The utility of this method is illustrated with two examples.
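The core idea of the abstract above — averaging over posterior draws of both the regression coefficients and the error covariance to estimate the probability that a future multivariate response meets specifications — can be sketched in a few lines. This is a minimal illustration, not the paper's actual algorithm; the function name, the toy posterior draws, and the 2-response model are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def reliability(x, beta_draws, sigma_draws, lower, upper):
    """Monte Carlo estimate of P(a future multivariate response falls inside
    the specification box [lower, upper]) at operating condition x.
    Each posterior draw supplies a coefficient matrix B and an error
    covariance Sigma, so uncertainty in both is averaged over."""
    hits = 0
    for B, Sigma in zip(beta_draws, sigma_draws):
        mean = B @ x                              # predicted mean response vector
        y = rng.multivariate_normal(mean, Sigma)  # one simulated future response
        hits += np.all((y >= lower) & (y <= upper))
    return hits / len(beta_draws)

# Toy "posterior": 500 identical draws of a 2-response model (illustration only).
beta_draws = [np.eye(2)] * 500
sigma_draws = [np.diag([0.0001, 0.0001])] * 500
p = reliability(np.array([0.5, 0.5]), beta_draws, sigma_draws,
                lower=np.array([0.0, 0.0]), upper=np.array([1.0, 1.0]))
```

In a real application the draws would come from an MCMC sample of the fitted multiresponse model rather than being fixed, which is exactly where the parameter uncertainty the paper emphasizes enters the estimate.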
Journal of Biopharmaceutical Statistics | 2008
John J. Peterson
Manufacturers of pharmaceuticals and biopharmaceuticals are facing increased regulatory pressure to understand how their manufacturing processes work and to be able to quantify the reliability and robustness of their manufacturing processes. In particular, the ICH Q8 guidance has introduced the concept of design space. The ICH Q8 defines design space as “the multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality.” However, relatively little has been put forth to date on how to construct a design space from data composed of such variables. This study presents a Bayesian approach to design space based upon a type of credible region first appearing in Peterson's work. This study considers the issues of constructing a Bayesian design space, design space reliability, the inclusion of process noise variables, and utilization of prior information, as well as an outline for organizing information about a design space so that manufacturing engineers can make informed changes as may be needed within the design space.
Journal of Applied Statistics | 2004
Guillermo Miró-Quesada; Enrique Castillo; John J. Peterson
An approach for the multiple response robust parameter design problem, based on a methodology by Peterson (2000), is presented. The approach is Bayesian and consists of maximizing the posterior predictive probability that the process satisfies a set of constraints on the responses. In order to find a solution that is robust to variation in the noise variables, the predictive density is integrated not only with respect to the response variables but also with respect to the assumed distribution of the noise variables. The maximization problem involves repeated Monte Carlo integrations, and two different methods for solving it are evaluated. Matlab code was written that rapidly finds an optimal (robust) solution when one exists. Two examples taken from the literature are used to illustrate the proposed method.
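The optimization described above — choosing the control setting that maximizes a posterior predictive probability integrated over the noise-variable distribution — can be illustrated with a deliberately simple stand-in model. Everything below (the quadratic response, the noise-variable effect, the error scale, and the grid search) is an assumption made for the sketch, not the fitted model from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def robust_prob(x, n_draws=2000):
    """Estimate P(y >= 8) at control setting x, integrating over both the
    noise variable z ~ N(0, 1) and the predictive error.  The quadratic
    model below is a made-up stand-in for a fitted response model."""
    z = rng.normal(0.0, 1.0, n_draws)    # noise-variable draws
    eps = rng.normal(0.0, 0.5, n_draws)  # predictive-error draws
    y = 10 - (x - 1.0) ** 2 + 0.5 * x * z + eps
    return np.mean(y >= 8.0)

# Crude grid search in place of the paper's optimization methods.
grid = np.linspace(-2.0, 3.0, 51)
best = max(grid, key=robust_prob)
```

Note that the noise variable's influence grows with x here, so the probability-maximizing setting trades off a high mean response against sensitivity to z — the essence of the robust parameter design problem.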
Quality Technology and Quantitative Management | 2009
John J. Peterson; Guillermo Miró-Quesada; Enrique Castillo
Abstract This paper presents a Bayesian predictive approach to multiresponse optimization experiments. It generalizes the work of Peterson [33] in two ways that make it more flexible for use in applications. First, a multivariate posterior predictive distribution of seemingly unrelated regression models is used to determine optimum factor levels by assessing the reliability of a desired multivariate response. It is shown that it is possible for optimal mean response surfaces to appear satisfactory yet be associated with unsatisfactory overall process reliabilities. Second, the use of a multivariate normal distribution for the vector of regression error terms is generalized to that of the (heavier tailed) multivariate t-distribution. This provides a Bayesian sensitivity analysis with regard to moderate outliers. The effect of adding design points is also considered through a preposterior analysis. The advantages of this approach are illustrated with two real examples.
Journal of Receptors and Signal Transduction | 2007
John J. Peterson; Steven Novick
Human diseases may involve cellular signaling networks that contain redundant pathways, so that blocking a single pathway in the system cannot achieve the desired effect. As such, the use of drugs in combination is a particularly effective intervention in networked systems. However, common synergy measures are often inadequate to quantify the effect of two different drugs in complex cellular systems. This article proposes a general approach to quantifying the synergy of two drugs in combination. This approach is called strong nonlinear blending. Drugs with different relative potencies, different effect maxima, or situations of potentiation or coalism pose no problem for strong nonlinear blending as a way to assess the increased response benefit to be gained by combining two drugs. This is important because testing drug combinations in complex biological systems is likely to produce a wide variety of possible response surfaces. It is also shown that, for monotone increasing (or decreasing) dose-response surfaces, strong nonlinear blending is equivalent to improved potency along a ray of constant dose ratio. This is important because fixed dose ratios form the basis for many preclinical and clinical combination drug experiments. Two examples are given involving HIV and cancer chemotherapy combination drug experiments.
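The fixed-ratio-ray idea in the abstract above can be made concrete with toy dose-response curves: along a ray of constant dose ratio, compare the blended response with each single agent at the same total dose. The Hill curves and the combination surface below are invented for illustration — they are not the paper's models, and a real analysis would fit the surface to data.

```python
def hill(d, emax, ed50):
    """Simple Hill response curve with slope 1 (illustrative only)."""
    return emax * d / (ed50 + d)

def combo_response(d_total, frac_a):
    """Toy combination surface: additive responses of two agents with
    different maxima and potencies, as a stand-in for a fitted surface."""
    da, db = frac_a * d_total, (1.0 - frac_a) * d_total
    return hill(da, emax=1.0, ed50=1.0) + hill(db, emax=0.6, ed50=2.0)

# Along the 1:1 ray, compare the 50/50 blend with each agent alone at the
# same total dose; a higher blended response is the kind of combination
# benefit strong nonlinear blending is designed to detect.
d = 4.0
blend = combo_response(d, 0.5)
single_a = combo_response(d, 1.0)
single_b = combo_response(d, 0.0)
```

Because the two toy agents have different maxima and potencies, the blend at total dose 4 exceeds either single agent alone — the monotone-surface setting in which the paper shows strong nonlinear blending reduces to improved potency along the ray.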
International Immunopharmacology | 2002
Colleen M. Davenport; Holly Ann McAdams; Jen Kou; Kirsten Mascioli; Christopher Eichman; Laura Healy; John J. Peterson; Sreekant Murphy; Domenico Coppola; Alemseged Truneh
Transfer of CD45RBhi CD4 + naïve T cells into severe combined immunodeficient (SCID) mice induces colitis and skin lesions. Recipients treated with cyclosporin A (CsA), CTLA4-Ig, or vehicle were evaluated for weight loss, skin lesions, and cutaneous blood flow. Necropsy, histological, hematological and cytokine analyses were performed at the conclusion of the experiment to confirm the clinical findings. Vehicle-treated mice lost weight and had a 100% incidence of skin lesions by day 46. CsA-treated mice also lost weight, but only 3/8 mice developed mild, clinically evident skin lesions. In contrast, all CTLA4-Ig-treated mice gained weight and did not develop skin lesions. Increase in cutaneous blood flow correlated with the development of skin lesions. Granulocyte numbers, which were high or moderately high in the vehicle- or CsA-treated mice, respectively, remained as low in the CTLA4-Ig-treated group as in untreated mice. IFN-gamma, IL-1beta, and TNF-alpha levels in the gut and skin correlated with the extent of inflammation in both organs. Histology revealed that CTLA4-Ig but not CsA effectively prevented both autoimmune disorders. The ability of CTLA4-Ig to prevent both colitis and skin lesions suggests that CD28-dependent co-stimulation of T cells is critical for generation of pro-inflammatory cytokines and induction of clinical disease in such autoimmune disorders.
Statistics in Biopharmaceutical Research | 2010
John J. Peterson; Kevin Lief
The ICH Q8 defines “design space” (DS) as “The multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality.” Unfortunately, some pharmaceutical scientists appear to misinterpret the definition of DS as a process monitoring strategy. A more subtle and possibly more misleading issue, however, is the application of standard response surface methodology software in an attempt to construct a DS. The methodology of “overlapping mean responses” (OMR), available in many point-and-click oriented statistical packages, provides a tempting opportunity to use this methodology to create a DS. Furthermore, a few recent (and two possibly very influential) papers have been published that appear to propose the use of OMR as a way to construct a DS. However, such a DS may harbor operating conditions with a low probability of meeting process specifications. In this article we compare the OMR approach with a Bayesian predictive approach to DS, and show that the OMR approach produces design spaces that are too large and may contain conditions with a low probability of meeting process specifications. In some cases, even the best operating conditions do not have a high probability of meeting all process specifications.
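The OMR pitfall described above is easy to demonstrate numerically: a condition whose predicted *means* all sit inside the specs can still give a low probability that a *future response* meets every spec jointly. The means, specs, and covariance below are made up for the illustration, and the OMR check is simplified to lower specs only.

```python
import numpy as np

rng = np.random.default_rng(2)

# Suppose at some operating condition both predicted mean responses sit
# just inside their (one-sided) specs — numbers invented for illustration.
mean = np.array([1.1, 1.1])
specs_lower = np.array([1.0, 1.0])
cov = np.diag([0.04, 0.04])  # predictive sd of 0.2 for each response

# OMR-style check: are both predicted *means* inside the spec region?
omr_ok = bool(np.all(mean >= specs_lower))

# Bayesian-predictive-style check: probability that one *future* response
# meets both specs simultaneously.
draws = rng.multivariate_normal(mean, cov, 20000)
prob = np.mean(np.all(draws >= specs_lower, axis=1))
```

With each margin only half a standard deviation above its spec, the joint probability lands near 0.48 even though OMR accepts the condition — a small-scale version of the paper's point that OMR design spaces can be too large.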
Journal of Quality Technology | 2005
Ramkumar Rajagopal; Enrique Castillo; John J. Peterson
The uncertainty of the model form is typically neglected in process-optimization studies. In addition, not taking into account the existence of noise factors and nonnormal errors may invalidate the conclusions of such studies. In this paper, a Bayesian approach to model-robust process optimization in the presence of noise factors and nonnormal error terms is presented. Traditionally, in process optimization, methods such as the dual response approach are used in the presence of noise factors, and methods such as robust regression are used when the error terms are not normally distributed. Instead, this paper extends the recently proposed idea of model form-robustness using a Bayesian predictive approach to cases where there is uncertainty due to noise factors and due to the distributional assumptions of the errors. Two examples taken from the literature, one based on a factorial experiment and another based on a mixture experiment, are used to illustrate the proposed approach.
Statistics in Biopharmaceutical Research | 2009
John J. Peterson; Mohammad Yahyah
The ICH Q2 (R1) Guidance on Validation of Analytical Procedures states that a robustness assessment for an analytical method should provide “an indication of its reliability during normal usage.” The concept of “design space” as specified in the ICH Q8 Guidance may be used to create a zone of reliable robustness for an analytical method or pharmaceutical process. A Bayesian approach to design space as outlined by Peterson (2004) accounts for model parameter uncertainty, correlation among the quality responses at each fixed operating condition, and method response multiplicity. Two examples are provided to illustrate the application of a Bayesian design space to assessing reliability/robustness. One example is about assessing the ability of an HPLC analytical method to meet system suitability criteria and the other deals with a crystallization process for an active pharmaceutical ingredient.
Dissolution Technologies | 2016
Dave LeBlond; Stan Altan; Steven Novick; John J. Peterson; Yan Shen; Harry Yang
Many pharmacologically active molecules are formulated as solid dosage form drug products. Following oral administration, the diffusion of an active molecule from the gastrointestinal tract into systemic distribution requires the disintegration of the dosage form followed by the dissolution of the molecule in the stomach lumen. Its dissolution properties may have a direct impact on its bioavailability and subsequent therapeutic effect. Consequently, dissolution (or in vitro release) testing has been the subject of intense scientific and regulatory interest over the past several decades. Much interest has focused on models describing in vitro release profiles over a time scale, and a number of methods have been proposed for testing similarity of profiles. In this article, we review previously published work on dissolution profile similarity testing and provide a detailed critique of current methods in order to set the stage for a Bayesian approach.
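Among the profile-similarity methods the article reviews, one widely used frequentist point metric is the f2 similarity factor from FDA dissolution guidance; computing it shows the kind of summary the Bayesian approach is meant to improve upon. The profiles below are invented, and this point estimate ignores the measurement uncertainty that motivates the article.

```python
import numpy as np

def f2(reference, test):
    """f2 similarity factor for two dissolution profiles measured at the
    same time points (percent dissolved).  f2 >= 50 is the conventional
    similarity criterion; this is a point estimate only, not the Bayesian
    treatment the article builds toward."""
    reference = np.asarray(reference, dtype=float)
    test = np.asarray(test, dtype=float)
    msd = np.mean((reference - test) ** 2)  # mean squared difference
    return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

# Illustrative profiles at four common time points.
ref_profile = [20.0, 45.0, 70.0, 90.0]
test_profile = [18.0, 42.0, 68.0, 89.0]
score = f2(ref_profile, test_profile)
```

Identical profiles score exactly 100, and the small differences above still score well over the conventional cutoff of 50, so these two toy profiles would be declared similar.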