Jay Bartroff
University of Southern California
Publications
Featured research published by Jay Bartroff.
Journal of Pharmacokinetics and Pharmacodynamics | 2013
Tatiana V. Tatarinova; Michael Neely; Jay Bartroff; Michael Van Guilder; Walter M. Yamada; David S. Bayard; Roger W. Jelliffe; Robert Leary; Alyona Chubatiuk; Alan Schumitzky
Population pharmacokinetic (PK) modeling methods can be statistically classified as either parametric or nonparametric (NP). Each classification can be divided into maximum likelihood (ML) or Bayesian (B) approaches. In this paper we discuss the nonparametric case using both maximum likelihood and Bayesian approaches. We present two nonparametric methods for estimating the unknown joint population distribution of model parameter values in a pharmacokinetic/pharmacodynamic (PK/PD) dataset. The first method is the NP Adaptive Grid (NPAG). The second is the NP Bayesian (NPB) algorithm with a stick-breaking process to construct a Dirichlet prior. Our objective is to compare the performance of these two methods using a simulated PK/PD dataset. Our results showed excellent performance of NPAG and NPB in a realistically simulated PK study. This simulation allowed us to have benchmarks in the form of the true population parameters to compare with the estimates produced by the two methods, while incorporating challenges like unbalanced sample times and sample numbers as well as the ability to include the covariate of patient weight. We conclude that both NPML and NPB can be used in realistic PK/PD population analysis problems. The advantages of one versus the other are discussed in the paper. NPAG and NPB are implemented in R and freely available for download within the Pmetrics package from www.lapk.org.
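The stick-breaking construction used by the NPB algorithm to build its Dirichlet prior can be sketched generically as follows. This is a minimal illustration of the standard stick-breaking (GEM) weight sampler, not the Pmetrics implementation; the function name and parameters are chosen here for illustration.

```python
import numpy as np

def stick_breaking(alpha, num_sticks, rng=None):
    """Sample mixture weights via stick-breaking: repeatedly break off
    a Beta(1, alpha) fraction of the remaining unit stick.  Larger
    `alpha` spreads mass over more components."""
    rng = np.random.default_rng(rng)
    betas = rng.beta(1.0, alpha, size=num_sticks)  # break proportions
    # mass remaining before each break: 1, (1-b1), (1-b1)(1-b2), ...
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining  # weights; their sum approaches 1

w = stick_breaking(alpha=2.0, num_sticks=50, rng=0)
print(w[:5], w.sum())
```

Truncating at a finite number of sticks, as above, is a common practical device; the leftover mass shrinks geometrically with the number of sticks.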
Statistics in Medicine | 2008
Jay Bartroff; Tze Leung Lai
Adaptive designs have been proposed for clinical trials in which the nuisance parameters or alternative of interest are unknown or likely to be misspecified before the trial. Although most previous work on adaptive designs and mid-course sample size re-estimation has focused on two-stage or group-sequential designs in the normal case, we consider here a new approach that involves at most three stages and is developed in the general framework of multiparameter exponential families. This approach not only maintains the prescribed type I error probability but also provides a simple but asymptotically efficient sequential test whose finite-sample performance, measured in terms of the expected sample size and power functions, is shown to be comparable to the optimal sequential design, determined by dynamic programming, in the simplified normal mean case with known variance and prespecified alternative, and superior to the existing two-stage designs and also to adaptive group-sequential designs when the alternative or nuisance parameters are unknown or misspecified.
Archive | 2013
Jay Bartroff; Tze Leung Lai; Mei-Chiung Shih
Features:
- Interdisciplinary approach that statistics researchers and advanced students will use, in addition to statisticians working in medical fields
- Includes recent work that provides a new class of adaptive designs which are both flexible and efficient, a new development for the adaptive design literature
- Uses a unique approach to early phase I and II trials, both information- and time-sequential phase III trials, the analysis following a clinical trial, and the interactions between each of the stages
Communications in Statistics-theory and Methods | 2010
Jay Bartroff; Tze Leung Lai
Conventional multiple hypothesis tests use step-up, step-down, or closed testing methods to control the overall error rates. We will discuss marrying these methods with adaptive multistage sampling rules and stopping rules to perform efficient multiple hypothesis testing in sequential experimental designs. The result is a multistage step-down procedure that adaptively tests multiple hypotheses while preserving the family-wise error rate and extends Holm's (1979) step-down procedure to the sequential setting, yielding substantial savings in sample size with small loss in power.
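The fixed-sample building block that the multistage procedure extends is Holm's (1979) step-down test, which can be sketched in a few lines. This is the classic single-stage procedure, not the sequential extension developed in the paper:

```python
def holm_stepdown(pvalues, alpha=0.05):
    """Holm (1979) step-down multiple test: sort p-values, compare the
    k-th smallest against alpha/(m - k + 1), and stop at the first
    failure.  Controls the family-wise error rate at level alpha.
    Returns the set of indices of rejected hypotheses."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    rejected = set()
    for step, i in enumerate(order):
        if pvalues[i] <= alpha / (m - step):  # stepwise Holm threshold
            rejected.add(i)
        else:
            break  # once one hypothesis survives, all later ones do too
    return rejected

print(holm_stepdown([0.001, 0.02, 0.04, 0.2]))  # → {0}
```

In the example, 0.001 ≤ 0.05/4 so the first hypothesis is rejected, but 0.02 > 0.05/3, so the procedure stops there; the step-down structure makes Holm uniformly more powerful than the plain Bonferroni correction.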
Sequential Analysis | 2008
Jay Bartroff; Tze Leung Lai
Abstract A new approach to adaptive design of clinical trials is proposed in a general multiparameter exponential family setting, based on generalized likelihood ratio statistics and optimal sequential testing theory. These designs are easy to implement, maintain the prescribed Type I error probability, and are asymptotically efficient. Practical issues involved in clinical trials allowing mid-course adaptation and the large literature on this subject are discussed, and comparisons between the proposed and existing designs are presented in extensive simulation studies of their finite-sample performance, measured in terms of the expected sample size and power functions.
Pharmacological Research | 2011
Roger W. Jelliffe; Michael Neely; Alan Schumitzky; David S. Bayard; Michael Van Guilder; Andreas Botnen; Aida Bustad; Derek Laing; Walter M. Yamada; Jay Bartroff; Tatiana V. Tatarinova
We read with great interest the article by Premaud et al. in your journal [1]. The nonparametric (NP) population modeling approach estimated the model parameter values and predicted the observed mycophenolic acid concentrations more precisely than the parametric method did. We expect this because the NP method makes, as the authors say, no assumptions about the shape of the model parameter distributions, as parametric methods do. We would like to respectfully offer the comments below in the hope that they will be well taken by a group we respect very highly.
Advances in Applied Probability | 2010
Jay Bartroff; Larry B. Goldstein; Yosef Rinott; Ester Samuel-Cahn
We study a class of optimal allocation problems, including the well-known bomber problem, with the following common probabilistic structure. An aircraft equipped with an amount x of ammunition is intercepted by enemy airplanes arriving according to a homogeneous Poisson process over a fixed time duration t. Upon encountering an enemy, the aircraft has the choice of spending any amount 0 ≤ y ≤ x of its ammunition, resulting in the aircraft's survival with probability equal to some known increasing function of y. Two different goals have been considered in the literature concerning the optimal amount K(x, t) of ammunition spent: (i) maximizing the probability of surviving for time t, which is the so-called bomber problem; and (ii) maximizing the number of enemy airplanes shot down during time t, which we call the fighter problem. Several authors have attempted to settle the following conjectures about the monotonicity of K(x, t): (A) K(x, t) is decreasing in t; (B) K(x, t) is increasing in x; and (C) the amount x - K(x, t) held back is increasing in x. Conjectures (A) and (C) have been shown for the bomber problem with discrete ammunition, while (B) is still an open question. In this paper we consider both time and ammunition to be continuous, and, for the bomber problem, we prove (A) and (C), while, for the fighter problem, we prove (A) and (C) for one special case and (B) and (C) for another. These proofs involve showing that the optimal survival probability and optimal number shot down are totally positive of order 2 (TP2) in the bomber and fighter problems, respectively. The TP2 property is shown by constructing convergent sequences of approximating functions through an iterative operation which preserves TP2 and other properties.
Bernoulli | 2018
Jay Bartroff; Larry B. Goldstein; Ümit Işlak
Threshold-type counts based on multivariate occupancy models with log concave marginals admit bounded size biased couplings under weak conditions, leading to new concentration of measure results for random graphs, germ-grain models in stochastic geometry, multinomial allocation models and multivariate hypergeometric sampling. The work generalizes and improves upon previous results in a number of directions.
Annals of Statistics | 2007
Jay Bartroff
A family of variable stage size multistage tests of simple hypotheses is described, based on efficient multistage sampling procedures. Using a loss function that is a linear combination of sampling costs and error probabilities, these tests are shown to minimize the integrated risk to second order as the costs per stage and per observation approach zero. A numerical study shows significant improvement over group sequential tests in a binomial testing problem.
Advances in Applied Probability | 2011
Jay Bartroff; Ester Samuel-Cahn
In this paper we study the fighter problem with discrete ammunition. An aircraft (fighter) equipped with n anti-aircraft missiles is intercepted by enemy airplanes, the appearance of which follows a homogeneous Poisson process with known intensity. If j of the n missiles are spent at an encounter, they destroy an enemy plane with probability a(j), where a(0) = 0 and {a(j)} is a known, strictly increasing concave sequence, e.g. a(j) = 1 - q^j, 0 < q < 1. If the enemy is not destroyed, the enemy shoots the fighter down with known probability 1 - u, where 0 ≤ u ≤ 1. The goal of the fighter is to shoot down as many enemy airplanes as possible during a given time period [0, T]. Let K(n, t) be the smallest optimal number of missiles to be used at a present encounter, when the fighter has flying time t remaining and n missiles remaining. Three seemingly obvious properties of K(n, t) have been conjectured: (A) the closer to the destination, the more of the n missiles one should use; (B) the more missiles one has, the more one should use; and (C) the more missiles one has, the more one should save for possible future encounters. We show that (C) holds for all 0 ≤ u ≤ 1, that (A) and (B) hold for the 'invincible fighter' (u = 1), and that (A) holds but (B) fails for the 'frail fighter' (u = 0); the latter is shown through a surprising counterexample, which is also valid for small u > 0 values.
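The value function behind K(n, t) can be explored numerically. The sketch below is a rough discrete-time approximation (encounter probability ≈ λ·dt per step) of the fighter problem's backward induction with a(j) = 1 - q^j; the function name, parameters, and discretization are illustrative and not taken from the paper.

```python
def fighter_K(n_max, T, lam, q, u, dt=0.05):
    """Approximate the fighter value function V[n] (expected planes shot
    down with n missiles and time-to-go T) and the optimal spend K[n] at
    an encounter, via backward induction on small time steps of size dt.
    a(j) = 1 - q**j is the kill probability when spending j missiles;
    with probability 1 - u the fighter is shot down after a failed kill."""
    steps = int(T / dt)
    V = [0.0] * (n_max + 1)  # terminal condition: no time left
    K = [0] * (n_max + 1)
    p = lam * dt             # encounter probability per time step
    for _ in range(steps):
        new_V, new_K = [0.0], [0]
        for n in range(1, n_max + 1):
            best, best_j = -1.0, 0
            for j in range(1, n + 1):
                a = 1 - q ** j
                # kill: score 1 and continue; miss: survive with prob u
                val = a * (1 + V[n - j]) + (1 - a) * u * V[n - j]
                if val > best:
                    best, best_j = val, j
            new_V.append((1 - p) * V[n] + p * best)
            new_K.append(best_j)
        V, K = new_V, new_K
    return V, K

V, K = fighter_K(n_max=5, T=1.0, lam=1.0, q=0.5, u=1.0)
print(K)  # optimal spend as a function of missiles remaining
```

Running the sketch with u = 1 versus u = 0 is a quick way to see why conjecture (B) is plausible for the invincible fighter yet can fail for the frail one.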