Featured Research

Methodology

Currents and K-functions for Fiber Point Processes

Analysis of images of fiber sets, such as myelin sheaths or skeletal muscles, must account for both the spatial distribution of the fibers and differences in fiber shape. This necessitates a combination of point process and shape analysis methodology. In this paper, we develop a K-function for shape-valued point processes by embedding shapes as currents, thus equipping the point process domain with a metric structure inherited from a reproducing kernel Hilbert space. We extend Ripley's K-function, which measures deviations from spatial homogeneity of point processes, to fiber data. The paper provides a theoretical account of the statistical foundation of the K-function and its extension to fiber data, and we test the developed K-function on simulated as well as real data sets. These include a fiber data set consisting of myelin sheaths, visualizing the spatial and shape behavior of myelin configurations at different depths.
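
The fiber K-function builds on the classical Ripley's K. As a point of reference, a minimal pure-Python sketch of the classical estimator for 2-D point patterns might look as follows; edge corrections are omitted for brevity (real analyses require them), and the function name and unit-square window are illustrative, not the paper's implementation.

```python
import math
import random

def ripley_k(points, r, area):
    """Classical Ripley's K at radius r for a 2-D point pattern observed in a
    window of the given area. Edge corrections are omitted, which biases the
    estimate slightly downward for points near the boundary."""
    n = len(points)
    intensity = n / area
    close_pairs = 0
    for i in range(n):
        for j in range(n):
            if i != j and math.dist(points[i], points[j]) <= r:
                close_pairs += 1
    return close_pairs / (intensity * n)

# Under complete spatial randomness, K(r) is approximately pi * r**2.
random.seed(0)
pts = [(random.random(), random.random()) for _ in range(500)]
k_hat = ripley_k(pts, 0.1, 1.0)
```

Values of K(r) above pi * r**2 suggest clustering at scale r, values below suggest regularity; the paper's extension replaces Euclidean inter-point distances with current-induced distances between fiber shapes.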

Methodology

D-STEM v2: A Software for Modelling Functional Spatio-Temporal Data

Functional spatio-temporal data naturally arise in many environmental and climate applications where data are collected in a three-dimensional space over time. The MATLAB D-STEM v1 software package was first introduced for modelling multivariate space-time data and has been recently extended to D-STEM v2 to handle functional data indexed across space and over time. This paper introduces the new modelling capabilities of D-STEM v2 as well as the complexity reduction techniques required when dealing with large data sets. Model estimation, validation and dynamic kriging are demonstrated in two case studies, one related to ground-level air quality data in Beijing, China, and the other one related to atmospheric profile data collected globally through radio sounding.

Methodology

Decomposing spectral and phasic differences in non-linear features between datasets

When employing non-linear methods to characterise complex systems, it is important to determine to what extent they are capturing genuine non-linear phenomena that could not be assessed by simpler spectral methods. Specifically, we are concerned with the problem of quantifying spectral and phasic effects on an observed difference in a non-linear feature between two systems (or two states of the same system). Here we derive, from a sequence of null models, a decomposition of the difference in an observable into spectral, phasic, and spectrum-phase interaction components. Our approach makes no assumptions about the structure of the data and adds nuance to a wide range of time series analyses.

Methodology

Deep Historical Borrowing Framework to Prospectively and Simultaneously Synthesize Control Information in Confirmatory Clinical Trials with Multiple Endpoints

In current clinical trial development, historical information is receiving growing attention for the value it provides beyond sample size calculation. Meta-analytic-predictive (MAP) priors and robust MAP priors have been proposed for prospectively borrowing historical data on a single endpoint. To simultaneously synthesize control information from multiple endpoints in confirmatory clinical trials, we propose to approximate posterior probabilities from a Bayesian hierarchical model and to estimate critical values by deep learning, so that pre-specified decision functions can be constructed before the trial is conducted. Simulation studies and a case study demonstrate that our method preserves power and performs satisfactorily under prior-data conflict.
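
The general idea of calibrating a pre-specified decision rule before the trial can be sketched by simulation. This toy version uses plain Monte Carlo rather than deep learning, assumes two independent Gaussian endpoints with a rule that requires winning on both, and calibrates the shared critical value to control the null rejection rate; all names and numbers are illustrative, not the paper's method.

```python
import random
import statistics

def endpoint_stat(rng, n):
    # Standardized mean difference for one endpoint under the null
    # (no treatment effect); behaves like a standard normal draw.
    ctrl = [rng.gauss(0.0, 1.0) for _ in range(n)]
    trt = [rng.gauss(0.0, 1.0) for _ in range(n)]
    se = (2.0 / n) ** 0.5
    return (statistics.mean(trt) - statistics.mean(ctrl)) / se

def calibrate_critical_value(alpha=0.05, n=100, sims=2000, seed=1):
    """Pick c so that rejecting only when BOTH endpoint statistics exceed c
    keeps the null rejection rate near alpha. Endpoints are independent here;
    a real calibration would simulate their joint (hierarchical) model."""
    rng = random.Random(seed)
    mins = sorted(min(endpoint_stat(rng, n), endpoint_stat(rng, n))
                  for _ in range(sims))
    return mins[int((1.0 - alpha) * sims)]

# For independent endpoints, the true c solves (1 - Phi(c))**2 = alpha,
# giving roughly c = 0.76 at alpha = 0.05; Monte Carlo should land nearby.
c = calibrate_critical_value()
```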

Methodology

Defining and Estimating Subgroup Mediation Effects with Semi-Competing Risks Data

In many medical studies, an ultimate failure event such as death is likely to be affected by the occurrence and timing of other intermediate clinical events. Both event times are subject to censoring by loss to follow-up; moreover, the nonterminal event may be censored by the occurrence of the terminal event, but not vice versa. To study the effect of an intervention on both events, the intermediate event may be viewed as a mediator, but the conventional definition of direct and indirect effects is not applicable because of the semi-competing risks data structure. We define three principal strata based on whether the potential intermediate event occurs before the potential failure event; these strata allow a proper definition of direct and indirect effects in one stratum, whereas total effects are defined for all strata. We discuss identification conditions for the stratum-specific effects and propose a semiparametric estimator based on a multivariate logistic model for stratum membership and within-stratum proportional hazards models for the event times. Treating the unobserved stratum membership as a latent variable, we propose an EM algorithm for computation. We study the asymptotic properties of the estimators using modern empirical process theory and examine their performance in numerical studies.
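
The EM idea of treating unobserved stratum membership as latent data can be illustrated on a much simpler model: a two-component exponential mixture with unobserved component labels. This is a didactic stand-in, not the paper's semiparametric estimator, and every name and number below is illustrative.

```python
import math
import random

def em_exponential_mixture(times, iters=200):
    """EM for a two-stratum mixture of exponentials with latent membership."""
    p, r1, r2 = 0.5, 1.0, 0.3   # initial mixing probability and hazard rates
    for _ in range(iters):
        # E-step: posterior probability that each observation is in stratum 1.
        w = []
        for t in times:
            d1 = p * r1 * math.exp(-r1 * t)
            d2 = (1.0 - p) * r2 * math.exp(-r2 * t)
            w.append(d1 / (d1 + d2))
        # M-step: weighted maximum likelihood updates.
        p = sum(w) / len(w)
        r1 = sum(w) / sum(wi * t for wi, t in zip(w, times))
        r2 = sum(1.0 - wi for wi in w) / sum((1.0 - wi) * t for wi, t in zip(w, times))
    return p, r1, r2

# Hypothetical demo: a 60/40 mixture of fast (rate 2.0) and slow (rate 0.2) events.
rng = random.Random(3)
times = [rng.expovariate(2.0) if rng.random() < 0.6 else rng.expovariate(0.2)
         for _ in range(2000)]
p_hat, r1_hat, r2_hat = em_exponential_mixture(times)
```

The paper's algorithm has the same E/M structure, but the E-step weights come from the logistic stratum-membership model and the M-step fits within-stratum proportional hazards models.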

Methodology

Density estimation and modeling on symmetric spaces

In many applications, data and/or parameters are supported on non-Euclidean manifolds. It is important to take the geometric structure of the manifold into account in statistical analysis to avoid misleading results. Although there has been considerable focus on simple and specific manifolds, there is a lack of general, easy-to-implement statistical methods for density estimation and modeling on manifolds. In this article, we consider a very broad class of manifolds: non-compact Riemannian symmetric spaces. For this class, we provide a general mathematical result for easily calculating the volume changes of the exponential and logarithm maps between the tangent space and the manifold. This allows one to define statistical models on the tangent space, push these models forward onto the manifold, and easily calculate the induced distributions via Jacobians. To illustrate the statistical utility of this theoretical result, we provide a general method to construct distributions on symmetric spaces. In particular, we define the log-Gaussian distribution as an analogue of the multivariate Gaussian distribution in Euclidean space. With these new kernels on symmetric spaces, we also consider the problem of density estimation. Our proposed approach can take any existing density estimation method designed for Euclidean spaces and push it forward to the manifold with an easy-to-calculate adjustment. We provide theorems showing that the induced density estimators on the manifold inherit the statistical optimality properties of the parent Euclidean density estimator; this holds for both frequentist and Bayesian nonparametric methods. We illustrate the theory and practical utility of the proposed approach on the space of positive definite matrices.
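
The pushforward construction is easiest to see in one dimension, where the positive reals play the role of the manifold, exp and log are the exponential and logarithm maps, and the Jacobian adjustment 1/x turns a Gaussian on the tangent space into the familiar log-normal density. A minimal sketch (an illustration of the Jacobian mechanism, not the paper's symmetric-space machinery):

```python
import math

def gaussian_pdf(z, mu=0.0, sigma=1.0):
    """Gaussian density on the tangent space (here, the real line)."""
    return math.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def pushforward_pdf(x, mu=0.0, sigma=1.0):
    """Density on the manifold (here, the positive reals) induced by the
    Gaussian on the tangent space via the exponential map. The factor 1/x
    is the Jacobian of the logarithm map, and the result is the log-normal
    density: a 1-D analogue of the paper's log-Gaussian distribution."""
    return gaussian_pdf(math.log(x), mu, sigma) / x
```

On matrix manifolds such as the positive definite matrices, log and exp become the matrix logarithm and exponential, and the scalar Jacobian 1/x is replaced by the volume-change formula the paper derives.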

Methodology

Design and Analysis of Switchback Experiments

In switchback experiments, a firm sequentially exposes an experimental unit to a random treatment, measures its response, and repeats the procedure for several periods to determine which treatment leads to the best outcome. Although practitioners have widely adopted this experimental design technique, the development of its theoretical properties and the derivation of optimal design procedures have been, to the best of our knowledge, elusive. In this paper, we address these limitations by establishing the necessary results to ensure that practitioners can apply this powerful class of experiments with minimal assumptions. Our main result is the derivation of the optimal design of switchback experiments under a range of different assumptions on the order of the carryover effect, that is, the length of time a treatment persists in impacting the outcome. We cast the experimental design problem as a minimax discrete robust optimization problem, identify the worst-case adversarial strategy, establish structural results for the optimal design, and finally solve the problem via a continuous relaxation. For the optimal design, we derive two approaches for performing inference after running the experiment. The first provides exact randomization-based p-values and the second uses a finite population central limit theorem to conduct conservative hypothesis tests and build confidence intervals. We further provide theoretical results for our inferential procedures when the order of the carryover effect is misspecified. For firms that possess the capability to run multiple switchback experiments, we also provide a data-driven strategy to identify the likely order of the carryover effect. To study the empirical properties of our results, we conduct extensive simulations. We conclude the paper by providing some practical suggestions.
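
Randomization-based p-values of this kind can be approximated by Monte Carlo. The sketch below permutes period-level assignments, which implicitly assumes no carryover (an order-0 carryover effect); it illustrates the inferential idea only and is not the paper's optimal-design procedure.

```python
import random
import statistics

def mean_diff(outcomes, assign):
    """Difference in mean outcome between treated and control periods."""
    treated = [y for y, a in zip(outcomes, assign) if a == 1]
    control = [y for y, a in zip(outcomes, assign) if a == 0]
    return statistics.mean(treated) - statistics.mean(control)

def randomization_pvalue(outcomes, assign, draws=2000, seed=0):
    """Monte Carlo approximation of an exact randomization p-value:
    compare the observed mean difference with its distribution under
    re-randomized (here, permuted) period-level assignments."""
    rng = random.Random(seed)
    obs = abs(mean_diff(outcomes, assign))
    hits = 0
    for _ in range(draws):
        perm = assign[:]
        rng.shuffle(perm)   # permuting preserves the number of treated periods
        if abs(mean_diff(outcomes, perm)) >= obs:
            hits += 1
    # The +1 terms make the p-value valid for a finite number of draws.
    return (hits + 1) / (draws + 1)
```

With a nonzero-order carryover effect, the permutation distribution above is no longer the true randomization distribution, which is exactly why the paper designs and analyzes the experiment jointly.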

Methodology

Design of phase III trials with long-term survival outcomes based on short-term binary results

Pathologic complete response (pCR) is a common primary endpoint for a phase II trial or even accelerated approval of neoadjuvant cancer therapy. If approval is granted, a two-arm confirmatory trial is often required to demonstrate efficacy on a time-to-event outcome such as overall survival. However, the design of a subsequent phase III trial based on prior information on the pCR effect is not straightforward. Aiming to design such phase III trials with overall survival as the primary endpoint using pCR information from previous trials, we consider a mixture model that incorporates both the survival and the binary endpoints. We propose to base the comparison between arms on the difference of the restricted mean survival times, and show how the effect size and sample size for overall survival rely on the probability of the binary response and the survival distribution by response status, both for each treatment arm. Moreover, we provide sample size calculations under different scenarios and accompany them with an R package in which all the computations have been implemented. We evaluate our proposal with a simulation study, and illustrate its application through a neoadjuvant breast cancer trial.
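
Under a mixture of exponential survival distributions by response status, the restricted mean survival time (RMST) has a closed form, which makes the effect-size calculation easy to sketch. The rates, pCR probabilities, and horizon below are hypothetical illustrations, not values from the paper or its R package.

```python
import math

def rmst_mixture(p_resp, rate_resp, rate_nonresp, tau):
    """Restricted mean survival time up to horizon tau for a two-component
    exponential mixture: responders (probability p_resp, hazard rate_resp)
    and non-responders (hazard rate_nonresp)."""
    def rmst_exp(rate):
        # Integral of the exponential survival function S(t) = exp(-rate * t)
        # from 0 to tau.
        return (1.0 - math.exp(-rate * tau)) / rate
    return p_resp * rmst_exp(rate_resp) + (1.0 - p_resp) * rmst_exp(rate_nonresp)

# Hypothetical effect size: both arms share the survival-by-response
# distributions (rates per month, 60-month horizon) but differ in pCR rate.
delta_rmst = rmst_mixture(0.45, 0.05, 0.15, 60.0) - rmst_mixture(0.25, 0.05, 0.15, 60.0)
```

In this simplified setting, the between-arm RMST difference is driven entirely by the difference in pCR probability, mirroring how the design translates a binary-endpoint effect into a survival effect size.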

Methodology

Designing Experiments Informed by Observational Studies

The increasing availability of passively observed data has yielded a growing methodological interest in "data fusion." These methods involve merging data from observational and experimental sources to draw causal conclusions -- and they typically require a precarious tradeoff between the unknown bias in the observational dataset and the often-large variance in the experimental dataset. We propose an alternative approach to leveraging observational data, which avoids this tradeoff: rather than using observational data for inference, we use it to design a more efficient experiment. We consider the case of a stratified experiment with a binary outcome, and suppose pilot estimates for the stratum potential outcome variances can be obtained from the observational study. We extend results from Zhao et al. (2019) in order to generate confidence sets for these variances, while accounting for the possibility of unmeasured confounding. Then, we pose the experimental design problem as one of regret minimization, subject to the constraints imposed by our confidence sets. We show that this problem can be converted into a convex minimization and solved using conventional methods. Lastly, we demonstrate the practical utility of our methods using data from the Women's Health Initiative.
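
A simplified version of the design idea is Neyman allocation computed from worst-case (upper-confidence) pilot standard deviations; the paper's actual procedure solves a regret-minimization problem over full confidence sets for the potential outcome variances. A sketch, with illustrative names:

```python
def neyman_allocation(stratum_sizes, sd_upper, total_n):
    """Neyman allocation n_k proportional to N_k * s_k, using upper confidence
    bounds on the stratum standard deviations as a worst-case stand-in for the
    unknown potential-outcome variances."""
    weights = [n_k * s_k for n_k, s_k in zip(stratum_sizes, sd_upper)]
    total_w = sum(weights)
    # Rounding may make the sizes sum to slightly more or less than total_n;
    # a real design would repair this deterministically.
    return [round(total_n * w / total_w) for w in weights]
```

For example, with two equally sized strata whose pilot standard-deviation bounds differ by a factor of three, the noisier stratum receives three times the sample.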

Methodology

Designing Transportable Experiments

We consider the problem of designing a randomized experiment on a source population to estimate the Average Treatment Effect (ATE) on a target population. We propose a novel approach which explicitly considers the target when designing the experiment on the source. Under the covariate shift assumption, we design an unbiased importance-weighted estimator for the target population's ATE. To reduce the variance of our estimator, we design a covariate balance condition (Target Balance) between the treatment and control groups based on the target population. We show that Target Balance achieves a higher variance reduction asymptotically than methods that do not consider the target population during the design phase. Our experiments illustrate that Target Balance reduces the variance even for small sample sizes.
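
An importance-weighted difference-in-means estimator for the target ATE can be sketched in a few lines. The weights w_i = p_target(x_i) / p_source(x_i) are assumed known here, and the names are illustrative; this shows only the unbiased reweighting step, not the paper's Target Balance procedure for reducing its variance.

```python
def iw_ate(y, t, w):
    """Importance-weighted difference in means: source unit i gets weight
    w[i] = p_target(x_i) / p_source(x_i), so the treated and control averages
    are reweighted toward the target covariate distribution."""
    wt = sum(wi for wi, ti in zip(w, t) if ti == 1)
    wc = sum(wi for wi, ti in zip(w, t) if ti == 0)
    mean_t = sum(wi * yi for yi, ti, wi in zip(y, t, w) if ti == 1) / wt
    mean_c = sum(wi * yi for yi, ti, wi in zip(y, t, w) if ti == 0) / wc
    return mean_t - mean_c
```

With uniform weights this reduces to the ordinary difference in means; Target Balance acts at the design stage by choosing a randomization whose treated and control groups are balanced in the weighted covariates.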

