Publication


Featured research published by Alexander Luedtke.


BMC Proceedings | 2011

Evaluating methods for the analysis of rare variants in sequence data

Alexander Luedtke; Scott Powers; Ashley Petersen; Alexandra Sitarik; Airat Bekmetjev; Nathan L. Tintle

A number of rare variant statistical methods have been proposed for analysis of the impending wave of next-generation sequencing data. To date, there are few direct comparisons of these methods on real sequence data. Furthermore, there is a strong need for practical advice on the proper analytic strategies for rare variant analysis. We compare four recently proposed rare variant methods (combined multivariate and collapsing, weighted sum, proportion regression, and cumulative minor allele test) on simulated phenotype and next-generation sequencing data as part of Genetic Analysis Workshop 17. Overall, we find that all analyzed methods have serious practical limitations in identifying causal genes. Specifically, no method has more than a 5% true discovery rate (percentage of truly causal genes among all those identified as significantly associated with the phenotype). Further exploration shows that all methods suffer from inflated false-positive error rates (chance that a noncausal gene will be identified as associated with the phenotype) because of population stratification and gametic phase disequilibrium between noncausal SNPs and causal SNPs. Furthermore, observed true-positive rates (chance that a truly causal gene will be identified as significantly associated with the phenotype) for each of the four methods were very low (<19%). The combination of larger than anticipated false-positive rates, low true-positive rates, and only about 1% of all genes being causal yields poor discriminatory ability for all four methods. Gametic phase disequilibrium and population stratification are important areas for further research in the analysis of rare variant data.
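
To see concretely why these numbers imply poor discriminatory ability, consider the following positive-predictive-value calculation. Only the roughly 1% prevalence of causal genes and the <19% true-positive rate come from the abstract; the 4% false-positive rate is a hypothetical value chosen purely for illustration.

```latex
% True discovery rate (positive predictive value) under illustrative rates:
% prevalence of causal genes \pi = 0.01, TPR = 0.19, hypothetical FPR = 0.04
\mathrm{TDR}
  = \frac{\pi \,\mathrm{TPR}}{\pi \,\mathrm{TPR} + (1-\pi)\,\mathrm{FPR}}
  = \frac{0.01 \times 0.19}{0.01 \times 0.19 + 0.99 \times 0.04}
  \approx 0.046
```

Even at the best observed true-positive rate, fewer than 5% of the genes flagged as significant would be truly causal, consistent with the true discovery rates reported above.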


Annals of Statistics | 2016

Statistical inference for the mean outcome under a possibly non-unique optimal treatment strategy

Alexander Luedtke; Mark J. van der Laan

We consider challenges that arise in the estimation of the mean outcome under an optimal individualized treatment strategy defined as the treatment rule that maximizes the population mean outcome, where the candidate treatment rules are restricted to depend on baseline covariates. We prove a necessary and sufficient condition for the pathwise differentiability of the optimal value, a key condition needed to develop a regular and asymptotically linear (RAL) estimator of the optimal value. The stated condition is slightly more general than the previous condition implied in the literature. We then describe an approach to obtain root-n rate confidence intervals for the optimal value even when the parameter is not pathwise differentiable. We provide conditions under which our estimator is RAL and asymptotically efficient when the mean outcome is pathwise differentiable. We also outline an extension of our approach to a multiple time point problem. All of our results are supported by simulations.
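
For readers who want the parameter in symbols, a standard single time-point formulation of the optimal rule and its value is given below; the notation is mine and is not quoted from the paper.

```latex
% W: baseline covariates, A: binary treatment, Y: outcome
% Optimal rule within the class of rules depending on W, and its value
d^{*}(W) = \arg\max_{a \in \{0,1\}} E_P\!\left[ Y \mid A = a, W \right],
\qquad
\Psi(P) = E_P\!\left\{ E_P\!\left[ Y \mid A = d^{*}(W), W \right] \right\}
```

Non-uniqueness arises on the event where the two conditional means coincide; the behavior of the "blip" E_P[Y | A = 1, W] − E_P[Y | A = 0, W] near zero is what drives the pathwise (non-)differentiability of the optimal value.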


Journal of Causal Inference | 2015

Targeted Learning of the Mean Outcome under an Optimal Dynamic Treatment Rule.

Mark J. van der Laan; Alexander Luedtke

We consider estimation of and inference for the mean outcome under the optimal dynamic two time-point treatment rule defined as the rule that maximizes the mean outcome under the dynamic treatment, where the candidate rules are restricted to depend only on a user-supplied subset of the baseline and intermediate covariates. This estimation problem is addressed in a statistical model for the data distribution that is nonparametric beyond possible knowledge about the treatment and censoring mechanism. This contrasts with the current literature, which relies on parametric assumptions. We establish that the mean of the counterfactual outcome under the optimal dynamic treatment is a pathwise differentiable parameter under conditions, and develop a targeted minimum loss-based estimator (TMLE) of this target parameter. We establish asymptotic linearity and statistical inference for this estimator under specified conditions. In a sequentially randomized trial the statistical inference relies on the second-order difference between the estimated optimal dynamic treatment rule and the true optimal rule being asymptotically negligible, which may be a problematic condition when the rule is based on multivariate time-dependent covariates. To avoid this condition, we also develop TMLEs and statistical inference for data-adaptive target parameters that are defined in terms of the mean outcome under the estimate of the optimal dynamic treatment. In particular, we develop a novel cross-validated TMLE approach that provides asymptotic inference under minimal conditions, avoiding the need for any empirical process conditions. We offer simulation results to support our theoretical findings.
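
For contrast with the fixed target parameter, the data-adaptive target parameters mentioned above replace the true optimal rule with its estimate. In symbols (my notation, written for a general candidate rule class):

```latex
% d_0: optimal rule in the candidate class; d_n: its estimate from n observations
% Fixed target (mean outcome under the optimal rule) vs. data-adaptive target
\Psi(P_0) = E_{P_0}\!\left[ Y_{d_0} \right],
\qquad
\Psi_{d_n}(P_0) = E_{P_0}\!\left[ Y_{d_n} \right]
```

Targeting the second parameter removes the requirement that the difference between the estimated and true optimal rules be asymptotically negligible, which is why the cross-validated TMLE can deliver inference under weaker conditions.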


BMC Proceedings | 2011

Evaluating methods for combining rare variant data in pathway-based tests of genetic association

Ashley Petersen; Alexandra Sitarik; Alexander Luedtke; Scott Powers; Airat Bekmetjev; Nathan L. Tintle

Analyzing sets of genes in genome-wide association studies is a relatively new approach that aims to capitalize on biological knowledge about the interactions of genes in biological pathways. This approach, called pathway analysis or gene set analysis, has not yet been applied to the analysis of rare variants. There are two competing approaches to applying pathway analysis to rare variants. In the first approach, rare variant statistics are used to generate p-values for each gene (e.g., combined multivariate and collapsing [CMC] or weighted-sum [WS]) and the gene-level p-values are combined using standard pathway analysis methods (e.g., gene set enrichment analysis or Fisher’s combined probability method). In the second approach, rare variant methods (e.g., CMC and WS) are applied directly to sets of single-nucleotide polymorphisms (SNPs) representing all SNPs within genes in a pathway. In this paper we use simulated phenotype and real next-generation sequencing data from Genetic Analysis Workshop 17 to analyze sets of rare variants using these two competing approaches. The initial results suggest substantial differences in the methods, with Fisher’s combined probability method and the direct application of the WS method yielding the best power. Evidence suggests that the WS method works well in most situations, although Fisher’s method was more likely to be optimal when the number of causal SNPs in the set was low but the risk of the causal SNPs was high.
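
As a concrete illustration of the first, two-stage approach, the sketch below combines gene-level p-values for a pathway using Fisher’s combined probability method. The p-values are placeholders, not values from the workshop data, and the function assumes the gene-level tests are independent.

```python
import math

from scipy import stats


def fisher_combined(p_values):
    """Fisher's combined probability method for k independent p-values.

    The statistic X^2 = -2 * sum(log p_i) follows a chi-squared distribution
    with 2k degrees of freedom under the global null hypothesis.
    """
    statistic = -2.0 * sum(math.log(p) for p in p_values)
    return stats.chi2.sf(statistic, df=2 * len(p_values))


# Hypothetical gene-level p-values (e.g., from CMC or WS applied gene by gene)
pathway_pvalues = [0.04, 0.20, 0.63, 0.01, 0.47]
print(fisher_combined(pathway_pvalues))  # one combined p-value for the pathway
```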


The International Journal of Biostatistics | 2016

Super-Learning of an Optimal Dynamic Treatment Rule.

Alexander Luedtke; Mark J. van der Laan

We consider the estimation of an optimal dynamic two time-point treatment rule defined as the rule that maximizes the mean outcome under the dynamic treatment, where the candidate rules are restricted to depend only on a user-supplied subset of the baseline and intermediate covariates. This estimation problem is addressed in a statistical model for the data distribution that is nonparametric, beyond possible knowledge about the treatment and censoring mechanisms. We propose data-adaptive estimators of this optimal dynamic regime, which are defined by sequential loss-based learning under both the blip function and weighted classification frameworks. Rather than a priori selecting an estimation framework and algorithm, we propose combining estimators from both frameworks using a super-learning-based cross-validation selector that seeks to minimize an appropriate cross-validated risk. The resulting selector is guaranteed to asymptotically perform as well as the best convex combination of candidate algorithms in terms of loss-based dissimilarity under conditions. We offer simulation results to support our theoretical findings.
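
To convey the flavor of the cross-validation selector, here is a deliberately simplified sketch: it returns the single candidate rule estimator with the smallest cross-validated risk, whereas the selector described above also searches over convex combinations of candidates and uses a proper loss-based risk. The candidate learners, data layout, and the crude plug-in risk below are all placeholders.

```python
import numpy as np
from sklearn.model_selection import KFold


def cv_select(candidates, X, A, Y, n_splits=5, seed=0):
    """Pick the candidate rule estimator with the smallest cross-validated risk.

    Each candidate is a callable fit(X, A, Y) -> rule, where rule(X) returns a
    0/1 treatment decision per row. Risk here is the negative mean outcome among
    validation observations whose received treatment matches the recommendation
    (a crude plug-in; a real selector would use an inverse-probability-weighted
    or doubly robust risk estimate).
    """
    risks = np.zeros(len(candidates))
    for train, valid in KFold(n_splits, shuffle=True, random_state=seed).split(X):
        for j, fit in enumerate(candidates):
            rule = fit(X[train], A[train], Y[train])
            follows = rule(X[valid]) == A[valid]
            if follows.any():
                risks[j] -= Y[valid][follows].mean()
    return candidates[int(np.argmin(risks))]
```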


Optical Engineering | 2013

Update to single-variable parametric cost models for space telescopes

H. Philip Stahl; Todd Henrichs; Alexander Luedtke; Miranda West

Parametric cost models are an important tool routinely used to plan missions, compare concepts, and justify technology investments. In 2010, the article “Single-variable parametric cost models for space telescopes” was published [H. P. Stahl et al., Opt. Eng. 49(7), 073006 (2010)]. That paper presented new single-variable cost models for space telescope optical telescope assemblies. These models were created by applying standard statistical methods to data collected from 30 different space telescope missions. The results were compared with previously published models. A postpublication independent review of that paper’s database identified several inconsistencies. To correct these inconsistencies, a two-year effort was undertaken to reconcile our database with source documents. This paper updates and revises the findings of our 2010 paper. As a result of the review, some telescopes’ data were removed, some were revised, and data for a few new telescopes were added to the database. As a consequence, there have been changes to the 2010 published results. But our two most important findings remain unchanged: aperture diameter is the primary cost driver for large space telescopes, and it costs more per kilogram to build a low-areal-density low-stiffness telescope than a more massive high-stiffness telescope. One significant difference is that we now report telescope cost to vary linearly from 5% to 30% of total mission cost, instead of the previously reported average of 20%. To fully understand the content of this update, the authors recommend that one also read the 2010 paper.
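
The “standard statistical methods” behind these single-variable models amount to fitting a power law, cost ≈ a · D^b, by ordinary least squares in log-log space. The sketch below shows only the mechanics on made-up numbers; neither the data nor the fitted exponent corresponds to the paper’s database or published coefficients.

```python
import numpy as np

# Hypothetical (aperture diameter in meters, OTA cost in $M) pairs; NOT the paper's data
diameters = np.array([0.3, 0.85, 1.1, 2.4, 3.5, 6.5])
costs = np.array([60.0, 210.0, 330.0, 1500.0, 2100.0, 8000.0])

# Fit log(cost) = log(a) + b * log(D) by ordinary least squares
b, log_a = np.polyfit(np.log(diameters), np.log(costs), deg=1)
print(f"cost ~ {np.exp(log_a):.1f} * D^{b:.2f}")  # b: sensitivity of cost to aperture
```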


Human Heredity | 2013

Optimal methods for using posterior probabilities in association testing

Keli Liu; Alexander Luedtke; Nathan L. Tintle

Objective: The use of haplotypes to impute the genotypes of unmeasured single nucleotide variants continues to rise in popularity. Simulation results suggest that the use of the dosage as a one-dimensional summary statistic of imputation posterior probabilities may be optimal both in terms of statistical power and computational efficiency; however, little theoretical understanding is available to explain and unify these simulation results. In our analysis, we provide a theoretical foundation for the use of the dosage as a one-dimensional summary statistic of genotype posterior probabilities from any technology. Methods: We analytically evaluate the dosage, the mode, and the more general set of all one-dimensional summary statistics of two-dimensional genotype posterior probability vectors (three posterior probabilities that must sum to 1). Results: We prove that the dosage is an optimal one-dimensional summary statistic under a typical linear disease model and is robust to violations of this model. Simulation results confirm our theoretical findings. Conclusions: Our analysis provides a strong theoretical basis for the use of the dosage as a one-dimensional summary statistic of genotype posterior probability vectors in related tests of genetic association across a wide variety of genetic disease models.
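
For reference, the dosage in question is simply the posterior mean of the genotype, i.e. the expected minor-allele count under the imputation posterior probabilities (p0, p1, p2):

```latex
% Dosage as the posterior expected minor-allele count
\text{dosage} = 0 \cdot p_0 + 1 \cdot p_1 + 2 \cdot p_2 = p_1 + 2 p_2,
\qquad p_0 + p_1 + p_2 = 1
```

For example, posterior probabilities (0.1, 0.3, 0.6) give a dosage of 1.5, whereas the mode would report a hard call of 2 and discard the imputation uncertainty.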


Optical Engineering | 2012

Commentary on multivariable parametric cost model for ground optical telescope assembly

Alexander Luedtke; H. Philip Stahl

In 2005, Stahl et al. published a multivariable parametric cost model for ground telescopes that included primary mirror diameter, diffraction-limited wavelength, and year of development. The model also included a factor for primary mirror segmentation and/or duplication. While the original multivariable model is still relevant, we better explain the rationale behind the model and present a model framework that may better account for prescription similarities. Also, we correct the single-variable diameter model presented in the 2005 Stahl paper with the addition of a leading multiplier.
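
For orientation only, multivariable parametric cost models of this kind are typically multiplicative power laws in the design variables. The generic form below illustrates the structure of such a framework; it is my notation and is not the fitted model or the coefficients from the 2005 paper or this commentary.

```latex
% Generic multiplicative cost-model framework (illustrative structure only)
% D: primary mirror diameter, \lambda: diffraction-limited wavelength,
% Y: year of development, S: segmentation/duplication factor
C = a \, D^{\,b} \, \lambda^{\,c} \, e^{\,d (Y - Y_{0})} \, S
```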


BMC Proceedings | 2014

Evaluation of the power and type I error of recently proposed family-based tests of association for rare variants

Allison Hainline; Carolina Alvarez; Alexander Luedtke; Brian Greco; Andrew Beck; Nathan L. Tintle

Until very recently, few methods existed to analyze rare-variant association with binary phenotypes in complex pedigrees. We consider a set of recently proposed methods applied to the simulated and real hypertension phenotype as part of the Genetic Analysis Workshop 18. Minimal power of the methods is observed for genes containing variants with weak effects on the phenotype. Application of the methods to the real hypertension phenotype yielded no genes meeting a strict Bonferroni cutoff of significance. Some prior literature connects 3 of the 5 most associated genes (p < 1 × 10⁻⁴) to hypertension or related phenotypes. Further methodological development is needed to extend these methods to handle covariates, and to explore more powerful test alternatives.


Proceedings of SPIE | 2012

Update on multivariable parametric cost models for ground and space telescopes

H. Philip Stahl; Todd Henrichs; Alexander Luedtke; Miranda West

Parametric cost models can be used by designers and project managers to perform relative cost comparisons between major architectural cost drivers and allow high-level design trades; enable cost-benefit analysis for technology development investment; and provide a basis for estimating total project cost between related concepts. This paper reports on recent revisions and improvements to our ground telescope cost model and refinements of our understanding of space telescope cost models. One interesting observation is that while space telescopes are 50X to 100X more expensive than ground telescopes, their respective scaling relationships are similar. Another interesting speculation is that the role of technology development may be different between ground and space telescopes. For ground telescopes, the data indicates that technology development tends to reduce cost by approximately 50% every 20 years. But for space telescopes, there appears to be no such cost reduction because we do not tend to re-fly similar systems. Thus, instead of reducing cost, 20 years of technology development may be required to enable a doubling of space telescope capability. Other findings include: mass should not be used to estimate cost; spacecraft and science instrument costs account for approximately 50% of total mission cost; and, integration and testing accounts for only about 10% of total mission cost.

Collaboration


Alexander Luedtke's top co-authors and their affiliations.

H. Philip Stahl (Marshall Space Flight Center)
Miranda West (University of Texas at Austin)
Todd Henrichs (Middle Tennessee State University)
Andrew Beck (Loyola University Chicago)