Publication


Featured research published by Benjamin A. Shaby.


The Journal of Neuroscience | 2010

Quantitative relationships between huntingtin levels, polyglutamine length, inclusion body formation, and neuronal death provide novel insight into Huntington's disease molecular pathogenesis

Jason Miller; Montserrat Arrasate; Benjamin A. Shaby; Siddhartha Mitra; Eliezer Masliah; Steven Finkbeiner

An expanded polyglutamine (polyQ) stretch in the protein huntingtin (htt) induces self-aggregation into inclusion bodies (IBs) and causes Huntington's disease (HD). Defining precise relationships between early observable variables and neuronal death at the molecular and cellular levels should improve our understanding of HD pathogenesis. Here, we used an automated microscope that tracks thousands of neurons individually over their entire lifetime to quantify interconnected relationships between early variables, such as htt levels, polyQ length, and IB formation, and neuronal death in a primary striatal model of HD. The resulting model revealed that mutant htt increases the risk of death by tonically interfering with homeostatic coping mechanisms rather than producing accumulated damage to the neuron, that htt toxicity is saturable, that the rate-limiting steps for inclusion body formation and death can be traced to different conformational changes in monomeric htt, and that IB formation reduces the impact of a neuron's starting htt levels on its risk of death. Finally, the model that emerges from our quantitative measurements places critical limits on the potential mechanisms by which mutant htt might induce neurodegeneration, which should help direct future research.


Nature Chemical Biology | 2013

Proteostasis of polyglutamine varies among neurons and predicts neurodegeneration

Andrey S. Tsvetkov; Montserrat Arrasate; Sami J. Barmada; D. Michael Ando; Punita Sharma; Benjamin A. Shaby; Steven Finkbeiner

In polyglutamine (polyQ) diseases, only certain neurons die, despite widespread expression of the offending protein. PolyQ expansion may induce neurodegeneration by impairing proteostasis, but protein aggregation and toxicity tend to confound conventional measurements of protein stability. Here, we used optical pulse labeling to measure effects of polyQ expansions on the mean lifetime of a fragment of huntingtin, the protein that causes Huntington's disease, in living neurons. We show that polyQ expansion reduced the mean lifetime of mutant huntingtin within a given neuron and that the mean lifetime varied among neurons, indicating differences in their capacity to clear the polypeptide. We found that neuronal longevity is predicted by the mean lifetime of huntingtin, as cortical neurons cleared mutant huntingtin faster and lived longer than striatal neurons. Thus, cell type-specific differences in turnover capacity may contribute to cellular susceptibility to toxic proteins, and efforts to bolster proteostasis in Huntington's disease, such as protein clearance, could be neuroprotective.


The Annals of Applied Statistics | 2012

A hierarchical max-stable spatial model for extreme precipitation

Brian J. Reich; Benjamin A. Shaby

Extreme environmental phenomena such as major precipitation events manifestly exhibit spatial dependence. Max-stable processes are a class of asymptotically justified models that are capable of representing spatial dependence among extreme values. While these models satisfy modeling requirements, they are limited in their utility because their corresponding joint likelihoods are unknown for more than a trivial number of spatial locations, preventing, in particular, Bayesian analyses. In this paper, we propose a new random effects model to account for spatial dependence. We show that our specification of the random effect distribution leads to a max-stable process that has the popular Gaussian extreme value process (GEVP) as a limiting case. The proposed model is used to analyze the yearly maximum precipitation from a regional climate model.
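The random-effects construction described in this abstract can be sketched with a worked display. The specific form below (positive-stable random effects mixed through kernel weights, with conditionally independent Fréchet responses) is a standard construction of this general type and is offered as an illustration under stated assumptions, not a verbatim transcription of the paper's specification.

```latex
% Sketch of a positive-stable random-effects construction of this type
% (illustrative form; details are assumptions, not quoted from the paper).
% A_1,...,A_L are i.i.d. positive stable random effects with index
% \alpha \in (0,1), and w_l(s) \ge 0 are kernel weights with \sum_l w_l(s) = 1.
\[
  \theta(s) = \Bigl( \sum_{l=1}^{L} A_l \, w_l(s)^{1/\alpha} \Bigr)^{\alpha},
  \qquad A_l \overset{\text{iid}}{\sim} \mathrm{PS}(\alpha).
\]
% Conditional on the random effects, responses are independent across sites,
% e.g. with
\[
  \Pr\bigl( Z(s) \le z \mid \theta(s) \bigr)
    = \exp\!\Bigl[ -\bigl( \theta(s)/z \bigr)^{1/\alpha} \Bigr].
\]
% Integrating out the A_l, using E[e^{-A t}] = e^{-t^\alpha}, recovers
% unit-Frechet margins and a max-stable joint law; \alpha and the kernel
% bandwidth govern the strength of spatial dependence, with the GEVP arising
% as a limiting case as stated in the abstract.
```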


Journal of Computational and Graphical Statistics | 2014

Estimation and Prediction in Spatial Models With Block Composite Likelihoods

Jo Eidsvik; Benjamin A. Shaby; Brian J. Reich; Matthew Wheeler; Jarad Niemi

This article develops a block composite likelihood for estimation and prediction in large spatial datasets. The composite likelihood (CL) is constructed from the joint densities of pairs of adjacent spatial blocks. This allows large datasets to be split into many smaller datasets, each of which can be evaluated separately, and combined through a simple summation. Estimates for unknown parameters are obtained by maximizing the block CL function. In addition, a new method for optimal spatial prediction under the block CL is presented. Asymptotic variances for both parameter estimates and predictions are computed using Godambe sandwich matrices. The approach considerably improves computational efficiency, and the composite structure obviates the need to load entire datasets into memory at once, completely avoiding memory limitations imposed by massive datasets. Moreover, computing time can be reduced even further by distributing the operations using parallel computing. A simulation study shows that CL estimates and predictions, as well as their corresponding asymptotic confidence intervals, are competitive with those based on the full likelihood. The procedure is demonstrated on one dataset from the mining industry and one dataset of satellite retrievals. The real-data examples show that the block composite results tend to outperform two competitors: the predictive process model and fixed-rank kriging. Supplementary materials for this article are available online on the journal web site.
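A minimal sketch of the block composite likelihood idea for a Gaussian process with exponential covariance. The blocking scheme (quadrants of the unit square), the covariance model, the use of all distinct block pairs rather than only adjacent ones, and the optimizer settings are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def exp_cov(coords_a, coords_b, sigma2, phi, nugget=0.0):
    """Exponential covariance between two sets of 2-D coordinates."""
    c = sigma2 * np.exp(-cdist(coords_a, coords_b) / phi)
    if coords_a is coords_b:
        c += nugget * np.eye(len(coords_a))
    return c

def block_pairs(block_ids):
    """All distinct pairs of blocks; a full implementation would keep only
    spatially adjacent pairs to save computation."""
    blocks = np.unique(block_ids)
    return [(a, b) for i, a in enumerate(blocks) for b in blocks[i + 1:]]

def neg_block_cl(log_params, coords, y, block_ids, pairs):
    """Negative block composite log-likelihood: the sum of Gaussian log joint
    densities over pairs of blocks."""
    sigma2, phi, nugget = np.exp(log_params)
    total = 0.0
    for a, b in pairs:
        idx = np.where((block_ids == a) | (block_ids == b))[0]
        cov = exp_cov(coords[idx], coords[idx], sigma2, phi, nugget)
        _, logdet = np.linalg.slogdet(cov)
        quad = y[idx] @ np.linalg.solve(cov, y[idx])
        total += -0.5 * (logdet + quad + len(idx) * np.log(2 * np.pi))
    return -total

# Toy data: 400 sites on the unit square, split into 4 blocks by quadrant.
rng = np.random.default_rng(0)
coords = rng.uniform(0, 1, size=(400, 2))
y = rng.standard_normal(400)          # placeholder zero-mean response
block_ids = (coords[:, 0] > 0.5).astype(int) + 2 * (coords[:, 1] > 0.5)

pairs = block_pairs(block_ids)
fit = minimize(neg_block_cl, x0=np.log([1.0, 0.2, 0.1]),
               args=(coords, y, block_ids, pairs), method="Nelder-Mead")
print("estimated (sigma2, phi, nugget):", np.exp(fit.x))
```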


Proceedings of the National Academy of Sciences of the United States of America | 2017

Nrf2 mitigates LRRK2- and α-synuclein–induced neurodegeneration by modulating proteostasis

Gaia Skibinski; Vicky Hwang; Dale Michael Ando; Aaron Daub; Alicia K. Lee; Abinaya Ravisankar; Sara Modan; Mariel M. Finucane; Benjamin A. Shaby; Steven Finkbeiner

Significance: The prevailing view of nuclear factor erythroid 2-related factor (Nrf2) function in the central nervous system is that it acts by a cell-nonautonomous mechanism to activate a program of gene expression that mitigates reactive oxygen species and the damage that ensues. Our work significantly expands the biological understanding of Nrf2 by showing that Nrf2 mitigates toxicity induced by α-synuclein and leucine-rich repeat kinase 2 (LRRK2), by potently promoting neuronal protein homeostasis in a cell-autonomous and time-dependent fashion. Nrf2 accelerates the clearance of α-synuclein, shortening its half-life and leading to lower overall levels of α-synuclein. By contrast, Nrf2 promotes the aggregation of LRRK2 into inclusion bodies, leading to a significant reduction in diffuse mutant LRRK2 levels elsewhere in the neuron.

Mutations in leucine-rich repeat kinase 2 (LRRK2) and α-synuclein lead to Parkinson’s disease (PD). Disruption of protein homeostasis is an emerging theme in PD pathogenesis, making mechanisms to reduce the accumulation of misfolded proteins an attractive therapeutic strategy. We determined if activating nuclear factor erythroid 2-related factor (Nrf2), a potential therapeutic target for neurodegeneration, could reduce PD-associated neuron toxicity by modulating the protein homeostasis network. Using a longitudinal imaging platform, we visualized the metabolism and location of mutant LRRK2 and α-synuclein in living neurons at the single-cell level. Nrf2 reduced PD-associated protein toxicity by a cell-autonomous mechanism that was time-dependent. Furthermore, Nrf2 activated distinct mechanisms to handle different misfolded proteins. Nrf2 decreased steady-state levels of α-synuclein in part by increasing α-synuclein degradation. In contrast, Nrf2 sequestered misfolded diffuse LRRK2 into more insoluble and homogeneous inclusion bodies. By identifying the stress response strategies activated by Nrf2, we also highlight endogenous coping responses that might be therapeutically bolstered to treat PD.


The Annals of Applied Statistics | 2013

Extreme value analysis for evaluating ozone control strategies

Brian J. Reich; Daniel Cooley; Kristen M. Foley; Sergey L. Napelenok; Benjamin A. Shaby

Tropospheric ozone is one of six criteria pollutants regulated by the US EPA, and has been linked to respiratory and cardiovascular endpoints and adverse effects on vegetation and ecosystems. Regional photochemical models have been developed to study the impacts of emission reductions on ozone levels. The standard approach is to run the deterministic model under new emission levels and attribute the change in ozone concentration to the emission control strategy. However, running the deterministic model requires substantial computing time, and this approach does not provide a measure of uncertainty for the change in ozone levels. Recently, a reduced form model (RFM) has been proposed to approximate the complex model as a simple function of a few relevant inputs. In this paper, we develop a new statistical approach to make full use of the RFM to study the effects of various control strategies on the probability and magnitude of extreme ozone events. We fuse the model output with monitoring data to calibrate the RFM by modeling the conditional distribution of monitoring data given the RFM using a combination of flexible semiparametric quantile regression for the center of the distribution where data are abundant and a parametric extreme value distribution for the tail where data are sparse. Selected parameters in the conditional distribution are allowed to vary by the RFM value and the spatial location. Also, due to the simplicity of the RFM, we are able to embed the RFM in our Bayesian hierarchical framework to obtain a full posterior for the model input parameters, and propagate this uncertainty to the estimation of the effects of the control strategies. We use the new framework to evaluate three potential control strategies, and find that reducing mobile-source emissions has a larger impact than reducing point-source emissions or a combination of several emission sources.
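A small sketch of the "parametric extreme value distribution for the tail" piece of the fusion described above, using a generalized Pareto fit to exceedances of a high empirical quantile. The synthetic data, the 95% threshold choice, and the use of scipy's genpareto are illustrative assumptions, not the paper's Bayesian hierarchical implementation.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
ozone = rng.gamma(shape=4.0, scale=12.0, size=5000)   # synthetic daily ozone (ppb)

# Model the bulk of the distribution empirically (here: sample quantiles)
# and reserve a parametric GPD for the sparse upper tail.
threshold = np.quantile(ozone, 0.95)
exceedances = ozone[ozone > threshold] - threshold

# Fit the generalized Pareto to exceedances; floc=0 keeps the threshold fixed.
shape, loc, scale = genpareto.fit(exceedances, floc=0)

# Tail probability of exceeding a high level, e.g. 120 ppb.
level = 120.0
p_exceed_threshold = np.mean(ozone > threshold)
p_tail = p_exceed_threshold * genpareto.sf(level - threshold, shape, loc=0, scale=scale)
print(f"P(ozone > {level} ppb) ~ {p_tail:.4f}  (GPD shape={shape:.2f}, scale={scale:.1f})")
```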


Journal of Computational and Graphical Statistics | 2014

The Open-Faced Sandwich Adjustment for MCMC Using Estimating Functions

Benjamin A. Shaby

A situation frequently arises where working with the likelihood function is problematic. This can happen for several reasons—perhaps the likelihood is prohibitively computationally expensive, perhaps it lacks some robustness property, or perhaps it is simply not known for the model under consideration. In these cases, it is often possible to specify alternative functions of the parameters and the data that can be maximized to obtain asymptotically normal estimates. However, these scenarios present obvious problems if one is interested in applying Bayesian techniques. This article describes the open-faced sandwich adjustment, a way to incorporate a wide class of nonlikelihood objective functions within Bayesian-like models to obtain asymptotically valid parameter estimates and inference via MCMC. Two simulation examples show that the method provides accurate frequentist uncertainty estimates. The open-faced sandwich adjustment is applied to a Poisson spatio-temporal model to analyze an ornithology dataset from the citizen science initiative eBird. An online supplement contains an appendix with additional figures, tables, and discussion, as well as R code.
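The adjustment described here post-processes MCMC draws obtained from a non-likelihood objective so that their spread reflects a sandwich-type asymptotic variance. Below is a hedged sketch of one generic way to rescale draws to a target Godambe sandwich covariance via Cholesky factors; the matrices H (sensitivity) and J (variability) and the particular affine map are illustrative assumptions, not the paper's exact open-faced sandwich formula.

```python
import numpy as np

def sandwich_rescale(draws, H, J):
    """Affinely rescale MCMC draws (n_draws x p) about their mean so that their
    covariance matches the Godambe sandwich H^{-1} J H^{-1}.
    H: sensitivity matrix, J: variability matrix of the estimating function."""
    target = np.linalg.solve(H, np.linalg.solve(H, J).T)   # H^{-1} J H^{-1}
    center = draws.mean(axis=0)
    L_raw = np.linalg.cholesky(np.cov(draws, rowvar=False))
    L_tgt = np.linalg.cholesky(target)
    A = L_tgt @ np.linalg.inv(L_raw)      # maps raw spread onto target spread
    return center + (draws - center) @ A.T

# Toy example: raw draws whose spread is too tight relative to the sandwich.
rng = np.random.default_rng(2)
draws = rng.multivariate_normal([0.0, 1.0], [[0.5, 0.1], [0.1, 0.3]], size=4000)
H = np.array([[2.0, 0.2], [0.2, 1.5]])
J = np.array([[3.0, 0.4], [0.4, 2.5]])
adjusted = sandwich_rescale(draws, H, J)
print(np.cov(adjusted, rowvar=False))     # approximately H^{-1} J H^{-1}
```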


Journal of Applied Meteorology and Climatology | 2015

Estimating Spatially Varying Severity Thresholds of a Forest Fire Danger Rating System Using Max-Stable Extreme-Event Modeling*

Alec G. Stephenson; Benjamin A. Shaby; Brian J. Reich; Andrew L. Sullivan

Fire danger indices are used in many countries to estimate the potential fire danger and to issue warnings to local regions. The McArthur fire danger rating system is used in Australia. The McArthur forest fire danger index (FFDI) uses only meteorological elements. It combines information on wind speed, temperature, relative humidity, and recent rainfall to produce a weather index of fire potential. This index is converted into fire danger categories to serve as warnings to the local population and to estimate potential fire-suppression difficulty. FFDI values above the threshold of 75 are rated as extreme. The spatial behavior of large values of the FFDI is modeled to investigate whether a varying threshold across space may serve as a better guide for determining the onset of elevated fire danger. The authors modify and apply a statistical method that was recently developed for spatial extreme events, using a “max-stable” process to model FFDI data at approximately 17 000 data sites. The method t...
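A hedged sketch of the FFDI calculation mentioned in the abstract, using the widely cited Noble et al. (1980) regression approximation; the coefficients below come from that approximation rather than from this paper, and the drought factor is treated as a given input.

```python
import math

def ffdi(drought_factor, temp_c, rel_humidity, wind_kmh):
    """McArthur forest fire danger index via the Noble et al. (1980)
    approximation (an assumption of this sketch, not a formula quoted
    from the paper).
    drought_factor: 0-10, temp_c: deg C, rel_humidity: %, wind_kmh: km/h."""
    return 2.0 * math.exp(-0.450
                          + 0.987 * math.log(drought_factor)
                          - 0.0345 * rel_humidity
                          + 0.0338 * temp_c
                          + 0.0234 * wind_kmh)

# A hot, dry, windy day; values above 75 fall in the "extreme" category
# referenced in the abstract.
print(round(ffdi(drought_factor=10, temp_c=40, rel_humidity=10, wind_kmh=50), 1))
```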


The Annals of Applied Statistics | 2016

A Markov-switching model for heat waves

Benjamin A. Shaby; Brian J. Reich; Daniel Cooley; Cari G. Kaufman

Heat waves merit careful study because they inflict severe economic and societal damage. We use an intuitive, informal working definition of a heat wave (a persistent event in the tail of the temperature distribution) to motivate an interpretable latent state extreme value model. A latent variable with dependence in time indicates membership in the heat wave state. The strength of the temporal dependence of the latent variable controls the frequency and persistence of heat waves. Within each heat wave, temperatures are modeled using extreme value distributions, with extremal dependence across time accomplished through an extreme value Markov model. One important virtue of interpretability is that model parameters directly translate into quantities of interest for risk management, so that questions like whether heat waves are becoming longer, more severe, or more frequent are easily answered by querying an appropriate fitted model. We demonstrate the latent state model on two recent, calamitous examples: the European heat wave of 2003 and the Russian heat wave of 2010.
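A minimal simulation sketch of the latent-state idea: a two-state Markov chain indicates heat-wave membership, and temperatures in the heat-wave state are drawn from a generalized Pareto tail above a threshold. The transition probabilities, threshold, and distribution parameters are illustrative assumptions, and the extremal dependence across days within a heat wave that the paper models is omitted from this toy version.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)

# Latent two-state Markov chain: 0 = ordinary day, 1 = heat-wave day.
# A strong self-transition in state 1 makes heat waves persistent.
P = np.array([[0.97, 0.03],
              [0.20, 0.80]])

n_days, threshold = 365, 30.0        # threshold (deg C) above which the tail model applies
state = np.zeros(n_days, dtype=int)
temp = np.empty(n_days)
for t in range(n_days):
    if t > 0:
        state[t] = rng.choice(2, p=P[state[t - 1]])
    if state[t] == 1:
        # Heat-wave state: threshold plus a generalized Pareto exceedance.
        temp[t] = threshold + genpareto.rvs(c=-0.1, scale=3.0, random_state=rng)
    else:
        # Ordinary state: bulk temperatures below the tail.
        temp[t] = rng.normal(loc=22.0, scale=5.0)

# Summarize frequency and persistence of simulated heat waves.
runs = np.diff(np.flatnonzero(np.diff(np.r_[0, state, 0])))[::2]
print("number of heat waves:", len(runs),
      "| mean length (days):", runs.mean() if len(runs) else 0)
```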


Journal of Computational and Graphical Statistics | 2018

A Spatial Markov Model for Climate Extremes

Brian J. Reich; Benjamin A. Shaby

Spatial climate data are often presented as summaries of areal regions such as grid cells, either because they are the output of numerical climate models or to facilitate comparison with numerical climate model output. Extreme value analysis can benefit greatly from spatial methods that borrow information across regions. For Gaussian outcomes, a host of methods that respect the areal nature of the data are available, including conditional and simultaneous autoregressive models. However, to our knowledge, there is no such method in the spatial extreme value analysis literature. In this article, we propose a new method for areal extremes that accounts for spatial dependence using latent clustering of neighboring regions. We show that the proposed model has desirable asymptotic dependence properties and leads to relatively simple computation. Applying the proposed method to North American climate data reveals several local and continental-scale changes in the distribution of precipitation and temperature extremes over time. Supplementary material for this article is available online.

Collaboration


Dive into Benjamin A. Shaby's collaboration.

Top Co-Authors

Brian J. Reich
North Carolina State University

Daniel Cooley
Colorado State University

Gregory P. Bopp
Pennsylvania State University

Jason Miller
University of California