Steve E. Bellan
University of Texas at Austin
Publications
Featured research published by Steve E. Bellan.
PLOS Neglected Tropical Diseases | 2012
Wolfgang Beyer; Steve E. Bellan; Gisela Eberle; Holly H. Ganz; Wayne M. Getz; Renate Haumacher; Karen A. Hilss; Werner Kilian; Judith Lazak; Wendy C. Turner; Peter C. B. Turnbull
The recent development of genetic markers for Bacillus anthracis has made it possible to monitor the spread and distribution of this pathogen during and between anthrax outbreaks. In Namibia, anthrax outbreaks occur annually in the Etosha National Park (ENP) and on private game and livestock farms. We genotyped 384 B. anthracis isolates collected between 1983 and 2010 to identify possible epidemiological correlations between anthrax outbreaks within and outside the ENP and to analyze genetic relationships between isolates from domestic and wild animals. The isolates came from 20 animal species and from the environment and were genotyped using a 31-marker multi-locus VNTR analysis (MLVA) and, in part, by twelve single nucleotide polymorphism (SNP) markers and four single nucleotide repeat (SNR) markers. A total of 37 genotypes (GT) were identified by MLVA, belonging to four SNP groups. All GTs belonged to the A-branch in the cluster and SNP analyses. Thirteen GTs were found only outside the ENP, 18 only within the ENP, and 6 both inside and outside. Genetic distance between isolates increased with increasing time between isolations. However, genetic distance between isolates at the beginning and end of the study period was relatively small, indicating that while the majority of GTs were found only sporadically, three genetically close GTs, accounting for more than four fifths of all the ENP isolates, remained dominant throughout the study period. Genetic distances among isolates were significantly greater for isolates from different host species, but this effect was small, suggesting that while species-specific ecological factors may affect exposure processes, transmission cycles in different host species are still highly interrelated. The MLVA data were further used to establish a model of the probable evolution of GTs within the endemic region of the ENP.
SNR-analysis was helpful in correlating an isolate with its source but did not elucidate epidemiological relationships.
The Lancet | 2013
Steve E. Bellan; Kathryn J. Fiorella; Dessalegn Y. Melesse; Wayne M. Getz; Brian Williams; Jonathan Dushoff
BACKGROUND The proportion of heterosexual HIV transmission in sub-Saharan Africa that occurs within cohabiting partnerships, compared with that in single people or extra-couple relationships, is widely debated. We estimated the proportional contribution of different routes of transmission to new HIV infections. As plans to use antiretroviral drugs as a strategy for population-level prevention progress, understanding the importance of different transmission routes is crucial to target intervention efforts. METHODS We built a mechanistic model of HIV transmission with Demographic and Health Survey (DHS) data for 2003-2011 on 27,201 cohabiting couples (men aged 15-59 years and women aged 15-49 years) from 18 sub-Saharan African countries with information about relationship duration, age at sexual debut, and HIV serostatus. We combined this model with estimates of HIV survival times and country-specific estimates of HIV prevalence and coverage of antiretroviral therapy (ART). We then estimated the proportion of recorded infections in surveyed cohabiting couples that occurred before couple formation, between couple members, and through extra-couple intercourse. FINDINGS In surveyed couples, we estimated that extra-couple transmission accounted for 27-61% of all HIV infections in men and 21-51% of those in women, with ranges reflecting intercountry variation. We estimated that in 2011, extra-couple transmission accounted for 32-65% of new incident HIV infections in men in cohabiting couples, and 10-47% of new infections in women in such couples. Our findings suggest that transmission within couples occurs largely from men to women; however, women have a very high-risk period before couple formation. INTERPRETATION Because of the large contribution of extra-couple transmission to new HIV infections, interventions for HIV prevention should target the general sexually active population and not only serodiscordant couples.
FUNDING US National Institutes of Health, US National Science Foundation, and J S McDonnell Foundation.
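The route-attribution logic at the heart of this analysis can be illustrated with a toy competing-hazards simulation. This is not the authors' mechanistic model: the hazard rates, durations, and the simplification that the partner is infectious throughout cohabitation are all illustrative assumptions.

```python
import random

def classify_routes(n=100_000, pre_years=5.0, couple_years=8.0,
                    h_pre=0.02, h_within=0.06, h_extra=0.01, seed=1):
    """Toy competing-exponential-hazards sketch: each person faces a
    pre-couple infection hazard before couple formation, then competing
    within-couple and extra-couple hazards afterwards; whichever event
    fires first fixes the transmission route.  All rates are invented
    for illustration, not estimates from the study."""
    rng = random.Random(seed)
    counts = {"pre-couple": 0, "within-couple": 0, "extra-couple": 0}
    infected = 0
    for _ in range(n):
        if rng.expovariate(h_pre) < pre_years:
            counts["pre-couple"] += 1
            infected += 1
            continue
        t_within = rng.expovariate(h_within)
        t_extra = rng.expovariate(h_extra)
        if min(t_within, t_extra) < couple_years:
            route = "within-couple" if t_within < t_extra else "extra-couple"
            counts[route] += 1
            infected += 1
    # return each route's share of all observed infections
    return {k: v / infected for k, v in counts.items()}

shares = classify_routes()
```

With these made-up rates the within-couple route dominates, but the point of the real model is that the pre-couple and extra-couple shares remain substantial even when per-contact couple transmission is the largest single hazard.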
The Lancet | 2014
Steve E. Bellan; Juliet R. C. Pulliam; Jonathan Dushoff; Lauren Ancel Meyers
Evidence suggests that many Ebola infections are asymptomatic, a factor overlooked by recent outbreak summaries and projections. Particularly, results from one post-Ebola outbreak serosurvey showed that 71% of seropositive individuals did not have the disease; another study reported that 46% of asymptomatic close contacts of patients with Ebola were seropositive. Although asymptomatic infections are unlikely to be infectious, they might confer protective immunity and thus have important epidemiological consequences. Although a forceful response is needed, forecasts that ignore naturally acquired immunity from asymptomatic infections overestimate incidence late in epidemics. We illustrate this point by comparing the projections of two simple models based on the Ebola epidemic in Liberia, a model that does not account for asymptomatic infections, and another that assumes 50% of infections are asymptomatic and induce protective immunity. In both models, the basic reproduction number (R0) is identical and based on published estimates. The figure shows the projected cumulative incidence through time. Although the initial outbreaks are almost identical, by Jan 10, the model without asymptomatic infections projects 50% more cumulative symptomatic cases than the model that accounts for asymptomatic infection. This difference arises because asymptomatic infection contributes to herd immunity and thereby dampens epidemic spread. Widespread asymptomatic immunity would likewise have implications for Ebola control measures and should be considered when planning intervention strategies. For instance, should a safe and effective vaccine become available, the vaccination coverage needed for elimination will depend on pre-existing immunity in the population (appendix). Immunity resulting from asymptomatic infections should reduce the intervention effort needed to interrupt transmission but might also complicate the design and interpretation of vaccine trials.
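The two-model comparison can be reproduced with a minimal sketch: a discrete-time SIR model in which, optionally, half of infections are asymptomatic, non-infectious, but immunizing. The parameter values below (R0 = 1.8, a 6-day infectious period, a population of one million) are illustrative stand-ins, not the published estimates used in the letter.

```python
def cumulative_symptomatic(r0, p_asympt, days=365, infectious_period=6.0,
                           n=1_000_000, i0=10.0):
    """Daily Euler steps of a toy SIR model (hypothetical parameters).

    Asymptomatic infections are assumed non-infectious but immunizing:
    for every symptomatic infection, p/(1-p) susceptibles are silently
    immunized, so early epidemic growth is identical in both scenarios
    while susceptibles are depleted faster when p > 0."""
    gamma = 1.0 / infectious_period
    beta = r0 * gamma
    s, i, cum = n - i0, i0, 0.0
    for _ in range(days):
        new_sympt = beta * s * i / n
        new_asympt = new_sympt * p_asympt / (1.0 - p_asympt)
        drained = min(s, new_sympt + new_asympt)   # cap at remaining S
        new_sympt = drained * (1.0 - p_asympt)
        s -= drained
        i += new_sympt - gamma * i
        cum += new_sympt
    return cum

no_asympt = cumulative_symptomatic(r0=1.8, p_asympt=0.0)
half_asympt = cumulative_symptomatic(r0=1.8, p_asympt=0.5)
# Silent infections build herd immunity without adding cases, so the
# asymptomatic-aware model projects far fewer cumulative symptomatic cases.
```

Plotting the two `cum` trajectories through time reproduces the qualitative figure described above: near-identical early growth, then divergence as silent immunity dampens spread.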
Trials and interventions are likely to target exactly those high-risk populations most likely to have been asymptomatically immunised. Thus, for assessment of vaccines and other countermeasures, baseline serum should be collected to improve both estimates of intervention effectiveness and our understanding of asymptomatic immunity. Additionally, assessment of intervention measures should account for the contribution of asymptomatic immunity in curbing epidemic spread. Asymptomatic infection could also potentially be directly harnessed to mitigate transmission. If individuals who have cleared asymptomatic infections could be identified reliably, and if they are indeed immune to symptomatic re-infection, they could potentially be recruited to serve as caregivers or to undertake other high-risk disease control tasks, providing a buffer akin to that of ring vaccination. Recruitment of such individuals might be preferable to enlistment of survivors of symptomatic Ebola disease because survivors might experience psychological trauma or stigmatisation and be fewer in number, in view of the asymptomatic proportions suggested in previous studies and the low survival rate of symptomatic cases. Health-care workers with natural immunity acquired from asymptomatic infection, if identified, could be allocated to care for acutely ill and infectious patients, minimising disease spread to susceptible health-care workers. The conclusions above depend on whether asymptomatic infections are common and protective against future infection. Further, strategies to leverage protective immunity will depend on the development and validation of assays that can reliably identify individuals who are effectively protected against re-infection. Previous studies have identified many asymptomatic infections using IgM and IgG antibody assays and PCR, which, although indicative of infection, do not necessarily imply protective immunity.
Evidence for long-term protective immunity reported in (symptomatic) Ebola survivors is suggestive, but the extent of protective immunity after asymptomatic infection and the identification of serological markers for protective immunity can only be definitively addressed in settings with ongoing transmission risk. As has been proposed for vaccination, the epidemic therefore provides a unique opportunity to investigate asymptomatically acquired protective immunity to Ebola virus. Although resources are scarce, now is the
PLOS ONE | 2010
Steve E. Bellan
Nearly all mathematical models of vector-borne diseases have assumed that vectors die at constant rates. However, recent empirical research suggests that mosquito mortality rates are frequently age-dependent. This work develops a simple mathematical model to assess how relaxing the classical assumption of constant mortality affects the predicted effectiveness of anti-vectorial interventions. The effectiveness of mosquito control when mosquitoes die at age-dependent rates was also compared across different extrinsic incubation periods (EIPs). Compared to a more realistic age-dependent model, constant mortality models overestimated the sensitivity of disease transmission to interventions that reduce mosquito survival. Interventions that reduce mosquito survival were also found to be slightly less effective when implemented in systems with shorter EIPs. Future transmission models that examine anti-vectorial interventions should incorporate realistic age-dependent mortality rates.
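The qualitative result can be illustrated with a back-of-envelope calculation, using the expected number of days a mosquito survives beyond the EIP as a crude proxy for transmission potential. The hazard functions and the 1.5× intervention below are illustrative assumptions, not parameters from the paper; the Gompertz hazard is calibrated so both models give the same baseline probability of surviving the EIP.

```python
import math

def infectious_days(hazard, eip=10.0, horizon=80.0, dt=0.01):
    """Expected days alive beyond the EIP, i.e. the integral of S(a)
    from eip to horizon, where S(a) = exp(-cumulative hazard to age a);
    computed with simple Euler steps."""
    cum_haz, total, a = 0.0, 0.0, 0.0
    while a < horizon:
        if a >= eip:
            total += math.exp(-cum_haz) * dt
        cum_haz += hazard(a) * dt
        a += dt
    return total

g = 0.12                                    # constant daily mortality (illustrative)
b = 0.12                                    # Gompertz shape parameter (illustrative)
a0 = 1.2 * b / (math.exp(b * 10.0) - 1.0)   # matches baseline survival to the EIP

constant = lambda a: g
gompertz = lambda a: a0 * math.exp(b * a)   # hazard rises with mosquito age

def relative_drop(hazard):
    """Proportional loss in post-EIP survival time when an intervention
    multiplies the mortality hazard by 1.5."""
    return 1.0 - (infectious_days(lambda a: 1.5 * hazard(a))
                  / infectious_days(hazard))
```

Because old (i.e. potentially infectious) mosquitoes already die quickly under the age-dependent hazard, `relative_drop(constant)` exceeds `relative_drop(gompertz)`: the constant-mortality model overstates how much the intervention suppresses transmission, which is the paper's central point.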
PLOS Medicine | 2015
Steve E. Bellan; Jonathan Dushoff; Alison P. Galvani; Lauren Ancel Meyers
Background The infectivity of the HIV-1 acute phase has been directly measured only once, from a retrospectively identified cohort of serodiscordant heterosexual couples in Rakai, Uganda. Analyses of this cohort underlie the widespread view that the acute phase is highly infectious, even more so than would be predicted from its elevated viral load, and that transmission occurring shortly after infection may therefore compromise interventions that rely on diagnosis and treatment, such as antiretroviral treatment as prevention (TasP). Here, we re-estimate the duration and relative infectivity of the acute phase, while accounting for several possible sources of bias in published estimates, including the retrospective cohort exclusion criteria and unmeasured heterogeneity in risk. Methods and Findings We estimated acute phase infectivity using two approaches. First, we combined viral load trajectories and viral load-infectivity relationships to estimate infectivity trajectories over the course of infection, under the assumption that elevated acute phase infectivity is caused by elevated viral load alone. Second, we estimated the relative hazard of transmission during the acute phase versus the chronic phase (RHacute) and the acute phase duration (dacute) by fitting a couples transmission model to the Rakai retrospective cohort using approximate Bayesian computation. Our model fit the data well and accounted for characteristics overlooked by previous analyses, including individual heterogeneity in infectiousness and susceptibility and the retrospective cohort's exclusion of couples that were recorded as serodiscordant only once before being censored by loss to follow-up, couple dissolution, or study termination. Finally, we replicated two highly cited analyses of the Rakai data on simulated data to identify biases underlying the discrepancies between previous estimates and our own.
From the Rakai data, we estimated RHacute = 5.3 (95% credibility interval [95% CrI]: 0.79–57) and dacute = 1.7 mo (95% CrI: 0.55–6.8). The wide credibility intervals reflect an inability to distinguish a long, mildly infectious acute phase from a short, highly infectious acute phase, given the 10-mo Rakai observation intervals. The total additional risk, measured as excess hazard-months attributable to the acute phase (EHMacute), can be estimated more precisely: EHMacute = (RHacute - 1) × dacute, and should be interpreted with respect to the 120 hazard-months generated by a constant untreated chronic phase infectivity over 10 y of infection. From the Rakai data, we estimated that EHMacute = 8.4 (95% CrI: -0.27 to 64). This estimate is considerably lower than previously published estimates, and consistent with our independent estimate from viral load trajectories, 5.6 (95% confidence interval: 3.3–9.1). We found that previous overestimates likely stemmed from failure to account for risk heterogeneity and bias resulting from the retrospective cohort study design. Our results reflect the interaction between the retrospective cohort exclusion criteria and high (47%) rates of censoring among incident serodiscordant couples in the Rakai study due to loss to follow-up, couple dissolution, or study termination. We estimated excess physiological infectivity during the acute phase from couples data, but not the proportion of transmission attributable to the acute phase, which would require data on the broader population's sexual network structure. Conclusions Previous EHMacute estimates relying on the Rakai retrospective cohort data range from 31 to 141. Our results indicate that these are substantial overestimates of HIV-1 acute phase infectivity, biased by unmodeled heterogeneity in transmission rates between couples and by inconsistent censoring. Elevated acute phase infectivity is therefore less likely to undermine TasP interventions than previously thought.
Heterogeneity in infectiousness and susceptibility may still play an important role in intervention success and deserves attention in future analyses.
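The excess-hazard-months formula is simple enough to check directly. Note that plugging in the point estimates gives (5.3 - 1) × 1.7 ≈ 7.3, slightly below the reported EHMacute of 8.4, because the published value summarizes the joint posterior of the product rather than multiplying posterior summaries.

```python
def excess_hazard_months(rh_acute, d_acute_months):
    """EHMacute = (RHacute - 1) * dacute, in hazard-months."""
    return (rh_acute - 1.0) * d_acute_months

point = excess_hazard_months(5.3, 1.7)    # = 4.3 * 1.7 = 7.31 hazard-months

# Relative to the 120 hazard-months accumulated over 10 years of constant
# untreated chronic-phase infectivity, the acute phase contributes only a
# small share of total transmission hazard:
acute_share = point / (point + 120.0)     # roughly 6% of lifetime hazard
```

This is the arithmetic behind the conclusion: even at the point estimates, the acute phase adds under a tenth of the hazard generated by an untreated chronic phase, whereas the earlier estimates of 31 to 141 excess hazard-months implied a far larger acute-phase contribution.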
Journal of Wildlife Diseases | 2012
Steve E. Bellan; Carrie A. Cizauskas; Jacobeth Miyen; Karen Ebersohn; Martina Küsters; Katherine C. Prager; Moritz Van Vuuren; Claude Sabeta; Wayne M. Getz
Canine distemper virus (CDV) and rabies virus (RABV) occur worldwide in wild carnivore and domestic dog populations and pose threats to wildlife conservation and public health. In Etosha National Park (ENP), Namibia, anthrax is endemic and generates carcasses frequently fed on by an unusually dense population of black-backed jackals (Canis mesomelas). Using serology, phylogenetic analyses (on samples obtained from February 2009–July 2010), and historical mortality records (1975–2011), we assessed jackal exposure to Bacillus anthracis (BA; the causal bacterial agent of anthrax), CDV, and RABV. Prevalence of antibodies against BA (95%, n=86) and CDV (71%, n=80) was relatively high, while that of antibodies against RABV was low (9%, n=81). Exposure to BA increased significantly with age, and all animals >6 mo old were antibody-positive. As with BA, prevalence of antibodies against CDV increased significantly with age, with similar age-specific trends during both years of the study. No significant effect of age was found on the prevalence of antibodies against RABV. Three of the seven animals with antibodies against RABV were monitored for more than 1 yr after sampling and showed no signs of active infection. Mortality records revealed that rabid animals are destroyed nearly every year inside the ENP tourist camps. Phylogenetic analyses demonstrated that jackal RABV in ENP is part of the same transmission cycle as other dog-jackal RABV cycles in Namibia.
Applied and Environmental Microbiology | 2013
Steve E. Bellan; Peter C. B. Turnbull; Wolfgang Beyer; Wayne M. Getz
ABSTRACT Scavenging of anthrax carcasses has long been hypothesized to play a critical role in the production of the infectious spore stage of Bacillus anthracis after host death, though empirical studies assessing this are lacking. We compared B. anthracis spore production, distribution, and survival at naturally occurring anthrax herbivore carcasses that were either experimentally caged to exclude vertebrate scavengers or left unmanipulated. We found no significant effect of scavengers on soil spore density (P > 0.05). Soil stained with terminally hemorrhaged blood and with nonhemorrhagic fluids exhibited high levels of B. anthracis spore contamination (ranging from 10^3 to 10^8 spores/g), even in the absence of vertebrate scavengers. At most of the carcass sites, we also found that spore density in samples taken from hemorrhagic-fluid-stained soil continued to increase for >4 days after host death. We conclude that scavenging by vertebrates is not a critical factor in the life cycle of B. anthracis and that anthrax control measures relying on deterrence or exclusion of vertebrate scavengers to prevent sporulation are unlikely to be effective.
Methods in Ecology and Evolution | 2013
Steve E. Bellan; Olivier Gimenez; Rémi Choquet; Wayne M. Getz
Distance sampling is widely used to estimate the abundance or density of wildlife populations. Methods to estimate wildlife mortality rates have developed largely independently from distance sampling, despite the conceptual similarities between estimation of cumulative mortality and the population density of living animals. Conventional distance sampling analyses rely on the assumption that animals are distributed uniformly with respect to transects and thus require randomized placement of transects during survey design. Because mortality events are rare, however, it is often not possible to obtain precise estimates in this way without infeasible levels of effort. A great deal of wildlife data, including mortality data, is available via road-based surveys. Interpreting these data in a distance sampling framework requires accounting for the non-uniformity of sampling. Additionally, analyses of opportunistic mortality data must account for the decline in carcass detectability through time. We develop several extensions to distance sampling theory to address these problems. We build mortality estimators in a hierarchical framework that integrates animal movement data, surveillance effort data, and motion-sensor camera trap data, respectively, to relax the uniformity assumption, account for spatiotemporal variation in surveillance effort, and explicitly model carcass detection and disappearance as competing ongoing processes. Analysis of simulated data showed that our estimators were unbiased and that their confidence intervals had good coverage. We also illustrate our approach on opportunistic carcass surveillance data acquired in 2010 during an anthrax outbreak in the plains zebra of Etosha National Park, Namibia. The methods developed here will allow researchers and managers to infer mortality rates from opportunistic surveillance data.
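The conventional estimator that this work extends can be sketched for the textbook case: a half-normal detection function g(x) = exp(-x²/2σ²), whose σ² has a closed-form maximum-likelihood estimate (the mean squared perpendicular distance) and an effective strip half-width of σ√(π/2). The simulation parameters below are arbitrary illustrative values, not data from the Etosha surveys.

```python
import math, random

def halfnormal_density(distances, total_line_km):
    """Conventional line-transect density estimate assuming a half-normal
    detection function g(x) = exp(-x^2 / (2*sigma^2)).  sigma is estimated
    by its MLE (root mean squared perpendicular distance), giving an
    effective strip half-width of sigma*sqrt(pi/2); density follows as
    n / (2 * L * effective_half_width)."""
    n = len(distances)
    sigma_hat = math.sqrt(sum(x * x for x in distances) / n)
    esw = sigma_hat * math.sqrt(math.pi / 2.0)   # effective strip half-width
    return n / (2.0 * total_line_km * esw)

# toy check: 5 animals/km^2 placed uniformly within 0.5 km of 100 km of line
rng = random.Random(42)
sigma, density, L, w = 0.1, 5.0, 100.0, 0.5
n_animals = round(density * 2 * w * L)           # 500 animals in the strip
obs = []
for _ in range(n_animals):
    x = rng.uniform(0.0, w)                      # perpendicular distance, km
    if rng.random() < math.exp(-x * x / (2 * sigma * sigma)):
        obs.append(x)                            # animal detected

estimate = halfnormal_density(obs, L)            # close to 5 animals/km^2
```

The paper's contribution is to relax exactly the pieces this sketch takes for granted: uniform animal placement relative to transects, known and constant effort, and detectability that does not decay over time.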
PLOS Biology | 2012
Steve E. Bellan; Juliet R. C. Pulliam; James Scott; Jonathan Dushoff
Modern infectious disease epidemiology builds on two independently developed fields: classical epidemiology and dynamical epidemiology. Over the past decade, integration of the two fields has increased in research practice, but training options within the fields remain distinct with few opportunities for integration in the classroom. The annual Clinic on the Meaningful Modeling of Epidemiological Data (MMED) at the African Institute for Mathematical Sciences has begun to address this gap. MMED offers participants exposure to a broad range of concepts and techniques from both epidemiological traditions. During MMED 2010 we developed a pedagogical approach that bridges the traditional distinction between classical and dynamical epidemiology and can be used at multiple educational levels, from high school to graduate level courses. The approach is hands-on, consisting of a real-time simulation of a stochastic outbreak in course participants, including realistic data reporting, followed by a variety of mathematical and statistical analyses, stemming from both epidemiological traditions. During the exercise, dynamical epidemiologists developed empirical skills such as study design and learned concepts of bias while classical epidemiologists were trained in systems thinking and began to understand epidemics as dynamic nonlinear processes. We believe this type of integrated educational tool will prove extremely valuable in the training of future infectious disease epidemiologists. We also believe that such interdisciplinary training will be critical for local capacity building in analytical epidemiology as Africa continues to produce new cohorts of well-trained mathematicians, statisticians, and scientists. 
And because the lessons draw on skills and concepts from many fields in biology, from pathogen biology, the evolutionary dynamics of host-pathogen interactions, and the ecology of infectious disease to bioinformatics, computational biology, and statistics, this exercise can be incorporated into a broad array of life sciences courses.
BMJ | 2014
Steve E. Bellan; Juliet R. C. Pulliam; Jonathan Dushoff; Lauren Ancel Meyers
Randomised controlled trials (RCTs) offer the fastest and most rigorous assessment of vaccine efficacy.1 But they are ethical only if there is “clinical equipoise”—genuine uncertainty in the medical community about whether the experimental intervention will do more good than harm.2 We argue that Ebola virus vaccine RCTs can achieve clinical equipoise without sacrificing scientific rigour by providing trial participants who develop Ebola virus disease (EVD) with enhanced supportive care and access to experimental therapeutics. Most discussions have analysed Ebola vaccine and treatment …