
Publication


Featured research published by Hawre Jalal.


Journal of the National Cancer Institute | 2012

Network Meta-analysis of Margin Threshold for Women With Ductal Carcinoma In Situ

Shi-Yi Wang; Haitao Chu; Tatyana Shamliyan; Hawre Jalal; Karen M. Kuntz; Robert L. Kane; Beth A Virnig

BACKGROUND Negative margins are associated with reduced risk of ipsilateral breast tumor recurrence (IBTR) for women with ductal carcinoma in situ (DCIS) treated with breast-conserving surgery (BCS). However, there is no consensus about the best minimum margin width. METHODS We searched the PubMed database for studies of DCIS published in English between January 1970 and July 2010 and examined the relationship between IBTR and margin status after BCS for DCIS. Women with DCIS were stratified into two groups, BCS with or without radiotherapy. We used frequentist and Bayesian approaches to estimate the odds ratios (OR) of IBTR for groups with negative margins and positive margins. We further examined specific margin thresholds using mixed treatment comparisons and meta-regression techniques. All statistical tests were two-sided. RESULTS We identified 21 studies published in 24 articles. A total of 1066 IBTR events occurred in 7564 patients, including BCS alone (565 IBTR events in 3098 patients) and BCS with radiotherapy (501 IBTR events in 4466 patients). Compared with positive margins, negative margins were associated with reduced risk of IBTR in patients with radiotherapy (OR = 0.46, 95% credible interval [CrI] = 0.35 to 0.59), and in patients without radiotherapy (OR = 0.34, 95% CrI = 0.24 to 0.47). Compared with patients with positive margins, the risk of IBTR for patients with negative margins was smaller (negative margin >0 mm, OR = 0.45, 95% CrI = 0.38 to 0.53; >2 mm, OR = 0.38, 95% CrI = 0.28 to 0.51; >5 mm, OR = 0.55, 95% CrI = 0.15 to 1.30; and >10 mm, OR = 0.17, 95% CrI = 0.12 to 0.24). Compared with a negative margin greater than 2 mm, a negative margin of at least 10 mm was associated with a lower risk of IBTR (OR = 0.46, 95% CrI = 0.29 to 0.69). We found a probability of .96 that a negative margin threshold greater than 10 mm is the best option compared with other margin thresholds. 
CONCLUSIONS Negative surgical margins should be obtained for DCIS patients after BCS regardless of radiotherapy. Within cosmetic constraints, surgeons should attempt to achieve the widest possible negative margins in the first attempt. More studies are needed to determine whether margin thresholds greater than 10 mm are warranted.
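The pooled odds ratios above come from frequentist and Bayesian models; as a minimal illustration of the frequentist building block, an odds ratio with a Wald 95% confidence interval can be computed from a 2x2 table. The counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a/b = events/non-events in one group, c/d = in the other."""
    or_ = (a / b) / (c / d)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for illustration only:
# 50 recurrences among 1000 women with negative margins,
# 100 recurrences among 1000 women with positive margins.
or_, lo, hi = odds_ratio_ci(50, 950, 100, 900)
```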


Medical Decision Making | 2015

Computing Expected Value of Partial Sample Information from Probabilistic Sensitivity Analysis Using Linear Regression Metamodeling

Hawre Jalal; Jeremy D. Goldhaber-Fiebert; Karen M. Kuntz

Decision makers often desire both guidance on the most cost-effective interventions given current knowledge and also the value of collecting additional information to improve the decisions made (i.e., from value of information [VOI] analysis). Unfortunately, VOI analysis remains underused due to the conceptual, mathematical, and computational challenges of implementing Bayesian decision-theoretic approaches in models of sufficient complexity for real-world decision making. In this study, we propose a novel practical approach for conducting VOI analysis using a combination of probabilistic sensitivity analysis, linear regression metamodeling, and unit normal loss integral function—a parametric approach to VOI analysis. We adopt a linear approximation and leverage a fundamental assumption of VOI analysis, which requires that all sources of prior uncertainties be accurately specified. We provide examples of the approach and show that the assumptions we make do not induce substantial bias but greatly reduce the computational time needed to perform VOI analysis. Our approach avoids the need to analytically solve or approximate joint Bayesian updating, requires only one set of probabilistic sensitivity analysis simulations, and can be applied in models with correlated input parameters.
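A rough sketch of the proposed combination, a linear regression metamodel fit to PSA output plus the unit normal loss integral, for the simplest two-strategy case (all data simulated and all numbers hypothetical; the paper's implementation is more general):

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical PSA sample: incremental net benefit (INB) of strategy B
# vs. A, linear in two standardized uncertain parameters plus noise.
theta = rng.standard_normal((10_000, 2))
inb = 200 + 500 * theta[:, 0] - 300 * theta[:, 1] + rng.normal(0, 50, 10_000)

# 1) Linear regression metamodel: INB ~ intercept + theta
X = np.column_stack([np.ones(len(inb)), theta])
beta, *_ = np.linalg.lstsq(X, inb, rcond=None)

def unl(z):
    """Unit normal loss integral L(z) = phi(z) - z * (1 - Phi(z))."""
    phi = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    Phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return phi - z * (1 - Phi)

# 2) Parametric EVPPI for theta_0: sd of the fitted INB component
#    attributable to theta_0, scaled by the loss at |E[INB]| / sd.
mu = beta[0]
sigma = abs(beta[1]) * theta[:, 0].std()
evppi = sigma * unl(abs(mu) / sigma)
```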


Medical Decision Making | 2013

Linear regression metamodeling as a tool to summarize and present simulation model results.

Hawre Jalal; Bryan Dowd; François Sainfort; Karen M. Kuntz

Background/Objective. Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. Methods. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. Results. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Conclusions. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
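The core of the approach, regressing PSA outcomes on standardized inputs so that the intercept recovers the base-case outcome and the coefficients rank parameter importance, can be sketched as follows (a toy cost model, not the paper's cancer cure model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical PSA inputs (illustration only):
p_cure = rng.beta(20, 80, n)      # probability of cure
cost_tx = rng.gamma(100, 50, n)   # treatment cost

# Toy model outcome: expected lifetime cost of a treatment strategy.
cost = cost_tx + (1 - p_cure) * 20_000

# Standardize the inputs, then regress the outcome on them.
Z = np.column_stack([(p_cure - p_cure.mean()) / p_cure.std(),
                     (cost_tx - cost_tx.mean()) / cost_tx.std()])
X = np.column_stack([np.ones(n), Z])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)

# beta[0] recovers the expected (base-case) outcome; beta[1:] give the
# change in outcome per 1-SD change in each input (a tornado-style ranking).
```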


The Journal of the Economics of Ageing | 2016

Forecasting Trends in Disability in a Super-Aging Society: Adapting the Future Elderly Model to Japan

Brian Chen; Hawre Jalal; Hideki Hashimoto; Sze-Chuan Suen; Karen Eggleston; Michael Hurley; Lena Schoemaker; Jay Bhattacharya

Japan has experienced pronounced population aging, and now has the highest proportion of elderly adults in the world. Yet few projections of Japan's future demography go beyond estimating population by age and sex to forecast the complex evolution of the health and functioning of the future elderly. This study estimates a new state-transition microsimulation model, the Japanese Future Elderly Model (FEM), for Japan. We use the model to forecast disability and health for Japan's future elderly. Our simulation suggests that by 2040, over 27 percent of Japan's elderly will exhibit 3 or more limitations in IADLs and social functioning; almost one in 4 will experience difficulties with 3 or more ADLs; and approximately one in 5 will suffer limitations in cognitive or intellectual functioning. Since the majority of the increase in disability arises from the aging of the Japanese population, prevention efforts that reduce age-specific morbidity can help reduce the burden of disability but may have only a limited impact on reducing the overall prevalence of disability among Japanese elderly. While both age and morbidity contribute to a predicted increase in disability burden among elderly Japanese in the future, our simulation results suggest that the impact of population aging exceeds the effect of age-specific morbidity on increasing disability in Japan's future.


Arthritis Care and Research | 2016

Cost-Effectiveness of Triple Therapy Versus Etanercept Plus Methotrexate in Early Aggressive Rheumatoid Arthritis

Hawre Jalal; James R. O'Dell; S. Louis Bridges; Stacey S. Cofield; Jeffrey R. Curtis; Ted R. Mikuls; Larry W. Moreland; Kaleb Michaud

To evaluate the cost‐effectiveness of all 4 interventions in the Treatment of Early Aggressive Rheumatoid Arthritis (TEAR) clinical trial: immediate triple (IT), immediate etanercept (IE), step‐up triple (ST), and step‐up etanercept (SE). Step‐up interventions started with methotrexate and added either etanercept or sulfasalazine plus hydroxychloroquine to patients with persistent disease activity.


Journal of the American Heart Association | 2015

Cost-Effectiveness of a Statewide Campaign to Promote Aspirin Use for Primary Prevention of Cardiovascular Disease.

Tzeyu L. Michaud; Jean M. Abraham; Hawre Jalal; Russell V. Luepker; Sue Duval; Alan T. Hirsch

Background The U.S. Preventive Services Task Force in 2009 recommended increased aspirin use for primary prevention of cardiovascular disease (CVD) in men ages 45 to 79 years and women ages 55 to 79 years for whom benefit outweighs risk. This study estimated the clinical efficacy and cost-effectiveness of a statewide public and health professional awareness campaign to increase regular aspirin use among the target population in Minnesota to reduce first CVD events. Methods and Results A state-transition Markov model was developed, adopting a payer perspective and lifetime time horizon. The main outcomes of interest were quality-adjusted life years, costs, and the number of CVD events averted among those without a prior CVD history. The model was based on real-world data about campaign effectiveness from representative state-specific aspirin use and event rates, and on estimates from the scholarly literature. Implementation of a campaign was predicted to avert 9874 primary myocardial infarctions in men and 1223 primary ischemic strokes in women in the target population. Increased aspirin use was associated with as many as 7222 more major gastrointestinal bleeding episodes. The cost-effectiveness analysis indicated cost-saving results for both the male and female target populations. Conclusions Under current U.S. Preventive Services Task Force recommendations, a state public and health professional awareness campaign would likely provide clinical benefit and be economically attractive. With clinician adjudication of individual benefit and risk, mechanisms can be made available to help realize aspirin's beneficial impact on lowering the risk of primary CVD events while minimizing adverse outcomes.
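A state-transition Markov cohort model of the kind described can be sketched in a few lines; the states, transition probabilities, QALY weights, and horizon below are illustrative only, not the study's inputs:

```python
import numpy as np

# Minimal 3-state Markov cohort sketch: Well, CVD, Dead.
# Annual transition probabilities (rows sum to 1), hypothetical values.
P = np.array([[0.96, 0.03, 0.01],
              [0.00, 0.90, 0.10],
              [0.00, 0.00, 1.00]])
qaly_weights = np.array([1.0, 0.7, 0.0])
discount = 0.03

state = np.array([1.0, 0.0, 0.0])   # cohort starts in Well
total_qalys = 0.0
for year in range(40):
    # Accrue discounted QALYs, then advance the cohort one cycle.
    total_qalys += state @ qaly_weights / (1 + discount) ** year
    state = state @ P
```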


Medical Decision Making | 2017

An Overview of R in Health Decision Sciences.

Hawre Jalal; Petros Pechlivanoglou; Eline M. Krijkamp; Fernando Alarid-Escudero; Eva A. Enns; M. G. Myriam Hunink

As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. It is supported by a large community of users who have generated an extensive collection of well-documented packages and functions. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.


Medical Decision Making | 2016

Some Health States Are Better Than Others: Using Health State Rank Order to Improve Probabilistic Analyses

Jeremy D. Goldhaber-Fiebert; Hawre Jalal

Background. Probabilistic sensitivity analyses (PSA) may lead policy makers to take nonoptimal actions due to misestimates of decision uncertainty caused by ignoring correlations. We developed a method to establish joint uncertainty distributions of quality-of-life (QoL) weights exploiting ordinal preferences over health states. Methods. Our method takes as inputs independent, univariate marginal distributions for each QoL weight and a preference ordering. It establishes a correlation matrix between QoL weights intended to preserve the ordering. It samples QoL weight values from their distributions, ordering them with the correlation matrix. It calculates the proportion of samples violating the ordering, iteratively adjusting the correlation matrix until this proportion is below an arbitrarily small threshold. We compare our method with the uncorrelated method and other methods for preserving rank ordering in terms of violation proportions and fidelity to the specified marginal distributions along with PSA and expected value of partial perfect information (EVPPI) estimates, using 2 models: 1) a decision tree with 2 decision alternatives and 2) a chronic hepatitis C virus (HCV) Markov model with 3 alternatives. Results. All methods make tradeoffs between violating preference orderings and altering marginal distributions. For both models, our method simultaneously performed best, with largest performance advantages when distributions reflected wider uncertainty. For PSA, larger changes to the marginal distributions induced by existing methods resulted in differing conclusions about which strategy was most likely optimal. For EVPPI, both preference order violations and altered marginal distributions caused existing methods to misestimate the maximum value of seeking additional information, sometimes concluding that there was no value. Conclusions. 
Analysts can characterize the joint uncertainty in QoL weights to improve PSA and value-of-information estimates using open-source implementations of our method.
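The iterative scheme described, sampling from the marginal distributions with an induced correlation and tightening it until order violations are rare, might look like this minimal sketch (a Gaussian copula with a single common correlation and hypothetical uniform marginals; the paper's implementation may differ):

```python
import math
import numpy as np

rng = np.random.default_rng(7)
Phi = np.vectorize(lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2))))

def sample(n, rho):
    """Draw two QoL weights from hypothetical uniform marginals,
    coupled through a Gaussian copula with correlation rho."""
    z1 = rng.standard_normal(n)
    z2 = rho * z1 + math.sqrt(1 - rho ** 2) * rng.standard_normal(n)
    u1, u2 = Phi(z1), Phi(z2)
    q_mild = 0.55 + 0.30 * u1     # ~Uniform(0.55, 0.85)
    q_severe = 0.45 + 0.30 * u2   # ~Uniform(0.45, 0.75)
    return q_mild, q_severe

# Preference ordering: q_mild > q_severe. Raise the correlation until
# the proportion of samples violating the ordering is below a threshold.
rho, n = 0.0, 20_000
while True:
    q_mild, q_severe = sample(n, rho)
    violations = np.mean(q_mild <= q_severe)
    if violations < 0.01 or rho >= 0.99:
        break
    rho = min(rho + 0.05, 0.99)
```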


Medical Decision Making | 2018

Microsimulation Modeling for Health Decision Sciences Using R: A Tutorial

Eline M. Krijkamp; Fernando Alarid-Escudero; Eva A. Enns; Hawre Jalal; M. G. Myriam Hunink; Petros Pechlivanoglou

Microsimulation models are becoming increasingly common in the field of decision modeling for health. Because microsimulation models are computationally more demanding than traditional Markov cohort models, the use of computer programming languages in their development has become more common. R is a programming language that has gained recognition within the field of decision modeling. It has the capacity to perform microsimulation models more efficiently than software commonly used for decision modeling, incorporate statistical analyses within decision models, and produce more transparent models and reproducible results. However, no clear guidance for the implementation of microsimulation models in R exists. In this tutorial, we provide a step-by-step guide to build microsimulation models in R and illustrate the use of this guide on a simple, but transferable, hypothetical decision problem. We guide the reader through the necessary steps and provide generic R code that is flexible and can be adapted for other models. We also show how this code can be extended to address more complex model structures and provide an efficient microsimulation approach that relies on vectorization solutions.
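Although the tutorial's code is in R, the basic loop it describes, individuals tracked through health states with vectorized transition draws, can be sketched in Python as follows (a hypothetical 3-state model, not the tutorial's example):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 3-state model: 0 = Healthy, 1 = Sick, 2 = Dead.
P = np.array([[0.85, 0.10, 0.05],
              [0.00, 0.80, 0.20],
              [0.00, 0.00, 1.00]])
costs = np.array([100.0, 1000.0, 0.0])  # per-cycle cost by state

n_i, n_t = 5_000, 30
states = np.zeros(n_i, dtype=int)       # everyone starts Healthy
total_cost = np.zeros(n_i)

for t in range(n_t):
    total_cost += costs[states]
    # Vectorized transition: one uniform draw per individual, mapped
    # through the cumulative transition probabilities of their state.
    u = rng.random(n_i)
    cum = P[states].cumsum(axis=1)
    states = (u[:, None] > cum).sum(axis=1)
```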


Medical Decision Making | 2017

A Gaussian Approximation Approach for Value of Information Analysis

Hawre Jalal; Fernando Alarid-Escudero

Most decisions are associated with uncertainty. Value of information (VOI) analysis quantifies the opportunity loss associated with choosing a suboptimal intervention based on current imperfect information. VOI can inform the value of collecting additional information, resource allocation, research prioritization, and future research designs. However, in practice, VOI remains underused due to many conceptual and computational challenges associated with its application. Expected value of sample information (EVSI) is rooted in Bayesian statistical decision theory and measures the value of information from a finite sample. The past few years have witnessed a dramatic growth in computationally efficient methods to calculate EVSI, including metamodeling. However, little research has been done to simplify the experimental data collection step inherent to all EVSI computations, especially for correlated model parameters. This article proposes a general Gaussian approximation (GA) of the traditional Bayesian updating approach based on the original work by Raiffa and Schlaifer to compute EVSI. The proposed approach uses a single probabilistic sensitivity analysis (PSA) data set and involves 2 steps: 1) a linear metamodel step to compute the EVSI on the preposterior distributions and 2) a GA step to compute the preposterior distribution of the parameters of interest. The proposed approach is efficient and can be applied for a wide range of data collection designs involving multiple non-Gaussian parameters and unbalanced study designs. Our approach is particularly useful when the parameters of an economic evaluation are correlated or interact.
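The Gaussian approximation step can be illustrated for a single parameter: under a normal prior with effective sample size n0, the preposterior distribution of the posterior mean has the prior mean and the prior variance shrunk by n / (n + n0). All numbers below are hypothetical, not the paper's:

```python
import numpy as np

mu0, sigma0 = 0.70, 0.05   # prior mean and sd of the parameter
n0 = 25                    # effective prior sample size
n = 100                    # proposed study size

# Shrinkage factor: grows toward 1 (full prior variance) as n increases,
# so larger studies move the posterior mean further from the prior.
v = n / (n + n0)
preposterior_sd = sigma0 * np.sqrt(v)

# Preposterior draws of the posterior mean; in the EVSI workflow these
# would be fed through a linear metamodel of net benefit.
rng = np.random.default_rng(3)
theta_hat = rng.normal(mu0, preposterior_sd, 10_000)
```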

Collaboration


Dive into Hawre Jalal's collaboration.

Top Co-Authors

Eva A. Enns

University of Minnesota

Eline M. Krijkamp

Erasmus University Rotterdam

M. G. Myriam Hunink

Erasmus University Rotterdam

Brian Chen

University of South Carolina