Network


Latest external collaborations at the country level.

Hotspot


Research topics in which Jelena Zurovac is active.

Publication


Featured research published by Jelena Zurovac.


Journal of Comparative Effectiveness Research | 2014

Lessons from comparative effectiveness research methods development projects funded under the Recovery Act

Jelena Zurovac; Dominick Esposito

BACKGROUND The American Recovery and Reinvestment Act of 2009 (ARRA) directed nearly US$29.2 million to comparative effectiveness research (CER) methods development. AIM To help inform future CER methods investments, we describe the ARRA CER methods projects, identify barriers to this research and discuss the alignment of topics with published methods development priorities. METHODS We used several existing resources and held discussions with ARRA CER methods investigators. RESULTS & CONCLUSION Although funded projects explored many identified priority topics, investigators noted that much work remains. For example, given the considerable investments in CER data infrastructure, the methods development field can benefit from additional efforts to educate researchers about the availability of new data sources and about how best to apply methods to match their research questions and data.


eGEMs (Generating Evidence & Methods to improve patient outcomes) | 2017

Analytical Methods for a Learning Health System: 3. Analysis of Observational Studies

Michael A. Stoto; Michael Oakes; Elizabeth A. Stuart; Randall S. Brown; Jelena Zurovac; Elisa L. Priest

The third paper in a series on how learning health systems can use routinely collected electronic health data (EHD) to advance knowledge and support continuous learning, this review describes how analytical methods for individual-level EHD, including regression approaches, interrupted time series (ITS) analyses, instrumental variables, and propensity score methods, can also be used to address the question of whether the intervention “works.” The two major potential sources of bias in non-experimental studies of health care interventions are that the compared treatment groups do not have the same probability of treatment or exposure, and confounding by unmeasured covariates. Although very different, the approaches presented in this chapter are all based on assumptions about data, causal relationships, and biases. For instance, regression approaches assume that the relationship between the treatment, outcome, and other variables is properly specified, that all of the variables are available for analysis (i.e., no unobserved confounders) and measured without error, and that the error term is independent and identically distributed. The instrumental variables approach requires identifying an instrument that is related to the assignment of treatment but otherwise has no direct effect on the outcome. Propensity score approaches, on the other hand, assume that there are no unobserved confounders. The epidemiological designs discussed also make assumptions, for instance that individuals can serve as their own controls. To properly address these assumptions, analysts should conduct sensitivity analyses within the assumptions of each method to assess the potential impact of what cannot be observed. Researchers also should analyze the same data with different analytical approaches that make alternative assumptions, and apply the same methods to different data sets. Finally, different analytical methods, each subject to different biases, should be used in combination and together with different designs to limit the potential for bias in the final results.
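
Of the methods this abstract names, propensity score weighting is compact enough to show concretely. The sketch below is a minimal, hypothetical Python example on simulated data; the variable names, model choices, and effect sizes are illustrative assumptions, not code or data from the paper. It estimates a propensity score by logistic regression, forms inverse-probability-of-treatment weights, and recovers the treatment effect only under the no-unobserved-confounders assumption the abstract emphasizes.

```python
# Hypothetical sketch of inverse-probability-of-treatment weighting (IPTW)
# on simulated data; not code from the paper.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000

# Simulated observational data: one measured confounder drives both
# treatment assignment and the outcome.
confounder = rng.normal(size=n)
p_treat = 1 / (1 + np.exp(-0.8 * confounder))                      # treatment depends on the confounder
treated = rng.binomial(1, p_treat)
outcome = 2.0 * treated + 1.5 * confounder + rng.normal(size=n)    # true treatment effect = 2.0

# Step 1: estimate the propensity score with logistic regression.
ps_fit = sm.Logit(treated, sm.add_constant(confounder)).fit(disp=False)
propensity = ps_fit.predict(sm.add_constant(confounder))

# Step 2: form stabilized inverse-probability-of-treatment weights.
p_marginal = treated.mean()
weights = np.where(treated == 1,
                   p_marginal / propensity,
                   (1 - p_marginal) / (1 - propensity))

# Step 3: a weighted regression of the outcome on treatment recovers the
# effect, assuming all confounders are observed and correctly modeled.
iptw_fit = sm.WLS(outcome, sm.add_constant(treated), weights=weights).fit()
print(iptw_fit.params)  # [intercept, treatment effect]; the effect should be near 2.0
```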


eGEMs (Generating Evidence & Methods to improve patient outcomes) | 2017

Analytical Methods for a Learning Health System: 1. Framing the Research Question

Michael A. Stoto; Michael Oakes; Elizabeth A. Stuart; Lucy Savitz; Elisa L. Priest; Jelena Zurovac

Learning health systems use routinely collected electronic health data (EHD) to advance knowledge and support continuous learning. Even without randomization, observational studies can play a central role as the nation’s health care system embraces comparative effectiveness research and patient-centered outcomes research. However, neither the breadth, timeliness, and volume of the available information nor sophisticated analytics allow analysts to confidently infer causal relationships from observational data. Nevertheless, depending on the research question, careful study design and appropriate analytical methods can improve the utility of EHD. The introduction to a series of four papers, this review begins with a discussion of the kinds of research questions that EHD can help address, noting how different evidence and assumptions are needed for each. We argue that when the question involves describing the current (and likely future) state of affairs, causal inference is not relevant, so randomized clinical trials (RCTs) are not necessary. When the question is whether an intervention improves outcomes of interest, causal inference is critical, but appropriately designed and analyzed observational studies can yield valid results that better balance internal and external validity than typical RCTs. When the question is one of translation and spread of innovations, a different set of questions comes into play: How and why does the intervention work? How can a model be amended or adapted to work in new settings? In these “delivery system science” settings, causal inference is not the main issue, so a range of quantitative, qualitative, and mixed research designs is needed. We then describe why RCTs are regarded as the gold standard for assessing cause and effect, how alternative approaches relying on observational data can be used to the same end, and how observational studies of EHD can be effective complements to RCTs. We also describe how RCTs can be a model for designing rigorous observational studies, building an evidence base through iterative studies that build upon each other (i.e., confirmation across multiple investigations).


Journal of Policy Analysis and Management | 2018

The Internal and External Validity of the Regression Discontinuity Design: A Meta-Analysis of 15 Within-Study Comparisons

Duncan Chaplin; Thomas D. Cook; Jelena Zurovac; Jared Coopersmith; Mariel M. Finucane; Lauren Vollmer; Rebecca E. Morris

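The abstract for this entry is not reproduced here, so the following is only a hedged illustration of the design the title refers to: a sharp regression discontinuity (RD) effect estimated by local linear regression around a cutoff. The running variable, cutoff, bandwidth, and effect size are hypothetical assumptions and are not taken from the meta-analysis.

```python
# A minimal, hypothetical sharp regression discontinuity (RD) sketch on
# simulated data; all values below are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 4_000

running = rng.uniform(-1, 1, size=n)            # running (assignment) variable
cutoff = 0.0
treated = (running >= cutoff).astype(float)     # sharp assignment rule at the cutoff
outcome = 3.0 * treated + 1.0 * running + rng.normal(scale=0.5, size=n)  # true jump = 3.0

# Local linear regression within a bandwidth of the cutoff, allowing
# separate slopes on each side; the coefficient on `treated` is the RD effect.
bandwidth = 0.25
window = np.abs(running - cutoff) <= bandwidth
centered = running[window] - cutoff
X = np.column_stack([treated[window], centered, treated[window] * centered])
rd_fit = sm.OLS(outcome[window], sm.add_constant(X)).fit()
print(rd_fit.params[1])  # estimated discontinuity at the cutoff, close to 3.0
```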


Health Services Research | 2016

Testing the Replicability of a Successful Care Management Program: Results from a Randomized Trial and Likely Explanations for Why Impacts Did Not Replicate

Greg Peterson; Jelena Zurovac; Randall S. Brown; Kenneth D. Coburn; Patricia A. Markovich; Sherry Marcantonio; William D. Clark; Anne Mutti; Cara Stepanczuk


Archive | 2014

Lessons from CER Methods Development Projects Funded Under the Recovery Act

Jelena Zurovac; Dominick Esposito


Mathematica Policy Research Reports | 2014

Effectiveness of Alternative Ways of Implementing Care Management Components in Medicare D-SNPs: The Brand New Day Study

Jelena Zurovac; Randy Brown; Bob Schmitz; Richard Chapman


Mathematica Policy Research Reports | 2013

Using Multifactorial Experiments for Comparative Effectiveness Research in Physician Practices with Electronic Health Record

Jelena Zurovac; Lorenzo Moreno; Jesse C. Crosson; Randall S. Brown; Robert Schmitz


Mathematica Policy Research Reports | 2013

Effectiveness of Alternative Ways of Implementing Care Coordination Components in Medicare D-SNPs

Jelena Zurovac; Randy Brown; Bob Schmitz

Collaboration


An overview of Jelena Zurovac's collaborations.

Top Co-Authors

Randall S. Brown (Mathematica Policy Research)
Dominick Esposito (Mathematica Policy Research)
Anne Mutti (Mathematica Policy Research)
Cara Stepanczuk (Mathematica Policy Research)
Duncan Chaplin (Mathematica Policy Research)
Greg Peterson (Mathematica Policy Research)