Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Erin Holve is active.

Publication


Featured research published by Erin Holve.


Medical Care | 2012

Opportunities and challenges for comparative effectiveness research (CER) with electronic clinical data: a perspective from the EDM Forum.

Erin Holve; Courtney Segal; Marianne Hamilton Lopez

Introduction: The Electronic Data Methods (EDM) Forum brings together perspectives from the Prospective Outcome Systems using Patient-specific Electronic data to Compare Tests and therapies (PROSPECT) studies, the Scalable Distributed Research Networks, and the Enhanced Registries projects. This paper discusses challenges faced by the research teams as part of their efforts to develop electronic clinical data (ECD) infrastructure to support comparative effectiveness research (CER). The findings reflect a set of opportunities for transdisciplinary learning, and will ideally enhance the transparency and generalizability of CER using ECD. Methods: Findings are based on 6 exploratory site visits conducted under naturalistic inquiry in the spring of 2011. Themes, challenges, and innovations were identified in the visit summaries through coding, keyword searches, and review for complex concepts. Results: The identified overarching challenges and emerging opportunities include: the substantial level of effort to establish and sustain data sharing partnerships; the importance of understanding the strengths and limitations of clinical informatics tools, platforms, and models that have emerged to enable research with ECD; the need for rigorous methods to assess data validity, quality, and context for multisite studies; and emerging opportunities to achieve meaningful patient and consumer engagement and work collaboratively with multidisciplinary teams. Discussion: The new infrastructure must evolve to serve a diverse set of potential users and must scale to address a range of CER or patient-centered outcomes research (PCOR) questions. To achieve this aim of improving the quality, transparency, and reproducibility of CER and PCOR, a high level of collaboration and support is necessary to foster partnership and best practices as part of the EDM Forum.


BMC Health Services Research | 2009

Health services research doctoral core competencies

Christopher B. Forrest; Diane P. Martin; Erin Holve; Anne Millman

This manuscript presents an initial description of doctoral level core competencies for health services research (HSR). The competencies were developed by a review of the literature, text analysis of institutional accreditation self-studies submitted to the Council on Education for Public Health, and a consensus conference of HSR educators from US educational institutions. The competencies are described in broad terms which reflect the unique expertise, interests, and preferred learning methods of academic HSR programs. This initial set of core competencies is published to generate further dialogue within and outside of the US about the most important learning objectives and methods for HSR training and to clarify the unique skills of HSR training program graduates.


Medical Care | 2012

Building the informatics infrastructure for comparative effectiveness research (CER): a review of the literature.

Marianne Hamilton Lopez; Erin Holve; Indra Neil Sarkar; Courtney Segal

Background: Technological advances in clinical informatics have made large amounts of data accessible and potentially useful for research. As a result, a burgeoning literature addresses efforts to bridge the fields of health services research and biomedical informatics. The Electronic Data Methods Forum review examines peer-reviewed literature at the intersection of comparative effectiveness research and clinical informatics. The authors are specifically interested in characterizing this literature and identifying cross-cutting themes and gaps in the literature. Methods: A 3-step systematic literature search was conducted, including a structured search of PubMed, manual reviews of articles from selected publication lists, and manual reviews of research activities based on prospective electronic clinical data. Two thousand four hundred thirty-five citations were identified as potentially relevant. Ultimately, a full-text review was performed for 147 peer-reviewed papers. Results: One hundred thirty-two articles were selected for inclusion in the review. Of these, 88 articles are the focus of the discussion in this paper. Three types of articles were identified, including papers that: (1) provide historical context or frameworks for using clinical informatics for research, (2) describe platforms and projects, and (3) discuss issues, challenges, and applications of natural language processing. In addition, 2 cross-cutting themes emerged: the challenges of conducting research in the absence of standardized ontologies and data collection; and unique data governance concerns related to the transfer, storage, deidentification, and access to electronic clinical data. Finally, the authors identified several current gaps on important topics such as the use of clinical informatics for cohort identification, cloud computing, and single point access to research data.
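
The structured PubMed search step described in this review can be scripted; the sketch below is a minimal illustration only, assuming the Biopython Entrez client, and the query terms are invented placeholders rather than the review's actual search strategy.

```python
# Illustrative sketch only (not the review's actual search strategy): a
# structured PubMed query issued through Biopython's Entrez client, one way
# the "structured search of PubMed" step could be scripted.
from Bio import Entrez

Entrez.email = "researcher@example.org"  # NCBI asks for a contact address

# Hypothetical query terms combining CER and clinical-informatics concepts.
query = ('("comparative effectiveness research"[Title/Abstract]) AND '
         '("electronic health records"[MeSH Terms] OR '
         '"clinical informatics"[Title/Abstract])')

handle = Entrez.esearch(db="pubmed", term=query, retmax=200)
result = Entrez.read(handle)
handle.close()

print("Citations retrieved for screening:", result["Count"])
print("First PubMed IDs:", result["IdList"][:10])
```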


eGEMs (Generating Evidence & Methods to improve patient outcomes) | 2016

A Harmonized Data Quality Assessment Terminology and Framework for the Secondary Use of Electronic Health Record Data.

Michael Kahn; Tiffany J. Callahan; Juliana Barnard; Alan Bauck; Jeff Brown; Bruce N. Davidson; Hossein Estiri; Carsten Goerg; Erin Holve; Steven G. Johnson; Siaw-Teng Liaw; Marianne Hamilton-Lopez; Daniella Meeker; Toan C. Ong; Patrick B. Ryan; Ning Shang; Nicole Gray Weiskopf; Chunhua Weng; Meredith Nahm Zozus; Lisa M. Schilling

Objective: Harmonized data quality (DQ) assessment terms, methods, and reporting practices can establish a common understanding of the strengths and limitations of electronic health record (EHR) data for operational analytics, quality improvement, and research. Existing published DQ terms were harmonized to a comprehensive unified terminology with definitions and examples and organized into a conceptual framework to support a common approach to defining whether EHR data is ‘fit’ for specific uses. Materials and Methods: DQ publications, informatics and analytics experts, managers of established DQ programs, and operational manuals from several mature EHR-based research networks were reviewed to identify potential DQ terms and categories. Two face-to-face stakeholder meetings were used to vet an initial set of DQ terms and definitions that were grouped into an overall conceptual framework. Feedback received from data producers and users was used to construct a draft set of harmonized DQ terms and categories. Multiple rounds of iterative refinement resulted in a set of terms and organizing framework consisting of DQ categories, subcategories, terms, definitions, and examples. The harmonized terminology and logical framework’s inclusiveness was evaluated against ten published DQ terminologies. Results: Existing DQ terms were harmonized and organized into a framework by defining three DQ categories: (1) Conformance, (2) Completeness, and (3) Plausibility, and two DQ assessment contexts: (1) Verification and (2) Validation. The Conformance and Plausibility categories were further divided into subcategories. Each category and subcategory was defined with respect to whether the data may be verified with organizational data, or validated against an accepted gold standard, depending on proposed context and uses. The coverage of the harmonized DQ terminology was validated by successfully aligning to multiple published DQ terminologies. Discussion: Existing DQ concepts, community input, and expert review informed the development of a distinct set of terms, organized into categories and subcategories. The resulting DQ terms successfully encompassed a wide range of disparate DQ terminologies. Operational definitions were developed to provide guidance for implementing DQ assessment procedures. The resulting structure is an inclusive DQ framework for standardizing DQ assessment and reporting. While our analysis focused on the DQ issues often found in EHR data, the new terminology may be applicable to a wide range of electronic health data such as administrative, research, and patient-reported data. Conclusion: A consistent, common DQ terminology, organized into a logical framework, is an initial step in enabling data owners and users, patients, and policy makers to evaluate and communicate data quality findings in a well-defined manner with a shared vocabulary. Future work will leverage the framework and terminology to develop reusable data quality assessment and reporting methods.
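
To make the harmonized categories concrete, the sketch below (not drawn from the paper; field names, record values, and the heart-rate range are hypothetical) shows how Conformance, Completeness, and Plausibility might be expressed as simple verification-style checks over an EHR extract. Validation, in the framework's terms, would run comparable checks against an accepted external gold standard rather than against internal expectations.

```python
# Minimal sketch, not drawn from the paper: the three harmonized DQ categories
# (Conformance, Completeness, Plausibility) expressed as simple verification-
# style checks over a toy EHR extract. Field names, values, and the heart-rate
# range are hypothetical.
from datetime import date

records = [
    {"patient_id": "P001", "birth_date": date(1980, 5, 2), "heart_rate": 72},
    {"patient_id": "P002", "birth_date": None, "heart_rate": 310},
]

def conformance(rec):
    # Conformance: values agree with expected types/formats (here, an ID pattern).
    return isinstance(rec["patient_id"], str) and rec["patient_id"].startswith("P")

def completeness(rec):
    # Completeness: required fields are present and non-missing.
    return all(rec.get(f) is not None for f in ("patient_id", "birth_date", "heart_rate"))

def plausibility(rec):
    # Plausibility: values are believable against internal expectations.
    hr = rec.get("heart_rate")
    return hr is not None and 20 <= hr <= 250

for rec in records:
    print(rec["patient_id"], {
        "conformance": conformance(rec),
        "completeness": completeness(rec),
        "plausibility": plausibility(rec),
    })
```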


Medical Care | 2013

Lessons from the Electronic Data Methods Forum: collaboration at the frontier of comparative effectiveness research, patient-centered outcomes research, and quality improvement.

Erin Holve; Ned Calonge

Background: The Electronic Data Methods (EDM) Forum, with support from the Agency for Healthcare Research and Quality, exists to advance knowledge and practice on the use of electronic clinical data (ECD) for comparative effectiveness research, patient-centered outcomes research, and quality improvement (QI). The EDM Forum facilitates collaboration among the Prospective Outcome Systems using Patient-specific Electronic data to Compare Tests and therapies, Scalable Distributed Research Network, and Enhanced Registry projects funded by the Agency for Healthcare Research and Quality. Objectives: This overview describes a second set of papers commissioned by the EDM Forum, published in this supplement to Medical Care. The included papers discuss challenges and innovations from the research and QI community using ECD. Conclusions: The papers in this supplement provide lessons learned based on experiences building transparent, scalable, reusable networks for research and QI. Through these papers, and a new open access e-journal, eGEMs, the EDM Forum is working to advance the science of health research and QI using ECD to improve patient outcomes.


Journal of Comparative Effectiveness Research | 2014

Infrastructure to support learning health systems: are we there yet? Innovative solutions and lessons learned from American Recovery and Reinvestment Act CER investments

Erin Holve; Courtney Segal

The 11 big health data networks participating in the AcademyHealth Electronic Data Methods Forum represent cutting-edge efforts to harness the power of big health data for research and quality improvement. This paper is a comparative case study based on site visits to a subset of these large infrastructure grants funded through the Recovery Act. Four key issues emerge that can inform the evolution of learning health systems: the challenge of scaling the specialized expertise needed to manage and run CER networks; the delicate balance between privacy protections and the utility of distributed networks; emerging community engagement strategies; and the complexities of developing a robust business model for multi-use networks.


eGEMs (Generating Evidence & Methods to improve patient outcomes) | 2013

Ensuring Support for Research and Quality Improvement (QI) Networks: Four Pillars of Sustainability — An Emerging Framework

Erin Holve

Multi-institutional research and quality improvement (QI) projects using electronic clinical data (ECD) hold great promise for improving quality of care and patient outcomes but typically require significant infrastructure investments both to initiate and maintain the project over its duration. Consequently, it is important for these projects to think holistically about sustainability to ensure their long-term success. Four “pillars” of sustainability are discussed based on the experiences of EDM Forum grantees and other research and QI networks. These include trust and value, governance, management, and financial and administrative support. Two “foundational considerations,” adaptive capacity and policy levers, are also discussed.


eGEMs (Generating Evidence & Methods to improve patient outcomes) | 2016

Open Science and eGEMs: Our Role in Supporting a Culture of Collaboration in Learning Health Systems.

Erin Holve

“Open science” includes a variety of approaches to facilitate greater access to data and the information produced by processes of scientific inquiry. Recently, the health sciences community has been grappling with potential pathways and models to achieve the goals of open science, namely to create and rapidly share reproducible health research. This commentary discusses eGEMs’ continued dedication to publishing innovative, useful, and timely research that contributes to the push toward open science, the journal’s milestones to date, and the EDM Forum’s new data sharing platform, CIELO. Although strides have been made, more work remains to help the health sciences community truly embrace open science.


Journal of Comparative Effectiveness Research | 2014

American Recovery and Reinvestment Act-comparative effectiveness research infrastructure investments: emerging data resources, tools and publications

Courtney Segal; Erin Holve

The Recovery Act provided a substantial, one-time investment in data infrastructure for comparative effectiveness research (CER). A review of the publications, data, and tools developed as a result of this support has informed understanding of the level of effort undertaken by these projects. Structured search queries, as well as outreach efforts, were conducted to identify and review resources from American Recovery and Reinvestment Act of 2009 CER projects building electronic clinical data infrastructure. The findings from this study reveal a spectrum of productivity across a range of topics and settings: a total of 451 manuscripts published in 192 journals and 141 data resources and tools were identified, and these address gaps in evidence on priority populations and conditions as well as in the infrastructure needed to support CER.


Journal of Medical Internet Research | 2017

Enabling Open Science for Health Research: Collaborative Informatics Environment for Learning on Health Outcomes (CIELO)

Philip R. O. Payne; Omkar Lele; Beth Johnson; Erin Holve

Background: There is an emergent and intensive dialogue in the United States with regard to the accessibility, reproducibility, and rigor of health research. This discussion is also closely aligned with the need to identify sustainable ways to expand the national research enterprise and to generate actionable results that can be applied to improve the nation’s health. The principles and practices of Open Science offer a promising path to address both goals by facilitating (1) increased transparency of data and methods, which promotes research reproducibility and rigor; and (2) cumulative efficiencies wherein research tools and the output of research are combined to accelerate the delivery of new knowledge in proximal domains, thereby resulting in greater productivity and a reduction in redundant research investments. Objectives: AcademyHealth’s Electronic Data Methods (EDM) Forum implemented a proof-of-concept open science platform for health research called the Collaborative Informatics Environment for Learning on Health Outcomes (CIELO). Methods: The EDM Forum conducted a user-centered design process to elucidate important and high-level requirements for creating and sustaining an open science paradigm. Results: By implementing CIELO and engaging a variety of potential users in its public beta testing, the EDM Forum has been able to elucidate a broad range of stakeholder needs and requirements related to the use of an open science platform focused on health research in a variety of “real world” settings. Conclusions: Our initial design and development experience over the course of the CIELO project has provided the basis for a vigorous dialogue between stakeholder community members regarding the capabilities that will add the greatest value to an open science platform for the health research community. A number of important questions around user incentives, sustainability, and scalability will require further community dialogue and agreement.

Collaboration


Dive into Erin Holve's collaborations.

Top Co-Authors

Courtney Segal, University of Washington
Jon R. Gabel, University of North Carolina at Chapel Hill
Diane Rowland, Kaiser Family Foundation
Gary Claxton, Kaiser Family Foundation