Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Julia Lavenberg is active.

Publication


Featured research published by Julia Lavenberg.


Journal of Nursing Administration | 2014

Hourly rounding to improve nursing responsiveness: a systematic review.

Matthew Mitchell; Julia Lavenberg; Rebecca L. Trotta; Craig A. Umscheid

OBJECTIVE: The aims of this study were to synthesize the evidence concerning the effect of hourly rounding programs on patient satisfaction with nursing care and discuss implications for nurse administrators. BACKGROUND: Patient satisfaction is a key metric that influences both hospital ratings and reimbursement. Studies have suggested that purposeful nursing rounds can improve patient satisfaction, but the evidence to date has not been systematically examined. METHODS: A systematic review of published literature and GRADE analysis of evidence regarding nursing rounds were conducted. RESULTS: There is little consistency in how results of hourly rounds were measured, precluding quantitative analysis. There is moderate-strength evidence that hourly rounding programs improve patients’ perception of nursing responsiveness. There is also moderate-strength evidence that these programs reduce patient falls and call light use. CONCLUSIONS: Nurse administrators should consider implementing an hourly rounding program while controlled trials discern the most cost-effective approach.


American Behavioral Scientist | 2004

Estimating the Effects of Interventions That are Deployed in Many Places: Place-Randomized Trials

Robert F. Boruch; Henry May; Herbert M. Turner; Julia Lavenberg; Anthony Petrosino; Dorothy de Moya; Jeremy Grimshaw; Ellen Foley

Place-randomized trials have been mounted in a variety of countries to estimate the relative effects of interventions that are intended to ameliorate problems or improve conditions in organizations and geopolitical jurisdictions. This article presents studies in which villages, police hot spots, housing developments, hospital units, schools, and other entities are the units of random allocation. The challenges to such work, approaches to meeting them, and the value added of such trials are outlined. The scientific value added includes better evidence on what works at the macro level. Web-oriented registers of such trials are being developed by the Campbell Collaboration.


Journal of Hospital Medicine | 2014

Assessing preventability in the quest to reduce hospital readmissions

Julia Lavenberg; Brian F Leas; Craig A. Umscheid; Kendal Williams; David R. Goldmann; Sunil Kripalani

Hospitals devote significant human and capital resources to eliminate hospital readmissions, prompted most recently by the Centers for Medicare and Medicaid Services (CMS) financial penalties for higher-than-expected readmission rates. Implicit in these efforts are assumptions that a significant proportion of readmissions are preventable, and preventable readmissions can be identified. Yet, no consensus exists in the literature regarding methods to determine which readmissions are reasonably preventable. In this article, we examine strengths and limitations of the CMS readmission metric, explore how preventable readmissions have been defined and measured, and discuss implications for readmission reduction efforts. Drawing on our clinical, research and operational experiences, we offer suggestions to address the key challenges in moving forward to measure and reduce preventable readmissions.


Annals of the American Academy of Political and Social Science | 2003

Populating an International Web-Based Randomized Trials Register in the Social, Behavioral, Criminological, and Education Sciences

Herbert M. Turner; Robert F. Boruch; Anthony Petrosino; Julia Lavenberg; Dorothy de Moya; Hannah R. Rothstein

Underlying the work of the Campbell Collaboration (C2) is the Sociological, Psychological, Educational, and Criminological Trials Register (C2-SPECTR). A Web-accessible database, C2-SPECTR is unique in the world. With more than 11,600 citations, it is an international register of randomized controlled trials (RCTs) or possible trials. This article describes the framework for populating C2-SPECTR, other registers that are prospective, and the practical issues of implementation. The authors discuss the growing importance of RCTs and the recent histories of organizations that have influenced this growth—the Cochrane Collaboration, the C2, and the What Works Clearinghouse. Next, the authors describe the origins of C2-SPECTR and plans to populate it and a prospective register. The authors conclude with plans for implementing the surveillance systems and the anticipated challenges in actualizing these plans.


Evidence & Policy: A Journal of Research, Debate and Practice | 2006

Information retrieval and the role of the information specialist in producing high-quality systematic reviews in the social, behavioural and education sciences

C. Anne Wade; Herbert M. Turner; Hannah R. Rothstein; Julia Lavenberg

The International Campbell Collaboration (C2) prepares, maintains, and disseminates high-quality systematic reviews in the social, behavioural and educational sciences. As part of its effort to ensure that systematic reviews are based on a set of systematic, transparent and replicable procedures, C2 has produced a set of policy briefs. One of these, the C2 Information Retrieval Policy Brief proposes policies for searching the literature for C2 reviews, addresses key issues and challenges faced by C2 reviewers, and recommends working with an Information Specialist (IS). This article illustrates how the information retrieval issues raised in the Brief have been addressed by one C2 review team, through the inclusion of an IS as an integral member of their review team. This unique approach recognises that information retrieval is a continuous and important process, requiring the ongoing expertise of a professional. Cost implications for the provision of ongoing support by an IS are briefly addressed, along with various alternative approaches.


Archive | 2016

The Penn Medicine Center for Evidence-Based Practice: Supporting the Quality, Safety, and Value of Patient Care Through Evidence-Based Practice at the Systems Level (USA)

Craig A. Umscheid; Matthew Mitchell; Brian F Leas; Julia Lavenberg; Kendal Williams; Patrick J. Brennan

The University of Pennsylvania Health System Center for Evidence-based Practice (CEP) was established in 2006 by the Office of the Chief Medical Officer to support the quality, safety and value of patient care at Penn through evidence-based practice. To accomplish this mission, CEP performs rapid systematic reviews of the scientific literature to inform local practice and policy, translates evidence into practice through the use of computerized clinical decision support (CDS) interventions and clinical pathways, and offers education in evidence-based decision making to trainees, staff and faculty. The Center includes a physician director, three research analysts, six physician and nurse liaisons, a biostatistician, a health economist and an administrator, and collaborates closely with librarians and staff in informatics and quality improvement. To date, CEP has completed over 300 rapid reviews for clinical and administrative leaders on topics ranging from formulary management to device purchasing to development of best clinical practices. CEP has also created approximately 25 CDS tools to integrate evidence into practice, and is developing a pathways program to support standardization of care throughout our growing healthcare system. Lastly, CEP has enhanced the capacity for evidence-based decision making through a novel EBM curriculum for medical students, as well as courses and workshops for housestaff, fellows, faculty, advanced practice providers and nurses. Our experience suggests hospital EPCs can efficiently synthesize and implement evidence addressing a range of clinical topics for diverse stakeholders, influence local decision making, and foster a culture of evidence-based practice, strengthening the quality, safety, and value of care provided.


BMJ Quality & Safety | 2013

068 Integrating guidelines into local clinical practice and policy using hospital-based health technology assessment

Matthew Mitchell; Brian F Leas; Julia Lavenberg; Kendal Williams; C Umscheid

Background: Most existing centres for health technology assessment (HTA) are associated with payers or government agencies, and review and analyse emerging and costly technologies. Yet, such centres can exist within individual medical centres as well, and can use HTA methods locally to synthesise, disseminate and implement best clinical practices to improve the quality, safety and value of patient care. Objectives: Describe the structure, processes and outcomes of a model of hospital-based HTA (HB-HTA) in the US, such that it can be applied elsewhere. Methods: Our academic medical centre established the Centre for Evidence-based Practice (CEP) in 2006. CEP synthesises guidelines and studies for clinical and administrative leaders to inform decision-making, integrates select syntheses into practice through clinical decision support (CDS), and provides education in evidence-based practice. Local utilisation and cost data are incorporated where appropriate. Results: Nearly 200 evidence reports have been completed to date, and over 35 reports have been integrated into CDS. The median time from project opening to first draft is 4 weeks. CEP also contracts with external organisations such as the CDC and AHRQ on systematic reviews and guidelines. Discussion: To complete reviews rapidly, we work closely with requestors to define the questions up front and limit the scope, use experienced analysts to perform high yield searches with single study reviews and extraction, and use best available evidence and existing guidelines and reviews. Implications for Guideline Developers/Users: An HB-HTA centre can develop, adapt and implement guidelines locally to support a culture of evidence-based practice and decision-making.


Cochrane Database of Systematic Reviews | 2013

'Scared Straight' and other juvenile awareness programs for preventing juvenile delinquency

Anthony Petrosino; Carolyn Turpin-Petrosino; Meghan E. Hollis-Peel; Julia Lavenberg


Campbell Systematic Reviews | 2013

Scared Straight and Other Juvenile Awareness Programs for Preventing Juvenile Delinquency: A Systematic Review

Anthony Petrosino; Carolyn Turpin-Petrosino; Meghan E. Hollis-Peel; Julia Lavenberg


Archive | 2007

Systematic Reviews and Meta-Analyses: Best Evidence on "What Works" for Criminal Justice Decision Makers

Anthony Petrosino; Julia Lavenberg

Collaboration


Dive into Julia Lavenberg's collaborations.

Top Co-Authors

Matthew Mitchell
University of Pennsylvania

Anthony Petrosino
American Academy of Arts and Sciences

Brian F Leas
University of Pennsylvania

Craig A. Umscheid
University of Pennsylvania

Herbert M. Turner
University of Pennsylvania

Kendal Williams
University of Pennsylvania

David R. Goldmann
University of Pennsylvania

Dorothy de Moya
University of Pennsylvania