Elayne Kornblatt Phillips
University of Virginia
Publications
Featured research published by Elayne Kornblatt Phillips.
Cancer Epidemiology, Biomarkers & Prevention | 2006
Mary E. Ropka; Jennifer Wenzel; Elayne Kornblatt Phillips; Mir S. Siadaty; John T. Philbrick
Purpose: Individuals and families dealing with the possibility of hereditary cancer risk face numerous decisions, including whether to obtain genetic testing. The purpose of this article is to determine what is known about the rate at which people obtain cancer genetic testing. Methods: Using MEDLINE, CINAHL, and PsycINFO, plus reviewing reference lists of relevant articles, we identified 40 studies in May 2002 that addressed breast cancer–related decisions, enrolled adult participants, were published in 1990 or more recently, were peer-reviewed primary clinical studies, addressed genetic testing either alone or in combination with genetic counseling, and reported rates at which participants showed interest in and/or underwent cancer genetic testing. Information regarding study design, participants, and genetic testing uptake rates was recorded. Each article was reviewed for methodologic quality using a flexible quality review system applicable to all study types. Results: Of the 40 studies, 25 provided information about hypothetical genetic testing decisions, 14 about real decisions, and 1 about both. Mean hypothetical uptake was 66% (range, 20-96%) and real uptake was 59% (range, 25-96%). Multivariate logistic regression analyses found that decision type (real/hypothetical), personal and family history of breast cancer, and variability in sampling strategy, recruitment setting, and criteria for real and hypothetical uptake were independently associated with uptake. Our systematic review identified additional explanations for uptake variability (investigator influences, small sample sizes, variability in target populations, lack of clearly described sampling strategies, sampling methods open to bias, and variability in reporting associated risk factors). Conclusion: In addition to clinical characteristics, research methodologic issues are likely to be major determinants of variability in published breast cancer genetic testing uptake rates.
An understanding of these issues will clarify to clinicians why their clinical experience may not be congruent with published rates and help guide future research. (Cancer Epidemiol Biomarkers Prev 2006;15(5):840–55)
Journal of The American College of Surgeons | 2010
Janine Jagger; Ramon Berguer; Elayne Kornblatt Phillips; Ginger Parker; Ahmed Gomaa
BACKGROUND The operating room is a high-risk setting for occupational sharps injuries and bloodborne pathogen exposure. The requirement to provide safety-engineered devices, mandated by the Needlestick Safety and Prevention Act of 2000, has received scant attention in surgical settings. STUDY DESIGN We analyzed percutaneous injury surveillance data from 87 hospitals in the United States from 1993 through 2006, comparing injury rates in surgical and nonsurgical settings before and after passage of the law. We identified devices and circumstances associated with injuries among surgical team members. RESULTS Of 31,324 total sharps injuries, 7,186 were to surgical personnel. After the legislation, injury rates in nonsurgical settings dropped 31.6%, but increased 6.5% in surgical settings. Most injuries were caused by suture needles (43.4%), scalpel blades (17%), and syringes (12%). Three-quarters of injuries occurred during use or passing of devices. Surgeons and residents were most often original users of the injury-causing devices; nurses and surgical technicians were typically injured by devices originally used by others. CONCLUSIONS Despite legislation and advances in sharps safety technology, surgical injuries continued to increase during the period that nonsurgical injuries decreased significantly. Hospitals should comply with requirements for the adoption of safer surgical technologies, and promote policies and practices shown to substantially reduce blood exposures to surgeons, their coworkers, and patients. Although decisions affecting the safety of the surgical team lie primarily in the surgeons' hands, there are also roles for administrators, educators, and policy makers.
The New England Journal of Medicine | 2012
Elayne Kornblatt Phillips; Mark R. Conaway; Janine Jagger
According to this analysis of needlestick injuries in a sample of U.S. hospitals before and after passage of the Needlestick Safety and Prevention Act (NSPA) in 2000, the number of percutaneous injuries per 100 full-time hospital employees declined after enactment of the legislation.
Infection Control and Hospital Epidemiology | 2013
Elayne Kornblatt Phillips; Mark R. Conaway; Ginger Parker; Jane Perry; Janine Jagger
OBJECTIVE Measuring the effect of the Needlestick Safety and Prevention Act (NSPA) is challenging. No agreement exists on a common denominator for calculating injury rates. Does it make a difference? How are the law and safety-engineered devices related? What is the effect on injuries and costs? This study examines those issues in assessing the impact of the legislation on hospital worker percutaneous injuries. METHODS Using a historic prospective design, we analyzed injury data from 85 hospitals. Injury rates were calculated per 100 full-time equivalents, 100 staffed beds, and 100 admissions each year from 1995 to 2005. We compared changes for each denominator. We measured the proportion of the injury rate attributed to safety-engineered devices. Finally, we estimated a national change in injuries and associated costs. RESULTS For all denominators, a precipitous drop in injury rates of greater than one-third ([Formula: see text]) occurred in 2001, immediately following the legislation. The decrease was sustained through 2005. Concomitant with the decrease in rates, the proportion of injuries from safety-engineered devices nearly tripled ([Formula: see text]) across all denominators. We estimated annual reductions of more than 100,000 sharps injuries at a cost savings of $69-$415 million. CONCLUSIONS While the data cannot demonstrate cause and effect, the evidence suggests a reduction in hospital worker injury rates related to the NSPA, regardless of denominator. It also suggests an association between the increase in safety-engineered devices and the reduction in overall injury rates. The decreases observed translate into significant reductions in injuries and associated costs.
Infection Control and Hospital Epidemiology | 2007
Elayne Kornblatt Phillips; Alex Owusu-Ofori; Janine Jagger
Medical Care | 1990
Brent C. Williams; Elayne Kornblatt Phillips; James C. Torner; Audrey A. Irvine
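The denominator question raised in the 2013 Infection Control and Hospital Epidemiology study (injuries per 100 full-time equivalents, per 100 staffed beds, per 100 admissions) can be sketched in a few lines. Only the three denominators come from that abstract; the function names and every count below are hypothetical, invented for illustration.

```python
# Sketch: percutaneous-injury rates against three denominators, as in the
# 2013 NSPA study. All counts below are hypothetical.

def injury_rates(injuries, ftes, beds, admissions):
    """Return injury rates per 100 units of each denominator."""
    return {
        "per_100_ftes": 100 * injuries / ftes,
        "per_100_beds": 100 * injuries / beds,
        "per_100_admissions": 100 * injuries / admissions,
    }

def relative_change(before, after):
    """Fractional change from a pre-law rate to a post-law rate."""
    return (after - before) / before

# Hypothetical pre-2001 and post-2001 aggregates for one hospital group.
pre = injury_rates(injuries=900, ftes=3000, beds=450, admissions=20000)
post = injury_rates(injuries=570, ftes=3000, beds=450, admissions=20000)

for denom in pre:
    change = relative_change(pre[denom], post[denom])
    print(f"{denom}: {pre[denom]:.2f} -> {post[denom]:.2f} ({change:+.1%})")
```

Because these invented denominators are held constant, all three rates fall by the same 36.7 percent; in practice, staffing, bed counts, and admissions drift differently over a decade, which is exactly why the study checked whether its conclusions held for every denominator.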
AORN Journal | 2011
Janine Jagger; Ramon Berguer; Elayne Kornblatt Phillips; Ginger Parker; Ahmed Gomaa
To document the frequency and circumstances of bloodborne pathogen exposures among surgeons in sub-Saharan Africa, we surveyed surgeons attending the 2006 Pan-African Association of Surgeons conference. During the previous year, surgeons sustained a mean of 3.1 percutaneous injuries, which were typically caused by suture needles. They sustained a mean of 4.1 exposures to blood and body fluid, predominantly from blood splashes to the eyes. Fewer than half of the respondents reported completion of hepatitis B vaccination, and postexposure prophylaxis for human immunodeficiency virus was widely available. Surgeons reported using hands-free passing and blunt suture needles. Non-fluid-resistant cotton gowns and masks were the barrier garments worn most frequently.
American Journal of Infection Control | 2012
Jane Perry; Janine Jagger; Ginger Parker; Elayne Kornblatt Phillips; Ahmed Gomaa
This study examined the feasibility of using routinely collected information on patients enrolled in home health care to predict their subsequent use of services. Data were gathered from 1,984 episodes of care randomly sampled from home health care agencies of the Virginia Health Department. Age, sex, Medicare and Medicaid enrollment, referral source, medical diagnosis, and prognosis were used to predict the total number of visits, the duration of enrollment, and the intensity of service. Since the data were originally gathered to study the effects of the implementation of diagnosis-related groups (DRGs) on home health services, half of the patients were enrolled before and half after the implementation of DRGs. Using multiple linear regression analysis, significant amounts of variance in each measure of home health care utilization were explained by the predictor variables (R2 = 0.04 to 0.10). For example, after controlling for other predictor variables, age 75 years or older predicted longer durations of enrollment and lower intensities of service as compared with other age groups (P < 0.05), and four of 14 diagnosis categories predicted at least one measure of utilization (P < 0.05). Medicaid enrollment predicted longer durations of enrollment and lower intensities of service in home health care (P < 0.05) in the post–DRG but not the pre–DRG period. These results demonstrate the value of routinely collected information in predicting the use of home health services. To develop more accurate estimates of needs for home health services for particular groups of patients, additional information on chronic functional impairments, informal caregiving, and the chronicity of needs may be useful.
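The "variance explained" language in the home health care abstract above can be made concrete with a self-contained ordinary-least-squares sketch (normal equations, no external libraries). The predictors and outcomes below are synthetic, chosen only to mirror the study's dummy-variable setup (an age-75+ indicator and a Medicaid indicator predicting visit counts); synthetic data are far cleaner than real administrative records, so the R2 here comes out much higher than the study's 0.04 to 0.10.

```python
# Sketch of multiple linear regression and R^2 ("variance explained").
# Data are synthetic; the real study used Virginia home health records.

def fit_ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y.
    X is a list of predictor rows; an intercept column of 1s is prepended."""
    rows = [[1.0] + list(r) for r in X]
    k = len(rows[0])
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    # Gaussian elimination with partial pivoting.
    for col in range(k):
        pivot = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[pivot] = xtx[pivot], xtx[col]
        xty[col], xty[pivot] = xty[pivot], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (xty[i] - sum(xtx[i][j] * beta[j]
                                for j in range(i + 1, k))) / xtx[i][i]
    return beta  # [intercept, coefficients...]

def r_squared(X, y, beta):
    """Share of variance in y explained by the fitted model."""
    preds = [beta[0] + sum(b * x for b, x in zip(beta[1:], r)) for r in X]
    mean_y = sum(y) / len(y)
    ss_res = sum((yi - p) ** 2 for yi, p in zip(y, preds))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Synthetic episodes: (age 75+ indicator, Medicaid indicator) -> total visits.
X = [(1, 0), (0, 0), (1, 1), (0, 1), (1, 0), (0, 0), (1, 1), (0, 1)]
y = [30, 18, 40, 25, 28, 16, 38, 27]
beta = fit_ols(X, y)
print(f"R^2 = {r_squared(X, y, beta):.2f}")
```

An R2 of 0.04 to 0.10, as in the study, means the routinely collected predictors account for only 4 to 10 percent of the variation in utilization, which is why the authors suggest adding information on functional impairment and informal caregiving.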
Journal of Infection and Public Health | 2012
Elayne Kornblatt Phillips; Owen Simwale; Matthew J. Chung; Ginger Parker; Jane Perry; Janine Jagger
The Lancet | 2008
Janine Jagger; Ahmed Gomaa; Elayne Kornblatt Phillips
BACKGROUND To gauge the impact of regulatory-driven improvements in sharps disposal practices in the United States over the last 2 decades, we analyzed percutaneous injury (PI) data from a national surveillance network from 2 periods, 1993-1994 and 2006-2007, to see whether changes in disposal-related injury patterns could be detected. METHODS Data were derived from the EPINet Sharps Injury Surveillance Research Group, established in 1993 and coordinated by the International Healthcare Worker Safety Center at the University of Virginia. For the period 1993-1994, 69 hospitals contributed data; the combined average daily census for the 2 years was 24,495, and the total number of PIs reported was 7,854. For the period 2006-2007, 33 hospitals contributed data; the combined average daily census was 6,800, and the total number of PIs reported was 1,901. RESULTS In 1993-1994, 36.8% of PIs reported were related to disposal of sharp devices. In 2006-2007, this proportion was 19.3%, a 53% decline. CONCLUSIONS This comparison provides evidence that implementation of point-of-use, puncture-resistant sharps disposal containers, combined with large-scale use of safety-engineered sharp devices, has resulted in a marked decline in sharps disposal-related injury rates in the United States. The protocol for removing and replacing full sharps disposal containers remains a critical part of disposal safety.
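The figures in the EPINet disposal abstract can be combined to see where its reported decline comes from. The sketch below uses only numbers stated in the abstract; interpreting the decline as census-adjusted (disposal-related PIs per unit of average daily census) is an assumption on our part, since the abstract reports both raw counts and proportions.

```python
# Reconstructing the disposal-related injury decline from the abstract's
# figures. Treating the decline as census-adjusted is an assumption.

def disposal_rate(total_pis, avg_daily_census, disposal_share):
    """Disposal-related PIs per unit of average daily census."""
    return disposal_share * total_pis / avg_daily_census

early = disposal_rate(total_pis=7854, avg_daily_census=24495, disposal_share=0.368)
late = disposal_rate(total_pis=1901, avg_daily_census=6800, disposal_share=0.193)

decline = 1 - late / early
print(f"early={early:.4f} late={late:.4f} decline={decline:.1%}")
```

Note that the proportion alone (36.8% down to 19.3%) is a relative drop of about 48%, while the census-adjusted rate falls by roughly 54%, closer to the abstract's reported 53% decline; rounding in the published shares plausibly accounts for the remaining gap.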