Greg Grandits
University of Minnesota
Publications
Featured research published by Greg Grandits.
Circulation | 1995
Philip R. Liebson; Greg Grandits; Sinda Dianzumba; Ronald J. Prineas; Richard H. Grimm; James D. Neaton; Jeremiah Stamler
BACKGROUND Increased left ventricular mass (LVM) by echocardiography is associated with increased risk of cardiovascular disease. Thus, it is of interest to compare the effects of both pharmacological and nonpharmacological approaches to the treatment of hypertension on reduction of LVM. METHODS AND RESULTS Changes in LV structure were assessed by M-mode echocardiograms in a double-blind, placebo-controlled clinical trial of 844 mild hypertensive participants randomized to nutritional-hygienic (NH) intervention plus placebo or NH plus one of five classes of antihypertensive agents: (1) diuretic (chlorthalidone), (2) beta-blocker (acebutolol), (3) alpha-antagonist (doxazosin mesylate), (4) calcium antagonist (amlodipine maleate), or (5) angiotensin-converting enzyme inhibitor (enalapril maleate). Echocardiograms were performed at baseline, at 3 months, and annually for 4 years. Changes in blood pressure averaged 16/12 mm Hg in the active treatment groups and 9/9 mm Hg in the NH only group. All groups showed significant decreases (10% to 15%) in LVM from baseline that appeared at 3 months and continued for 48 months. The chlorthalidone group experienced the greatest decrease at each follow-up visit (average decrease, 34 g), although the differences from other groups were modest (average decrease among 5 other groups, 24 to 27 g). Participants randomized to NH intervention only had mean changes in LVM similar to those in the participants randomized to NH intervention plus pharmacological treatment. The greatest difference between groups was seen at 12 months, with mean decreases ranging from 35 g (chlorthalidone group) to 17 g (acebutolol group) (P = .001 comparing all groups). Within-group analysis showed that changes in weight, urinary sodium excretion, and systolic BP were moderately correlated with changes in LVM, being statistically significant in most analyses. CONCLUSIONS NH intervention with emphasis on weight loss and reduction of dietary sodium is as effective as NH intervention plus pharmacological treatment in reducing echocardiographically determined LVM, despite a smaller decrease in blood pressure in the NH intervention only group. A possible exception is that the addition of diuretic (chlorthalidone) may have a modest additional effect on reducing LVM.
JAMA Internal Medicine | 2008
Eswar Krishnan; Kenneth H. Svendsen; James D. Neaton; Greg Grandits; Lewis H. Kuller
BACKGROUND There are limited data available on the association of gouty arthritis (gout) in middle age with long-term cardiovascular disease (CVD) mortality. METHODS We performed a 17-year follow-up study of 9105 men, aged 41 to 63 years and at above-average risk for coronary heart disease, who were randomized to the Multiple Risk Factor Intervention Trial and who did not die or have clinical or electrocardiographic evidence of coronary artery disease during the 6-year trial. The association of gout with risk of death from CVD and other causes subsequent to the sixth annual examination was assessed by means of Cox proportional hazards regression. RESULTS The unadjusted mortality rates from CVD among those with and without gout were 10.3 per 1000 person-years and 8.0 per 1000 person-years, respectively, representing an approximately 30% greater risk. After adjustment for traditional risk factors, use of diuretics and aspirin, and serum creatinine level, the hazard ratio (gout vs no gout) for coronary heart disease mortality was 1.35 (95% confidence interval [CI], 1.06-1.72). The hazard ratio for death from myocardial infarction was 1.35 (95% CI, 0.94-1.93); for death from CVD overall, 1.21 (95% CI, 0.99-1.49); and for death from any cause, 1.09 (95% CI, 1.00-1.19) (P = .04). The association between hyperuricemia and CVD was weak and did not persist when analysis was limited to men with hyperuricemia without a diagnosis of gout. CONCLUSION Among middle-aged men, a diagnosis of gout accompanied by an elevated uric acid level imparts significant independent CVD mortality risk. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00000487.
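The hazard ratios above come from Cox proportional hazards regression of time to death on gout status and covariates. As a rough illustration only (not the authors' code, and with simulated data rather than MRFIT data; the column names are invented), a model of this kind could be fit in Python with the lifelines package:

```python
# Hedged sketch: Cox proportional hazards regression of CVD death on gout status,
# adjusted for age. Data are simulated for illustration; they are not MRFIT data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
gout = rng.binomial(1, 0.05, n)                  # exposure of interest
age = rng.normal(52, 6, n)                       # age at the sixth annual exam
# Hypothetical survival times with higher hazard for gout and older age.
time = rng.exponential(30, n) * np.exp(-(0.3 * gout + 0.03 * (age - 52)))
event = (time < 17).astype(int)                  # 1 = death observed within follow-up
time = np.minimum(time, 17)                      # administrative censoring at 17 years

df = pd.DataFrame({"followup_years": time, "cvd_death": event,
                   "gout": gout, "age": age})
cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="cvd_death")
cph.print_summary()   # exp(coef) column gives hazard ratios with 95% CIs
```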
Journal of The American Society of Nephrology | 2006
Areef Ishani; Greg Grandits; Richard H. Grimm; Kenneth H. Svendsen; Allan J. Collins; Ronald J. Prineas; James D. Neaton
The incidence of ESRD is increasing rapidly. Limited information exists regarding early markers for the development of ESRD. This study aimed to determine, over 25 years, the risk for ESRD associated with proteinuria, estimated GFR (eGFR), and hematocrit in men who did not have identified kidney disease and were randomly assigned in the Multiple Risk Factor Intervention Trial (MRFIT). A total of 12,866 men who were at high risk for heart disease were enrolled (1973 to 1975) and followed through 1999. Renal replacement therapy was ascertained by matching identifiers with the United States Renal Data System; vital status was obtained from the National Death Index. Men who initiated renal replacement therapy or died as a result of kidney disease were deemed to have developed ESRD. Dipstick urine protein, eGFR, and hematocrit were related to development of ESRD. During 25 years, 213 (1.7%) men developed ESRD. Predictors of ESRD were dipstick proteinuria of 1+ or ≥2+ (hazard ratio [HR] 3.1 [95% confidence interval (CI) 1.8 to 5.4] and 15.7 [95% CI 10.3 to 23.9], respectively) and an eGFR <60 ml/min per 1.73 m² (HR 2.4; 95% CI 1.5 to 3.8). Correlation between eGFR and serum creatinine was 0.9; the risk for ESRD with a 1-SD difference of each was identical (HR 1.21). Bivariate analysis demonstrated a 41-fold increase in ESRD risk in those with an eGFR <60 ml/min per 1.73 m² and ≥2+ proteinuria (95% CI 15.2 to 71.1). There was no association between hematocrit and ESRD. Other baseline measures that independently predicted ESRD included age, cigarette smoking, BP, low HDL cholesterol, and fasting glucose. Among middle-aged men who were at high risk for cardiovascular disease but had no clinical evidence of cardiovascular disease or significant kidney disease, dipstick proteinuria and an eGFR <60 ml/min per 1.73 m² were strong predictors of long-term development of ESRD. It remains unknown whether intervention for proteinuria or early identification of those with chronic kidney disease reduces the risk for ESRD.
Circulation | 1993
Philip R. Liebson; Greg Grandits; Ronald J. Prineas; Sinda Dianzumba; John M. Flack; Jeffrey A. Cutler; Richard H. Grimm; Jeremiah Stamler
BACKGROUND Echocardiography provides a noninvasive means of assessing left ventricular (LV) structure and evidence of LV wall remodeling in hypertensive persons. The relation of demographic, biological, and other factors with LV structure can be assessed. METHODS AND RESULTS LV structure was assessed by M-mode echocardiograms for 511 men and 333 women with mild hypertension (average blood pressure, 140/91 mm Hg). Measurements of LV wall thicknesses and internal dimensions were made, and estimates of LV mass indexes and other derivations of structure were calculated. LV hypertrophy criteria were based on previously reported echocardiographic population studies of normal subjects. These measures were compared by age, sex, race, body mass index, systolic blood pressure, antihypertensive drug use, physical activity, alcohol intake, cigarette smoking, and urinary sodium excretion. Despite virtual absence of ECG-determined LV hypertrophy, 13% of men and 20% of women had echocardiographically determined LV hypertrophy indexed by body surface area (g/m²), and 24% of men and 45% of women had LV hypertrophy indexed by height (g/m). Black participants had slightly higher mean levels of wall thickness than nonblack participants but similar LV mass. Systolic blood pressure and urinary sodium excretion were significantly and independently associated with LV mass index and LV hypertrophy using both g/m² and g/m. Body mass index was significantly related to LV mass index and LV hypertrophy using g/m. Smoking was significantly associated with LV mass index as a continuous measure but not with the dichotomous LV hypertrophy classification. CONCLUSIONS This study of a large population of men and women with mild primary hypertension, largely without ECG evidence of LV hypertrophy, showed a substantial percentage of participants with echocardiographically determined LV hypertrophy. LV mass indexes correlated positively with systolic blood pressure, body mass index, urinary sodium excretion, and smoking.
Circulation | 1996
Jeremiah Stamler; Arlene W. Caggiula; Greg Grandits; Marcus O. Kjelsberg; Jeffrey A. Cutler
BACKGROUND Elevated blood pressure remains a widespread major impediment to health. Obesity and specific dietary factors such as high salt and alcohol intake and low potassium intake adversely affect blood pressure. It is a reasonable hypothesis that additional dietary constituents, particularly macronutrients, may also influence blood pressure. METHODS AND RESULTS Participants were 11,342 middle-aged men from the Multiple Risk Factor Intervention Trial (MRFIT). Data from repeat 24-hour dietary recalls (four to five per person) and blood pressure measurements at six annual visits were used to assess relationships, singly and in combination, of dietary macronutrients to blood pressure, adjusted for multiple possible confounders (demographic, dietary, and biomedical). Multiple linear regression was used to assess diet-blood pressure relations in two MRFIT treatment groups (special intervention and usual care), with adjustment for confounders, pooling of coefficients from the two groups (weighted by inverse of variance), and correction of coefficients for regression-dilution bias. In multivariate regression models, dietary cholesterol (milligrams per 1000 kilocalories), saturated fatty acids (percent of kilocalories), and starch (percent of kilocalories) were positively related to blood pressure; protein and the ratio of dietary polyunsaturated to saturated fatty acids were inversely related to blood pressure. These macronutrient-blood pressure findings were obtained in analyses that controlled for body mass, dietary sodium and ratio of sodium to potassium, and alcohol intake, each positively related to blood pressure, and intake of potassium and caffeine, both inversely related to blood pressure. CONCLUSIONS These data support the concept that multiple dietary factors influence blood pressure; hence, broad improvements in nutrition can be important in preventing and controlling high-normal and high blood pressure.
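The pooling step described above (coefficients estimated separately in the special-intervention and usual-care groups, then combined with weights equal to the inverse of their variances) is simple enough to show directly. A minimal sketch, using hypothetical coefficient values rather than the study's estimates:

```python
# Hedged sketch: inverse-variance pooling of a diet-blood pressure regression
# coefficient estimated separately in two treatment groups. Values are made up.
import numpy as np

def pool_inverse_variance(betas, ses):
    """Pool coefficients weighted by the inverse of their squared standard errors."""
    betas = np.asarray(betas, dtype=float)
    weights = 1.0 / np.asarray(ses, dtype=float) ** 2   # inverse-variance weights
    pooled_beta = np.sum(weights * betas) / np.sum(weights)
    pooled_se = np.sqrt(1.0 / np.sum(weights))
    return pooled_beta, pooled_se

# Hypothetical coefficients (mm Hg per mg cholesterol/1000 kcal) from the
# special-intervention and usual-care groups:
beta, se = pool_inverse_variance(betas=[0.012, 0.018], ses=[0.005, 0.006])
print(f"pooled coefficient = {beta:.4f} mm Hg, SE = {se:.4f}")
```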
Journal of Trauma-injury Infection and Critical Care | 2011
David R. Tribble; Nicholas G. Conger; Susan Fraser; Todd Gleeson; Ken Wilkins; Tanya Antonille; Amy C. Weintrob; Anuradha Ganesan; Lakisha J. Gaskins; Ping Li; Greg Grandits; Michael L. Landrum; Duane R. Hospenthal; Eugene V. Millar; Lorne H. Blackbourne; James R. Dunne; David Craft; Katrin Mende; Glenn W. Wortmann; Rachel K. Herlihy; Jay R. McDonald; Clinton K. Murray
Infections have long been known to complicate care in patients following traumatic injury, frequently leading to excess morbidity and mortality.1, 2 In no setting is this better recognized than the challenging environment of combat casualty care. During the current military conflicts in Iraq and Afghanistan, Operations Iraqi and Enduring Freedom (OIF/OEF), major advances resulting in increased survival among wounded personnel have been observed. These include enhanced training of medics, forward deployment of surgical assets, rapid medical evacuation, and improvements in body armor.3–5 These advances in survival are coupled with major challenges in care due to the extensive nature of the injuries, profound bone and soft tissue disruption, and extensive wound contamination.6, 7 In addition, the rapid transit of these patients through multiple echelons of medical care poses significant obstacles to infection control in an era of increasing risk from hospital-associated multidrug-resistant organisms (MDROs).8, 9 The U.S. Department of Defense (DoD) has implemented a range of measures to improve combat casualty care and mitigate the risk of infectious complications. A Joint Theater Trauma System and Joint Theater Trauma Registry (JTTR) have been developed to benchmark metrics and to provide a timely assessment of performance improvement interventions.5, 10, 11 Efforts to prevent infection include the development of guidelines for the prevention of infection related to combat injuries through comprehensive review of current evidence and consensus review by military and civilian experts in trauma, infectious disease, infection control, preventive medicine, and surgical specialties.12 In addition, standardized infection control measures across echelons of care, accompanied by enhanced MDRO surveillance and serial evaluation, have also been implemented.13, 14 Despite the growing literature describing infectious complications of combat-related trauma, there is still a lack of prospectively collected, standardized infection data that include specific therapy, microbiological findings, and clinical outcomes across treatment facilities. This report describes the initial data and current status of an ongoing 5-year prospective observational cohort study of infectious complications associated with traumatic injury sustained during deployment, the DoD-Department of Veterans Affairs (VA) Trauma Infectious Disease Outcomes Study (TIDOS).
Journal of Acquired Immune Deficiency Syndromes | 2010
Nancy F. Crum-Cianflone; Greg Grandits; Sara Echols; Anuradha Ganesan; Michael L. Landrum; Amy C. Weintrob; Robert V Barthel; Brian K. Agan
Background: Declining rates of hospitalizations occurred shortly after the availability of highly active antiretroviral therapy (HAART). However, trends in the late HAART era are less well defined, and data on the impact of CD4 counts and HAART use on hospitalizations are needed. Methods: We evaluated hospitalization rates from 1999 to 2007 among HIV-infected persons enrolled in a large US military cohort. Poisson regression was used to compare hospitalization rates per year and to examine factors associated with hospitalization. Results: Of the 2429 participants, 822 (34%) were hospitalized at least once, with 1770 separate hospital admissions. The rate of hospitalization (137 per 1000 person-years) was constant over the study period [relative rate (RR) 1.00 per year change, 95% confidence interval: 0.98 to 1.02]. Hospitalization rates due to skin infections (RR: 1.50, P = 0.02), methicillin-resistant Staphylococcus aureus (RR: 3.19, P = 0.03), liver disease (RR: 1.71, P = 0.04), and surgery (RR: 1.17, P = 0.04) significantly increased over time, whereas rates due to psychological causes (RR: 0.60, P < 0.01) and trauma (RR: 0.54, P < 0.01) decreased. In the multivariate model, higher nadir CD4 (RR: 0.92 per 50 cells, P < 0.01) and higher proximal CD4 counts (RR of 0.71 for 350-499 vs. <350 cells/mm³ and RR of 0.67 for ≥500 vs. <350 cells/mm³, both P < 0.01) were associated with lower risk of hospitalization. Risk of hospitalization was constant for proximal CD4 levels above 350 (RR: 0.94, P = 0.51, CD4 ≥500 vs. 350-499). HAART was associated with a reduced risk of hospitalization among those with a CD4 count <350 (RR: 0.72, P = 0.02) but had smaller and nonsignificant estimated effects at higher CD4 levels (RR: 0.81, P = 0.33 and RR: 1.06, P = 0.71 for CD4 350-499 and ≥500, respectively). Conclusions: Hospitalizations continue to occur at high rates among HIV-infected persons, with increasing rates for skin infections, methicillin-resistant Staphylococcus aureus, liver disease, and surgeries. Factors associated with a reduced risk of hospitalization include CD4 counts >350 cells per cubic millimeter and HAART use among patients with a CD4 count <350 cells per cubic millimeter.
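The yearly relative rates quoted above come from Poisson regression of admission counts, with person-years of follow-up as the exposure. A minimal sketch of such a model in Python with statsmodels, using invented variable names and toy data rather than the cohort's:

```python
# Hedged sketch: Poisson regression of hospitalization counts with log person-years
# as an offset. Variable names and data are illustrative, not from the cohort.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "admissions":       [0, 2, 1, 0, 3, 1, 0, 2],     # hospital admissions per person
    "person_years":     [8.5, 6.2, 7.9, 5.0, 8.8, 4.1, 7.0, 6.5],
    "years_since_1999": [0, 2, 3, 4, 5, 6, 7, 8],     # calendar-time trend term
    "cd4_ge350":        [1, 0, 1, 1, 0, 0, 1, 0],     # proximal CD4 >= 350 cells/mm^3
    "on_haart":         [1, 1, 0, 1, 1, 0, 1, 1],
})

X = sm.add_constant(df[["years_since_1999", "cd4_ge350", "on_haart"]])
model = sm.GLM(df["admissions"], X,
               family=sm.families.Poisson(),
               offset=np.log(df["person_years"]))     # models rates per person-year
result = model.fit()
print(np.exp(result.params))   # rate ratios, e.g. relative rate per calendar year
```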
Journal of the American College of Cardiology | 1996
Peter M. Okin; Greg Grandits; Pentti M. Rautaharju; Ronald J. Prineas; Jerome D. Cohen; Richard S. Crow; Paul Kligfield
OBJECTIVES We sought to assess the effect of heart rate adjustment of ST segment depression on risk stratification for the prediction of death from coronary artery disease. BACKGROUND Standard analysis of the ST segment response to exercise, based on a fixed magnitude of horizontal or downsloping ST segment depression, has demonstrated only limited diagnostic sensitivity for the detection of coronary artery disease and variable test performance in predicting coronary artery disease mortality. Heart rate adjustment of the magnitude of ST segment depression has been proposed as an alternative approach to increase the diagnostic and prognostic accuracy of the exercise electrocardiogram (ECG). METHODS Exercise ECGs were performed in 5,940 men from the Usual Care Group of the Multiple Risk Factor Intervention Trial at entry into the study. An abnormal ST segment response to exercise was defined according to standard criteria as ≥100 μV of additional horizontal or downsloping ST segment depression at peak exercise. The ST segment/heart rate index was calculated by dividing the change in ST segment depression from rest to peak exercise by the exercise-induced change in heart rate. An abnormal ST segment/heart rate index was defined as >1.60 μV/beat per minute. RESULTS After a mean follow-up of 7 years, there were 109 coronary artery disease deaths. Using a Cox proportional hazards model, a positive exercise ECG by standard criteria was not predictive of coronary mortality (age-adjusted relative risk [RR] 1.5, 95% confidence interval [CI] 0.6 to 3.6, p = 0.39). In contrast, an abnormal ST segment/heart rate index significantly increased the risk of death from coronary artery disease (age-adjusted RR 4.1, 95% CI 2.7 to 6.0, p < 0.0001). Excess risk of death was confined to the highest quintile of ST segment/heart rate index values, and within this quintile, risk was directly related to the magnitude of test abnormality. After multivariate adjustment for age, diastolic blood pressure, serum cholesterol, and cigarettes smoked per day, the ST segment/heart rate index remained a significant independent predictor of coronary death (RR 3.6, 95% CI 2.4 to 5.4, p < 0.001). CONCLUSIONS Simple heart rate adjustment of the magnitude of ST segment depression improves the prediction of death from coronary artery disease in relatively high risk, asymptomatic men. These findings strongly support the use of heart rate-adjusted indexes of ST segment depression to improve the predictive value of the exercise ECG.
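The ST segment/heart rate index itself is a simple ratio: the exercise-induced change in ST segment depression divided by the exercise-induced change in heart rate, judged abnormal above 1.60 μV/beat per minute. A small sketch with hypothetical readings (the function name and example values are illustrative, not from the study):

```python
# Hedged sketch of the ST segment/heart rate index described above: change in ST
# depression from rest to peak exercise divided by the change in heart rate.
# The 1.60 uV/beat-per-minute cutoff follows the abstract; the readings are made up.

def st_hr_index(st_rest_uV: float, st_peak_uV: float,
                hr_rest_bpm: float, hr_peak_bpm: float) -> float:
    """ST/HR index in microvolts per beat per minute."""
    delta_st = st_peak_uV - st_rest_uV       # exercise-induced ST depression (uV)
    delta_hr = hr_peak_bpm - hr_rest_bpm     # exercise-induced heart rate change (bpm)
    return delta_st / delta_hr

index = st_hr_index(st_rest_uV=0.0, st_peak_uV=120.0,
                    hr_rest_bpm=72.0, hr_peak_bpm=135.0)
abnormal = index > 1.60                      # abnormal per the criterion above
print(f"ST/HR index = {index:.2f} uV per beat/min, abnormal = {abnormal}")
```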
Journal of Acquired Immune Deficiency Syndromes | 2009
Amy C. Weintrob; Greg Grandits; Brian K. Agan; Anuradha Ganesan; Michael L. Landrum; Nancy F. Crum-Cianflone; Erica Johnson; Claudia E. Ordóñez; Glenn W. Wortmann; Vincent C. Marconi
Objective: Studies comparing virologic response to highly active antiretroviral therapy (HAART) between African Americans (AA) and European Americans (EA) have been confounded by differences in duration of HIV infection and access to health care. We evaluated virologic response to HAART between ethnicities in a large cohort with fewer confounders. Methods: The odds of attaining viral suppression at 6 and 12 months post-HAART were determined by multivariate logistic regression for HIV-infected AA and EA prospectively followed in a large US military cohort. Time-to-event methods were used to compare maintenance of suppression. Results: A total of 1363 subjects (51% AA, 92% men) with viral load results available 6 months after HAART initiation were included. There was no difference between ethnicities in time from seroconversion to HIV diagnosis or HAART initiation, or in HAART regimens. Adjusted for multiple demographic and HIV-related factors, AA had significantly lower odds of obtaining undetectable viral loads after 6 months (odds ratio 0.6, 95% confidence interval 0.4-0.8, P < 0.001) and 12 months (odds ratio 0.6, 95% confidence interval 0.4-0.8, P = 0.002) of HAART. Once undetectable, there was no difference in time to virologic failure between AA and EA. Conclusions: Despite similar durations of HIV infection and equal access to health care, AA were significantly less likely to achieve viral suppression compared with EA.
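The adjusted odds ratios above come from multivariate logistic regression of viral suppression on ethnicity and covariates. A rough sketch of that kind of model in Python with statsmodels, on simulated data with invented variable names (not the cohort data or the authors' covariate set):

```python
# Hedged sketch: logistic regression for the odds of an undetectable viral load
# 6 months after HAART initiation. Data are simulated, not from the cohort.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
african_american = rng.binomial(1, 0.5, n)        # AA vs. EA indicator
age = rng.normal(35, 8, n)
cd4_at_haart = rng.normal(350, 120, n)
# Hypothetical true model: lower odds of suppression for AA, higher with CD4.
logit_p = 0.6 - 0.5 * african_american + 0.002 * (cd4_at_haart - 350)
suppressed_6mo = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

X = sm.add_constant(pd.DataFrame({
    "african_american": african_american,
    "age": age,
    "cd4_at_haart": cd4_at_haart,
}))
result = sm.Logit(suppressed_6mo, X).fit(disp=False)
print(np.exp(result.params))       # adjusted odds ratios
print(np.exp(result.conf_int()))   # 95% confidence intervals
```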
Infection Control and Hospital Epidemiology | 2010
Timothy J. Whitman; Rachel K. Herlihy; Carey D. Schlett; Patrick R. Murray; Greg Grandits; Anuradha Ganesan; Maya Brown; James D. Mancuso; William B. Adams; David R. Tribble
BACKGROUND Community-associated methicillin-resistant Staphylococcus aureus (CA-MRSA) causes skin and soft-tissue infection (SSTI) in military recruits. OBJECTIVE To evaluate the effectiveness of 2% chlorhexidine gluconate (CHG)-impregnated cloths in reducing rates of SSTI and S. aureus colonization among military recruits. DESIGN A cluster-randomized (by platoon), double-blind, controlled effectiveness trial. SETTING Marine Officer Candidate School, Quantico, Virginia, 2007. PARTICIPANTS Military recruits. INTERVENTION CHG-impregnated or control (Comfort Bath; Sage) cloths applied over the entire body thrice weekly. MEASUREMENTS Recruits were monitored daily for SSTI. Baseline and serial nasal and/or axillary swabs were collected to assess S. aureus colonization. RESULTS Of 1,562 subjects enrolled, 781 (from 23 platoons) underwent CHG-impregnated cloth application and 781 (from 21 platoons) underwent control cloth application. The rate of compliance (defined as application of 50% or more of wipes) at 2 weeks was similar (CHG group, 63%; control group, 67%) and decreased over the 6-week period. The mean 6-week SSTI rate in the CHG-impregnated cloth group was 0.094, compared with 0.071 in the control group (analysis of variance model rate difference, 0.025 ± 0.016; P = .14). At baseline, 43% of subjects were colonized with methicillin-susceptible S. aureus (MSSA), and 2.1% were colonized with MRSA. The mean incidence of colonization with MSSA was 50% and 61% (P = .026), and with MRSA 2.6% and 6.0% (P = .034), for the CHG-impregnated and control cloth groups, respectively. CONCLUSIONS CHG-impregnated cloths applied thrice weekly did not reduce rates of SSTI among recruits. S. aureus colonization rates increased in both groups but to a lesser extent in those assigned to the CHG-impregnated cloth intervention. Antecedent S. aureus colonization was not a risk factor for SSTI. Additional studies are needed to identify effective measures for preventing SSTI among military recruits. CLINICAL TRIALS REGISTRATION ClinicalTrials.gov identifier: NCT00475930.