Jenny E. Kootstra-Ros
University Medical Center Groningen
Publications
Featured research published by Jenny E. Kootstra-Ros.
Hypertension | 2013
Michel M. Joosten; Ron T. Gansevoort; Kenneth J. Mukamal; Jenny E. Kootstra-Ros; Edith J. M. Feskens; Johanna M. Geleijnse; Gerjan Navis; Stephan J. L. Bakker
Observational studies on dietary or circulating magnesium and risk of hypertension have reported weak-to-modest inverse associations, but have lacked measures of actual dietary uptake. Urinary magnesium excretion, an indicator of intestinal magnesium absorption, may provide better insight into this association. We examined 5511 participants aged 28 to 75 years free of hypertension in the Prevention of Renal and Vascular End-Stage Disease (PREVEND) study, a prospective population-based cohort study. Circulating magnesium was measured in plasma and urinary magnesium in two 24-hour urine collections, both at baseline. Incident hypertension was defined as blood pressure ≥140 mm Hg systolic or ≥90 mm Hg diastolic, or initiation of antihypertensive medication. During a median follow-up of 7.6 years (interquartile range, 5.0–9.3 years), 1172 participants developed hypertension. The median urinary magnesium excretion was 3.8 mmol/24 hour (interquartile range, 2.9–4.8 mmol/24 hour). Urinary magnesium excretion was associated with risk of hypertension in an inverse log-linear fashion, and this association remained after adjustment for age, sex, body mass index, smoking status, alcohol intake, parental history of hypertension, and urinary excretion of sodium, potassium, and calcium. Each 1-unit increment in ln-transformed urinary magnesium excretion was associated with a 21% lower risk of hypertension after multivariable adjustment (adjusted hazard ratio, 0.79; 95% confidence interval, 0.71–0.88). No associations were observed between circulating magnesium and risk of hypertension. In conclusion, in this cohort of men and women, urinary magnesium excretion was inversely associated with risk of hypertension across the entire range of habitual dietary intake.
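To make the reported effect size concrete: in a log-linear Cox model, the hazard ratio per 1-unit increment in the ln-transformed exposure yields the percent risk change directly, and a k-fold change on the raw scale corresponds to HR raised to the power ln(k). A minimal illustrative sketch in plain Python (function names are ours, not from the study):

```python
import math

def pct_risk_change(hr_per_ln_unit: float) -> float:
    """Percent change in hazard per 1-unit increase in the
    ln-transformed exposure (negative = lower risk)."""
    return (hr_per_ln_unit - 1.0) * 100.0

def hr_for_fold_change(hr_per_ln_unit: float, fold: float) -> float:
    """HR implied by a k-fold increase in the raw exposure:
    a k-fold increase adds ln(k) on the log scale, so the
    hazard scales by HR ** ln(k)."""
    return hr_per_ln_unit ** math.log(fold)

# Reported estimate: adjusted HR 0.79 per 1-unit increment in ln(urinary Mg)
print(round(pct_risk_change(0.79), 1))          # the "21% lower risk"
print(round(hr_for_fold_change(0.79, 2.0), 3))  # HR implied by a doubling, ~0.85
```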
Atherosclerosis | 2015
Setor K. Kunutsor; Stephan J. L. Bakker; Jenny E. Kootstra-Ros; Hans Blokzijl; Ron T. Gansevoort; Robin P. F. Dullaart
BACKGROUND Alanine aminotransferase (ALT) and aspartate aminotransferase (AST) have been linked with an increased risk of type 2 diabetes, but their relationships with cardiovascular disease (CVD) are uncertain. We aimed to assess the associations of ALT and AST with CVD risk and determine their potential utility for CVD risk prediction. METHODS ALT and AST measurements were made at baseline in the PREVEND prospective cohort involving 6899 participants aged 28-75 years without pre-existing CVD. RESULTS During 10.5 years of follow-up, 729 CVD events were recorded. Serum aminotransferases were strongly correlated with each other and each weakly correlated with several cardiovascular risk markers. ALT and AST were each approximately log-linearly associated with CVD risk. In analyses adjusted for conventional risk factors, the hazard ratios (95% CIs) for CVD per 1 standard deviation increase in ln ALT and ln AST were 0.87 (0.79-0.94; P = 0.001) and 0.91 (0.84-0.98; P = 0.017), respectively. The associations remained consistent after additional adjustment for several potential confounders including alcohol consumption, fasting glucose, and C-reactive protein, with corresponding hazard ratios of 0.88 (0.80-0.96; P = 0.003) and 0.92 (0.84-0.99; P = 0.029). The inverse associations persisted within normal ranges of the aminotransferases. Adding ALT or AST to a CVD risk prediction model containing established risk factors did not improve the C-index or net reclassification. CONCLUSIONS Available data suggest the liver aminotransferases are each inversely, independently, and approximately log-linearly associated with CVD risk. Nonetheless, they provide no significant improvement in CVD risk assessment beyond conventional CVD risk factors.
European Journal of Heart Failure | 2017
IJsbrand T. Klip; Adriaan A. Voors; Dorine W. Swinkels; Stephan J. L. Bakker; Jenny E. Kootstra-Ros; Carolyn S.P. Lam; Pim van der Harst; Dirk J. van Veldhuisen; Peter van der Meer
Heart failure (HF) is a common manifestation in patients with primary and secondary causes of iron overload, whereas in patients with established HF, iron deficiency impairs outcome. Whether iron stores, either depleted or in overload, amplify the risk for new‐onset HF among healthy individuals is unknown. The present study aimed to assess whether markers of iron status or the iron‐regulatory hormone hepcidin are associated with new‐onset HF or cardiovascular (CV) events in the general population.
Atherosclerosis | 2015
Ineke J. Riphagen; S. J. J. Logtenberg; Klaas H. Groenier; Kornelis J. J. van Hateren; Gijs W. D. Landman; Joachim Struck; Gerjan Navis; Jenny E. Kootstra-Ros; Ido P. Kema; Henk J. G. Bilo; Nanne Kleefstra; Stephan J. L. Bakker
BACKGROUND AND AIMS Hyponatremia has been associated with an increased mortality risk in the general population. Diabetes is a condition predisposing for elevated levels of arginine vasopressin (AVP) and heart failure, both common causes of hyponatremia. These factors, however, are also associated with an increased mortality risk. We aimed to investigate whether serum sodium is associated with cardiovascular and all-cause mortality in type 2 diabetes and whether these associations could be explained by copeptin, a surrogate for AVP, or NT-proBNP, a marker for heart failure. METHODS Patients with type 2 diabetes participating in the observational ZODIAC study were included. Cox regression analyses were used to investigate the association of serum sodium with mortality. RESULTS We included 1068 patients (age 67 ± 12 years, 45% male, serum sodium 142 ± 3 mmol/L). After 15 years of follow-up, 519 patients (49%) died, with 225 cardiovascular deaths (21%). In univariable analyses, serum sodium, copeptin, and NT-proBNP were all significantly associated with cardiovascular and all-cause mortality. These associations remained significant after combination of these markers in a multivariable model. Serum sodium and NT-proBNP remained significantly associated with mortality after further adjustment for potential confounders, whereas copeptin lost significance after adjustment for serum creatinine (SCr) and urinary albumin-to-creatinine ratio (ACR). CONCLUSION Low serum sodium was associated with an increased risk of cardiovascular and all-cause mortality in type 2 diabetes. Moreover, these associations were not explained by copeptin and NT-proBNP. Whether low serum sodium itself leads to poor outcome or is a marker for (unidentified) co-morbidity severity or use of specific medications remains to be elucidated.
Transplant International | 2016
Michele F. Eisenga; Isidor Minovic; Stefan P. Berger; Jenny E. Kootstra-Ros; Else van den Berg; Ineke J. Riphagen; Gerjan Navis; Peter van der Meer; Stephan J. L. Bakker; Carlo A. J. M. Gaillard
Anemia, iron deficiency anemia (IDA), and iron deficiency (ID) are highly prevalent in renal transplant recipients (RTR). Anemia is associated with poor outcome, but the role of ID is unknown. Therefore, we aimed to investigate the association of ID, irrespective of anemia, with all-cause mortality in RTR. Cox regression analyses were used to investigate prospective associations. In 700 RTR, prevalences of anemia, IDA, and ID were 34%, 13%, and 30%, respectively. During follow-up for 3.1 (2.7-3.9) years, 81 (12%) RTR died. In univariable analysis, anemia [HR, 1.72 (95%CI: 1.11-2.66), P = 0.02], IDA [2.44 (1.48-4.01), P < 0.001], and ID [2.04 (1.31-3.16), P = 0.001] were all associated with all-cause mortality. In multivariable analysis, the association of anemia with mortality became weaker after adjustment for ID [1.52 (0.97-2.39), P = 0.07] and disappeared after adjustment for proteinuria and eGFR [1.09 (0.67-1.78), P = 0.73]. The association of IDA with mortality attenuated after adjustment for potential confounders. In contrast, the association of ID with mortality remained independent of potential confounders, including anemia [1.77 (1.13-2.78), P = 0.01]. In conclusion, ID is highly prevalent among RTR and is associated with an increased risk of mortality, independent of anemia. As ID is a modifiable factor, correction of ID could be a target to improve survival.
Transplantation | 2015
Petronella E. Deetman; M. Yusof Said; Daan Kromhout; Robin P. F. Dullaart; Jenny E. Kootstra-Ros; Jan Stephan Sanders; Marc A. Seelen; Rijk O. B. Gans; Gerjan Navis; Michel M. Joosten; Stephan J. L. Bakker
Background Little is known about optimal protein intake after transplantation. The aim of this study was to prospectively investigate associations of urinary urea excretion, a marker for protein intake, with graft failure and mortality in renal transplant recipients (RTR) and potential effect modification by body mass index (BMI) and estimated glomerular filtration rate (eGFR). Methods Urinary urea excretion was measured in repeated 24-hr urine collections between 6 and 18 months after transplantation. Results In total, 940 RTR were included. During 4.4 (2.3–7.8) years of follow-up for graft failure and 4.8 (2.5–8.3) years for all-cause mortality, 78 RTR developed graft failure and 158 RTR died. Urinary urea excretion was not associated with graft failure in the overall population, but was inversely associated with graft failure in RTR with BMI less than 25 kg/m2 (hazard ratio [HR], 0.64 [0.28–1.50] and 0.27 [0.09–0.83] for the second and third tertiles, respectively, P < 0.001), and in RTR with eGFR of 45 mL per min per 1.73 m2 or higher (HR, 0.34 [0.15–0.79], P = 0.015 and HR, 0.31 [0.11–0.86], P = 0.025 for the second and third tertiles, respectively), both independent of potential confounders. Compared to the first tertile, RTR in the second and third tertiles of urinary urea excretion were at a lower risk of all-cause mortality (HR, 0.47 [0.32–0.69]; P < 0.001 and HR, 0.42 [0.26–0.68]; P < 0.001, respectively), independent of potential confounders. Body mass index and eGFR did not influence this association. Conclusion Urinary urea excretion, a marker for protein intake, was inversely related to graft failure in RTR with BMI less than 25 kg/m2 and in RTR with an eGFR of 45 mL per min per 1.73 m2 or higher. In addition, urinary urea excretion was inversely related to mortality.
European Journal of Clinical Investigation | 2015
Petronella E. Deetman; Alaa Alkhalaf; Gijs W. D. Landman; Klaas H. Groenier; Jenny E. Kootstra-Ros; Gerjan Navis; Henk J. G. Bilo; Nanne Kleefstra; Stephan J. L. Bakker
Combined data suggest a bimodal association of alanine aminotransferase (ALT) with mortality in the general population. Little is known about the association of ALT with mortality in patients with type 2 diabetes. We therefore investigated the association of ALT with all‐cause, cardiovascular and noncardiovascular mortality in patients with type 2 diabetes.
Clinical Biochemistry | 2013
E.M. Spithoven; Stephan J. L. Bakker; Jenny E. Kootstra-Ros; P. E. De Jong; Ron T. Gansevoort
BACKGROUND As yet little is known about the effect of delayed separation of whole blood stored at room temperature on the stability of the kidney function markers creatinine and cystatin C. METHODS We used plasma samples of 45 patients with a wide range of creatinine and cystatin C concentrations. Samples were sent by post as whole blood, and creatinine and cystatin C concentrations (measured by enzymatic assay and PETIA, respectively) were compared between plasma separated shortly after blood withdrawal and plasma obtained after delayed separation at 24, 48 and 72 h. Intra- and inter-assay variability was assessed, and the total change limit was calculated to assess analyte stability. RESULTS The total change limit was 3.3% for creatinine and 3.9% for cystatin C. In whole blood, creatinine and cystatin C remained stable for up to 48 h. Delayed separation of whole blood did not induce more variability in measured concentrations of either analyte. Glomerular filtration rate estimated with the CKD-EPI equations showed less than 3 mL/min/1.73 m² difference when using creatinine or cystatin C concentrations measured in plasma separated up to 48 h after blood withdrawal compared to plasma separated shortly after blood withdrawal. The new CKD-EPI equation that uses both creatinine and cystatin C to estimate GFR showed less than 3 mL/min/1.73 m² difference even at 72 h. CONCLUSIONS Creatinine and cystatin C remain stable in whole blood stored at room temperature for up to 48 h before separation, and changes in these analytes during this time period do not affect variability or eGFR.
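For context on the eGFR comparison above, the single-marker CKD-EPI creatinine equation can be sketched as below. This is the 2009 creatinine-based version with the race coefficient omitted for brevity; the abstract does not list coefficients, so treat this as an illustrative reconstruction rather than the study's exact implementation:

```python
def ckd_epi_creatinine_2009(scr_mg_dl: float, age: int, female: bool) -> float:
    """eGFR (mL/min/1.73 m^2) from the 2009 CKD-EPI creatinine
    equation (race coefficient omitted in this sketch)."""
    kappa = 0.7 if female else 0.9      # sex-specific creatinine cutpoint
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = (141
            * min(ratio, 1.0) ** alpha      # applies below the cutpoint
            * max(ratio, 1.0) ** -1.209     # applies above the cutpoint
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    return egfr

# e.g. a 50-year-old male with Scr 1.0 mg/dL -> roughly 87 mL/min/1.73 m^2
print(round(ckd_epi_creatinine_2009(1.0, 50, female=False), 1))
```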
PLOS ONE | 2017
Lyanne M. Kieneker; Michele F. Eisenga; Michel M. Joosten; Rudolf A. de Boer; Ron T. Gansevoort; Jenny E. Kootstra-Ros; Gerjan Navis; Stephan J. L. Bakker
Objective Both hypokalemia and hyperkalemia are associated with disease progression in patients with chronic kidney disease (CKD). It is unclear whether similar associations are present in the general population. Our aim was to examine the association of plasma potassium with risk of developing CKD and the role of diuretics in this association in a population-based cohort. Research design and methods We studied 5,130 subjects free of CKD at baseline of the Prevention of Renal and Vascular End-Stage Disease (PREVEND) study, a prospective, population-based cohort of Dutch men and women aged 28–75 years. Hypokalemia was defined as plasma potassium <3.5 mmol/L, and hyperkalemia as plasma potassium ≥5.0 mmol/L. Risk of CKD was defined as de novo development of eGFR <60 ml/min/1.73m2 and/or albuminuria >30 mg/24h. Results Mean baseline plasma potassium was 4.4±0.3 mmol/L. The prevalences of hypokalemia and hyperkalemia were 0.5% and 3.8%, respectively; 3.0% of the subjects used diuretics. During a median follow-up of 10.3 years (interquartile range: 6.3–11.4 years), 753 subjects developed CKD. The potassium-CKD association was modified by diuretic use (P for interaction = 0.02). Hypokalemia both without (HR, 7.74; 95% CI, 3.43–17.48) and with diuretic use (HR, 4.32; 95% CI, 1.77–10.51) was associated with an increased CKD risk as compared to plasma potassium 4.0–4.4 mmol/L without diuretic use. Plasma potassium concentrations ≥3.5 mmol/L were associated with an increased CKD risk among subjects using diuretics (P for trend = 0.01) but not among subjects not using diuretics (P for trend = 0.74). Conclusion In this population-based cohort, hypokalemia was associated with an increased CKD risk, regardless of diuretic use. In the absence of hypokalemia, plasma potassium was not associated with an increased CKD risk, except among subjects using diuretics.
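The categorical definitions used in the potassium study translate directly into simple cutoff checks; a minimal sketch (the function and label names are ours, not the study's):

```python
def potassium_category(k_mmol_l: float) -> str:
    """Classify plasma potassium using the study's cutoffs:
    <3.5 mmol/L hypokalemia, >=5.0 mmol/L hyperkalemia."""
    if k_mmol_l < 3.5:
        return "hypokalemia"
    if k_mmol_l >= 5.0:
        return "hyperkalemia"
    return "normokalemia"

def incident_ckd(egfr: float, albuminuria_mg_24h: float) -> bool:
    """De novo CKD per the study's definition:
    eGFR < 60 mL/min/1.73 m^2 and/or albuminuria > 30 mg/24 h."""
    return egfr < 60 or albuminuria_mg_24h > 30

print(potassium_category(3.2))   # hypokalemia
print(incident_ckd(75, 40))      # True (albuminuria criterion)
```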
Scientific Reports | 2017
Isidor Minovic; Michele F. Eisenga; Ineke J. Riphagen; Else van den Berg; Jenny E. Kootstra-Ros; Anne-Roos S. Frenay; Harry van Goor; Gerald Rimbach; Tuba Esatbeyoglu; Andrew P. Levy; Carlo A. J. M. Gaillard; Johanna M. Geleijnse; Manfred Eggersdorfer; Gerjan Navis; Ido P. Kema; Stephan J. L. Bakker
Haptoglobin (Hp) is an acute phase protein that has recently been linked to components of the metabolic syndrome (MetS). We aimed to evaluate Hp as a marker of MetS, and to assess its association with long-term outcome in renal transplant recipients (RTR). We measured plasma Hp in a prospective cohort of 699 stable RTR and 149 healthy controls. Median plasma Hp concentration in RTR was 1.4 [interquartile range (IQR), 1.0–1.8] g/L, which was higher compared to 1.1 [0.9–1.4] g/L in controls (P < 0.001). Hp was independently associated with the MetS (β = 0.10; P = 0.005). During follow-up of 5.4 [4.8–6.1] years, 150 (21%) recipients died, of whom 60 (9%) due to cardiovascular causes, and 83 (12%) RTR developed graft failure. High (≥2.0 g/L) and low (≤0.9 g/L) plasma Hp were associated with increased risk of mortality (HRs, 2.3 [1.3–4.1] and 1.9 [1.0–3.5], respectively), predominantly cardiovascular. The association of high Hp lost significance upon adjustment for inflammation markers (HR 1.5 [0.8–2.7]), while low Hp was independently associated with mortality (HR 2.2 [1.2–4.0]). Hp was not associated with graft failure (P = 0.49). In conclusion, plasma Hp is independently associated with MetS in RTR. Importantly, high and low Hp are associated with increased mortality risk, independent of MetS.