Publication


Featured research published by Jean Ethier.


American Journal of Kidney Diseases | 1999

Left ventricular mass index increase in early renal disease: Impact of decline in hemoglobin

Adeera Levin; Christopher R. Thompson; Jean Ethier; Euan Carlisle; Sheldon W. Tobe; David C. Mendelssohn; Ellen Burgess; Kailash Jindal; Brendan J. Barrett; Joel Singer; Ognjenka Djurdjev

Cardiovascular disease occurs in patients with progressive renal disease both before and after the initiation of dialysis. Left ventricular hypertrophy (LVH) is an independent predictor of morbidity and mortality in dialysis populations and is common in the renal insufficiency population. LVH is associated with numerous modifiable risk factors, but little is known about LV growth (LVG) in mild-to-moderate renal insufficiency. This prospective multicenter Canadian cohort study identifies factors associated with LVG, measured using two-dimensional-targeted M-mode echocardiography. Eight centers enrolled 446 patients, 318 of whom had protocol-mandated clinical, laboratory, and echocardiographic measurements recorded. We report 246 patients with assessable echocardiograms at both baseline and 12 months with an overall prevalence of LVH of 36%. LV mass index (LVMI) increased significantly (>20% of baseline or >20 g/m2) from baseline to 12 months in 25% of the population. Other than baseline LVMI, no differences in baseline variables were noted between patients with and without LVG. However, there were significant differences in decline of Hgb level (-0.854 v -0.108 g/dL; P = 0.0001) and change in systolic blood pressure (+6.50 v -1.09 mm Hg; P = 0.03) between the groups with and without LVG. Multivariate analysis showed the independent contribution of decrease in Hgb level (odds ratio [OR], 1.32 for each 0.5-g/dL decrease; P = 0.004), increase in systolic blood pressure (OR, 1.11 for each 5-mm Hg increase; P = 0.01), and lower baseline LVMI (OR, 0.85 for each 10-g/m2; P = 0.011) in predicting LVG. Thus, after adjusting for baseline LVMI, Hgb level and systolic blood pressure remain independently important predictors of LVG. We defined the important modifiable risk factors. There remains a critical need to establish optimal therapeutic strategies and targets to improve clinical outcomes.


Nephrology Dialysis Transplantation | 2008

Vascular access use and outcomes: an international perspective from the dialysis outcomes and practice patterns study

Jean Ethier; David C. Mendelssohn; Stacey J. Elder; Takeshi Hasegawa; Tadao Akizawa; Takashi Akiba; Bernard Canaud; Ronald L. Pisoni

Background. A well-functioning vascular access (VA) is essential to efficient dialysis therapy. Guidelines have been implemented improving care, yet access use varies widely across countries and VA complications remain a problem. This study took advantage of the unique opportunity to utilize data from the Dialysis Outcomes and Practice Patterns Study (DOPPS) to examine international trends in VA use and trends in patient characteristics and practices associated with VA use from 1996 to 2007. DOPPS is a prospective, observational study of haemodialysis (HD) practices and patient outcomes at >300 HD units from 12 countries and has collected data thus far from >35 000 randomly selected patients. Methods. VA data were collected for each patient at study entry (1996–2007). Practice pattern data from the facility medical director, nurse manager and VA surgeon were also analysed. Results. Since 2005, a native arteriovenous fistula (AVF) was used by 67–91% of prevalent patients in Japan, Italy, Germany, France, Spain, the UK, Australia and New Zealand, and 50–59% in Belgium, Sweden and Canada. From 1996 to 2007, AVF use rose from 24% to 47% in the USA but declined in Italy, Germany and Spain. Moreover, graft use fell by 50% in the USA from 58% use in 1996 to 28% by 2007. Across three phases of data collection, patients consistently were less likely to use an AVF versus other VA types if female, of older age, having greater body mass index, diabetes, peripheral vascular disease or recurrent cellulitis/gangrene. In addition, countries with a greater prevalence of diabetes in HD patients had a significantly lower percentage of patients using an AVF. Despite poorer outcomes for central vein catheters, catheter use rose 1.5- to 3-fold among prevalent patients in many countries from 1996 to 2007, even among non-diabetic patients 18–70 years old. 
Furthermore, 58–73% of patients new to end-stage renal disease (ESRD) used a catheter for the initiation of HD in five countries despite 60–79% of patients having been seen by a nephrologist >4 months prior to ESRD. Patients were significantly (P < 0.05) less likely to start dialysis with a permanent VA if treated in a facility that (1) had a longer time from referral to access surgery evaluation or from evaluation to access creation and (2) had a longer time from access creation until first AVF cannulation. The median time from referral until access creation varied from 5–6 days in Italy, Japan and Germany to 40–43 days in the UK and Canada. Compared to patients using an AVF, patients with a catheter displayed significantly lower mean Kt/V levels. Conclusions. Most countries meet the contemporary National Kidney Foundation's Kidney Disease Outcomes Quality Initiative goal for AVF use; however, there is still wide variation in VA preference. Delays between access creation and cannulation must be reduced to improve the chances of a functioning permanent VA. The native arteriovenous fistula is the VA of choice, ensuring dialysis adequacy and better patient outcomes. A graft is, however, a better alternative than a catheter for patients in whom an attempted AVF failed or an AVF could not be created.


American Journal of Kidney Diseases | 2009

Facility Hemodialysis Vascular Access Use and Mortality in Countries Participating in DOPPS: An Instrumental Variable Analysis

Ronald L. Pisoni; Charlotte J. Arrington; Justin M. Albert; Jean Ethier; Naoki Kimata; Mahesh Krishnan; Hugh Rayner; Akira Saito; Jeffrey J. Sands; Rajiv Saran; Brenda W. Gillespie; Robert A. Wolfe; Friedrich K. Port

BACKGROUND Previously, the Dialysis Outcomes and Practice Patterns Study (DOPPS) has shown large international variations in vascular access practice. Greater mortality risks have been seen for hemodialysis (HD) patients dialyzing with a catheter or graft versus a native arteriovenous fistula (AVF). To further understand the relationship between vascular access practice and outcomes, we have applied practice-based analyses (using an instrumental variable approach) to decrease the treatment-by-indication bias of prior patient-level analyses. STUDY DESIGN A prospective observational study of HD practices. SETTING & PARTICIPANTS Data collected from 1996 to 2004 from 28,196 HD patients from more than 300 dialysis units participating in the DOPPS in 12 countries. PREDICTOR OR FACTOR Patient-level or case-mix-adjusted facility-level vascular access use. OUTCOMES/MEASUREMENTS: Mortality and hospitalization risks. RESULTS After adjusting for demographics, comorbid conditions, and laboratory values, greater mortality risk was seen for patients using a catheter (relative risk, 1.32; 95% confidence interval, 1.22 to 1.42; P < 0.001) or graft (relative risk, 1.15; 95% confidence interval, 1.06 to 1.25; P < 0.001) versus an AVF. Every 20% greater case-mix-adjusted catheter use within a facility was associated with 20% greater mortality risk (versus facility AVF use, P < 0.001); and every 20% greater facility graft use was associated with 9% greater mortality risk (P < 0.001). Greater facility catheter and graft use were both associated with greater all-cause and infection-related hospitalization. Catheter and graft use were greater in the United States than in Japan and many European countries. More than half the 36% to 43% greater case-mix-adjusted mortality risk for HD patients in the United States versus the 5 European countries from the DOPPS I and II was attributable to differences in vascular access practice, even after adjusting for other HD practices. 
Vascular access practice differences accounted for nearly 30% of the greater US mortality compared with Japan. LIMITATIONS Possible existence of unmeasured facility- and patient-level confounders that could impact the relationship of vascular access use with outcomes. CONCLUSIONS Facility-based analyses diminish treatment-by-indication bias and suggest that less catheter and graft use improves patient survival.


American Journal of Kidney Diseases | 1990

The transtubular potassium concentration in patients with hypokalemia and hyperkalemia.

Jean Ethier; Kamel S. Kamel; Peter Magner; Jacob Lemann; Mitchell L. Halperin

It is advantageous to make an independent assessment of the potassium (K) secretory process and the luminal flow rate in the renal cortex to evaluate K handling by the kidney during hypokalemia or hyperkalemia. The transtubular potassium concentration gradient (TTKG) is a semiquantitative index of the activity of the K secretory process. The purpose of this study was to define expected values for the TTKG in normal subjects with hypokalemia or following an acute K load. During hypokalemia of non-renal origin, the TTKG was 0.9 +/- 0.2; in contrast, the TTKG was significantly higher during the hypokalemia of hyperaldosteronism, 6.7 +/- 1.3. The TTKG was 11.8 +/- 3.6, 2 hours after normokalemic subjects received 0.2 mg 9 alpha-fludrocortisone (9 alpha-F). To obtain expected values during hyperkalemia, normal subjects ingested 50 mmol potassium chloride; 2 hours later, the TTKG was 13.1 +/- 3.8. Therefore, the expected value for the TTKG must be interpreted relative to the concentration of K in the plasma. Circumstances were also defined where the TTKG is low despite hyperaldosteronism, namely, during a water diuresis and pre-existing hypokalemia.
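The abstract reports TTKG values without stating the formula. As a hedged illustration (the standard textbook definition, not taken from this paper; the helper name `ttkg` is hypothetical), the index is computed from spot urine and plasma potassium and osmolality:

```python
def ttkg(urine_k, plasma_k, urine_osm, plasma_osm):
    """Transtubular potassium gradient (dimensionless).

    Standard definition: (U_K / P_K) * (P_osm / U_osm).
    Interpretable only when urine osmolality exceeds plasma
    osmolality (i.e., ADH is acting), per the usual caveats.
    """
    if urine_osm <= plasma_osm:
        raise ValueError("TTKG uninterpretable when U_osm <= P_osm")
    return (urine_k / plasma_k) * (plasma_osm / urine_osm)

# Illustrative numbers (not from the study):
# U_K = 40 mmol/L, P_K = 3.0 mmol/L, U_osm = 600, P_osm = 290 mOsm/kg
print(round(ttkg(40, 3.0, 600, 290), 1))  # → 6.4
```

On this scale, a high TTKG during hypokalemia (as in the hyperaldosteronism group above, 6.7 ± 1.3) signals ongoing renal K secretion, whereas a value near 1 suggests a non-renal cause.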


Annals of Surgery | 2008

Enhanced training in vascular access creation predicts arteriovenous fistula placement and patency in hemodialysis patients: results from the Dialysis Outcomes and Practice Patterns Study.

Rajiv Saran; Stacey J. Elder; David A. Goodkin; Takashi Akiba; Jean Ethier; Hugh Rayner; Akira Saito; Eric W. Young; Brenda W. Gillespie; Robert M. Merion; Ronald L. Pisoni

Objective: To investigate whether intensity of surgical training influences type of vascular access placed and fistula survival. Summary Background Data: Wide variations in fistula placement and survival occur internationally. Underlying explanations are not well understood. Methods: Prospective data from 12 countries in the Dialysis Outcomes and Practice Patterns Study were analyzed; outcomes of interest were type of vascular access in use (fistula vs. graft) in hemodialysis patients at study entry and time from placement until primary and secondary access failures, as predicted by surgical training. Logistic and Cox regression models were adjusted for patient characteristics and time on hemodialysis. Results: During training, US surgeons created fewer fistulae (US mean = 16 vs. 39–426 in other countries) and noted less emphasis on vascular access placement compared with surgeons elsewhere. Significant predictors of fistula versus graft placement in hemodialysis patients included number of fistulae placed during training (adjusted odds ratio [AOR] = 2.2 for fistula placement, per 2 times greater number of fistulae placed during training, P < 0.0001) and degree of emphasis on vascular access creation during training (AOR = 2.4 for fistula placement, for much-to-extreme emphasis vs. no emphasis, P = 0.0008). Risk of primary fistula failure was 34% lower (relative risk = 0.66, P = 0.002) when placed by surgeons who created ≥25 (vs. <25) fistulae during training. Conclusions: Surgical training is key to both fistula placement and survival, yet US surgical programs seem to place less emphasis on fistula creation than those in other countries. Enhancing surgical training in fistula creation would help meet targets of the Fistula First Initiative.


Clinical Journal of The American Society of Nephrology | 2012

Modifiable Practices Associated with Sudden Death among Hemodialysis Patients in the Dialysis Outcomes and Practice Patterns Study

Michel Jadoul; Jyothi Thumma; Douglas S. Fuller; Francesca Tentori; Yun Li; Hal Morgenstern; David C. Mendelssohn; Tadashi Tomo; Jean Ethier; Friedrich K. Port; Bruce M. Robinson

BACKGROUND AND OBJECTIVES Sudden death is common in hemodialysis patients, but whether modifiable practices affect the risk of sudden death remains unclear. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS This study analyzed 37,765 participants in 12 countries in the Dialysis Outcomes and Practice Patterns Study to explore the association of the following practices with sudden death (due to cardiac arrhythmia, cardiac arrest, and/or hyperkalemia): treatment time [TT] <210 minutes, Kt/V <1.2, ultrafiltration volume >5.7% of postdialysis weight, low dialysate potassium [K(D) <3 mEq/L], and prescription of QT interval-prolonging drugs. Cox regression was used to estimate effects on mortality, adjusting for potential confounders. An instrumental variable approach was used to further control for unmeasured patient-level confounding. RESULTS There were 9046 deaths, 26% of which were sudden (crude mortality rate, 15.3/100 patient-years; median follow-up, 1.59 years). Associations with sudden death included hazard ratios of 1.13 for short TT, 1.15 for large ultrafiltration volume, and 1.10 for low Kt/V. Compared with K(D) ≥3 mEq/L, the sudden death rate was higher for K(D) ≤1.5 and K(D)=2-2.5 mEq/L. The instrumental variable approach yielded generally consistent findings. The sudden death rate was elevated for patients taking amiodarone, but not other QT interval-prolonging drugs. CONCLUSIONS This study identified modifiable dialysis practices associated with higher risk of sudden death, including short TT, large ultrafiltration volume, and low K(D). Because K(D) <3 mEq/L is common and easy to change, K(D) tailoring may prevent some sudden deaths. This hypothesis merits testing in clinical trials.
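The Kt/V <1.2 threshold above is a delivered dialysis-dose measure. As an illustration only (not this study's method; the helper name `sp_ktv` is hypothetical), single-pool Kt/V is commonly estimated from pre- and post-dialysis urea with the second-generation Daugirdas formula:

```python
import math

def sp_ktv(pre_bun, post_bun, session_hours, uf_liters, post_weight_kg):
    """Single-pool Kt/V via the second-generation Daugirdas formula:

        Kt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF / W

    where R is the post/pre urea ratio, t the session length in hours,
    UF the ultrafiltration volume (L), and W post-dialysis weight (kg).
    """
    r = post_bun / pre_bun
    return (-math.log(r - 0.008 * session_hours)
            + (4 - 3.5 * r) * uf_liters / post_weight_kg)

# Illustrative numbers: pre-BUN 70, post-BUN 25 mg/dL, 4 h session,
# 2.5 L removed, 70 kg post-dialysis weight
dose = sp_ktv(70, 25, 4.0, 2.5, 70)  # ≈ 1.22, just above the 1.2 threshold
```

The urea-reduction term dominates; the ultrafiltration term is a small correction for convective clearance, which is why a session reaching roughly a 64% urea reduction lands near Kt/V of 1.2.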


Journal of The American Society of Nephrology | 2010

Reactive Oxygen Species Promote Caspase-12 Expression and Tubular Apoptosis in Diabetic Nephropathy

Marie-Luise Brezniceanu; Cara J. Lau; Nicolas Godin; Isabelle Chenier; Alain Duclos; Jean Ethier; János G. Filep; Julie R. Ingelfinger; Shao-Ling Zhang; John S.D. Chan

Apoptosis of tubular epithelial cells contributes to the tubular atrophy that accompanies diabetic nephropathy. Reactive oxygen species (ROS) promote tubular apoptosis, but the mechanisms by which this occurs are incompletely understood. Here, we sought proapoptotic genes that ROS differentially upregulate in renal proximal tubular cells of diabetic (db/db) mice. We performed microarray analysis using total RNA from freshly isolated renal proximal tubules of nondiabetic, diabetic, and diabetic transgenic mice overexpressing catalase in the proximal tubule (thereby attenuating ROS). We observed greater expression of caspase-12 in the proximal tubules of the diabetic mice compared with the nondiabetic and diabetic transgenic mice. Quantitative PCR and immunohistochemistry confirmed the enhanced expression of caspase-12, as well as members of the endoplasmic reticulum stress-induced apoptotic pathway. Ex vivo, albumin induced caspase-12 activity and expression (protein and mRNA) and mRNA expression of the CCAAT/enhancer-binding protein homologous protein in freshly isolated wild-type proximal tubules but not in catalase-overexpressing proximal tubules. In vitro, albumin stimulated activity of both caspase-12 and caspase-3 as well as expression of caspase-12 and CCAAT/enhancer-binding protein homologous protein in a human proximal tubule cell line (HK-2). The free radical scavenger tiron inhibited these effects. Furthermore, knockdown of caspase-12 with small interfering RNA reduced albumin-induced apoptosis in HK-2 cells. Taken together, these studies demonstrate that albuminuria may induce tubular apoptosis through generation of ROS and the subsequent expression and activation of endoplasmic reticulum stress genes in the diabetic kidney.


Burns | 2000

Veno-venous continuous renal replacement therapy for burned patients with acute renal failure.

Richard E. Tremblay; Jean Ethier; Serge Quérin; Vincent Béroniade; Pierre Falardeau; Martine Leblanc

From 1995 to 1998, 12 burned patients with acute renal failure (ARF) were treated by veno-venous continuous renal replacement therapy (CRRT) at the Burn Unit of Hôtel-Dieu de Montréal. Their mean (+/-SD) age was 51+/-12 years, and the mean burned surface covered 48.6+/-15.8% of total body surface area. All patients were mechanically ventilated and presented evidence of sepsis. The mean delay before occurrence of ARF was 15+/-6 days and ARF was mainly related to sepsis and hypotension. Main reasons for CRRT initiation were azotemia and fluid overload. A total of 15 CRRT modalities were applied (12 continuous veno-venous hemodiafiltration, CVVHDF; two continuous veno-venous hemofiltration, CVVH; and one continuous veno-venous hemodialysis, CVVHD) over 14+/-13 days. For CRRT, nine patients received heparin and three were not anticoagulated. Mean values for dialysate and reinjection flow rates were 1134+/-250 ml/h and 635+/-327 ml/h, respectively. Admission weight was 78.8+/-12.7 kg with a mean weight gain before CRRT initiation of 10.0+/-5.8 kg and a mean weight loss during CRRT of 8.9+/-5.5 kg. Nine patients received enteral plus parenteral nutrition, and three, parenteral nutrition only; the total caloric intake was 31.5+/-7.0 kcal/kg/day and protein intake, 1.8+/-0.4 g/kg/day. The normalized protein catabolic rate (nPCR) was evaluated at 2.28+/-0.78 g/kg/day during CRRT. The mortality rate was 50%. The six survivors all recovered normal renal function with four of them requiring intermittent hemodialysis for short periods. In conclusion, veno-venous CRRT is particularly well suited for this selected population allowing smooth fluid removal and aggressive nutritional support.


Nephron Clinical Practice | 2013

Vascular Access Care and Treatment Practices Associated with Outcomes of Arteriovenous Fistula: International Comparisons from the Dialysis Outcomes and Practice Patterns Study

Manabu Asano; Jyothi Thumma; Kenichi Oguchi; Ronald L. Pisoni; Tadao Akizawa; Takashi Akiba; Shunichi Fukuhara; Kiyoshi Kurokawa; Jean Ethier; Rajiv Saran; Akira Saito

Background: Vascular access (VA) guidelines recommend the native arteriovenous fistula (AVF) as the VA of first choice for chronic hemodialysis patients. AVF management is important in hemodialysis patient care. AVF survival is associated with various physical factors, but the effects of dialysis treatment factors upon AVF survival are still not clear. Methods: Study patients were treated at 498 dialysis facilities participating in the Dialysis Outcomes and Practice Patterns Study (DOPPS) 2 or 3 (2002-2007). Analyses included 1,183 incident hemodialysis patients (on dialysis ≤7 days and using an AVF at study entry) and 949 prevalent patients (on dialysis >7 days at DOPPS entry and using a new AVF created during study observation). AVF survival was modeled from the study entry date for incident patients and the date of first AVF use for prevalent patients. Predictors of primary and final AVF survival were compared across Japan, North America and Europe/Australia/New Zealand (EUR/ANZ) with adjustments for patient characteristics. Results: No meaningful relationship was seen between AVF survival and various physician and staff practices. However, patients with prior catheter use displayed higher rates of primary and final AVF failure. Final AVF failure rates were higher in facilities with higher median blood flow rates (BFR). They were also greater in North America and EUR/ANZ than in Japan, but this difference was substantially attenuated after accounting for regional differences in facility median BFR. Conclusion: AVF longevity differed according to DOPPS region and was related to prior patient catheter use and facility BFR practice. Further longitudinal studies may help demonstrate meaningful associations between VA-handling skill and patency.


Clinical Biochemistry | 1990

The excretion of ammonium ions and acid base balance

Mitchell L. Halperin; Jean Ethier; Kamel S. Kamel

The role of the kidney in acid base balance is to generate “new” bicarbonate ions, largely as a result of the excretion of ammonium ions. Three points will be covered in this review. First, we challenge the traditional view that the proximal nephron reclaims filtered bicarbonate ions, whereas, the distal nephron generates “new” bicarbonate ions. Virtually all “new” bicarbonate ions are generated in the proximal convoluted tubule during glutamine metabolism; very little is formed at distal sites. Second, the excretion of ammonium ions plays an important role in acid base balance only during chronic ketoacidosis, in response to diarrhea, in chronic renal insufficiency, and in distal renal tubular acidosis. Third, although the excretion of ammonium ions is said to signal the addition of bicarbonate ions to the extracellular fluid, the anion excreted with the ammonium cation is also important for acid base balance.

Collaboration


Dive into Jean Ethier's collaborations.

Top Co-Authors

Takashi Akiba

Tokyo Medical and Dental University

Michel Jadoul

Cliniques Universitaires Saint-Luc
