
Publication


Featured research published by Sinee Disthabanchong.


Nephrology Dialysis Transplantation | 2010

Sodium thiosulfate delays the progression of coronary artery calcification in haemodialysis patients

Surawat Adirekkiat; V. Sumethkul; Atiporn Ingsathit; Somnuek Domrongkitchaiporn; Bunyong Phakdeekitcharoen; Surasak Kantachuvesiri; Chagriya Kitiyakara; Pinkaew Klyprayong; Sinee Disthabanchong

BACKGROUND Coronary artery calcification (CAC) is prevalent among haemodialysis patients and predicts cardiovascular mortality. In addition to modifying traditional cardiovascular risk factors, therapy aimed at lowering serum phosphate and the calcium-phosphate product has been advocated. Sodium thiosulfate, through its chelating property, removes calcium from precipitated minerals, decreasing the calcification burden in calcific uraemic arteriolopathy and soft tissue calcification. The effect of sodium thiosulfate on CAC in haemodialysis patients has never been studied. METHODS Eighty-seven stable chronic haemodialysis patients underwent multi-row spiral computed tomography and bone mineral density (BMD) measurement. Patients with a CAC score ≥300 were included to receive intravenous sodium thiosulfate infusion twice weekly post-haemodialysis for 4 months. CAC and BMD were re-evaluated at the end of the treatment course. RESULTS Progression of CAC occurred in 25% and 63% of the patients in the treatment and control groups, respectively (P = 0.03). The CAC score was unchanged in the treatment group but increased significantly in the control group. BMD of the total hip declined significantly in the treatment group. In multivariate analysis adjusted for factors that influenced CAC progression, therapy with sodium thiosulfate was an independent protective factor (odds ratio = 0.05, P = 0.04). Major side effects were persistent anorexia and metabolic acidosis. CONCLUSIONS The effect of sodium thiosulfate in delaying the progression of CAC is encouraging and will require a larger study. Determination of the safe therapeutic window is necessary in order to avoid bone demineralization.


American Journal of Kidney Diseases | 2015

Serum Phosphorus and Progression of CKD and Mortality: A Meta-analysis of Cohort Studies

Jingjing Da; Xinfang Xie; Myles Wolf; Sinee Disthabanchong; Jinwei Wang; Yan Zha; Jicheng Lv; Zhang L; Haiyan Wang

BACKGROUND Recent studies have indicated that phosphorus may play an independent pathogenic role in chronic kidney disease (CKD) progression, but some of those studies were underpowered and yielded inconsistent results. STUDY DESIGN Systematic review and meta-analysis. SETTING & POPULATION Non-dialysis-dependent patients with CKD (transplant recipients were excluded). SELECTION CRITERIA FOR STUDIES Studies assessing the risk ratio of serum phosphorus level on kidney failure and mortality for non-dialysis-dependent patients with CKD published from January 1950 to June 2014 were included following systematic searching of MEDLINE, EMBASE, and the Cochrane Library. PREDICTOR Serum phosphorus level. OUTCOME Kidney failure, defined as doubled serum creatinine level, 50% decline in estimated glomerular filtration rate, or end-stage kidney disease. RESULTS In 12 cohort studies with 25,546 patients, 1,442 (8.8%) developed kidney failure and 3,089 (13.6%) died. Overall, every 1-mg/dL increase in serum phosphorus level was associated independently with increased risk of kidney failure (hazard ratio, 1.36; 95% CI, 1.20-1.55) and mortality (hazard ratio, 1.20; 95% CI, 1.05-1.37). LIMITATIONS Existence of potential residual confounding could not be excluded. CONCLUSIONS This meta-analysis suggests an independent association between serum phosphorus level and kidney failure and mortality among non-dialysis-dependent patients with CKD and suggests that large-scale randomized controlled trials should target disordered phosphorus homeostasis in CKD.
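The pooled estimates above are expressed per 1-mg/dL increase in serum phosphorus. Under the log-linear proportional hazards model typically assumed in such meta-analyses, a per-unit hazard ratio can be scaled to larger increments by exponentiation. A minimal sketch (the function name `scale_hr` is ours, not from the study):

```python
import math

def scale_hr(hr_per_unit: float, delta: float) -> float:
    """Hazard ratio for a delta-unit change, given the HR per 1-unit change,
    assuming a log-linear (proportional hazards) exposure-risk relationship."""
    return math.exp(math.log(hr_per_unit) * delta)

# Reported pooled estimates: HR 1.36 for kidney failure and 1.20 for mortality,
# each per 1-mg/dL increase in serum phosphorus.
print(round(scale_hr(1.36, 2), 2))  # implied HR for a 2-mg/dL increase: 1.85
print(round(scale_hr(1.20, 2), 2))  # implied HR for a 2-mg/dL increase: 1.44
```

This illustrates why even modest per-unit hazard ratios translate into substantially elevated risk across the phosphorus range seen in CKD.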


BMC Nephrology | 2013

Mineral metabolism and outcomes in chronic kidney disease stage 2–4 patients

Kamonwan Chartsrisak; Kotcharat Vipattawat; Montira Assanatham; Arkom Nongnuch; Atiporn Ingsathit; Somnuek Domrongkitchaiporn; V. Sumethkul; Sinee Disthabanchong

Background: Marked hyperphosphatemia, hyperparathyroidism and 25-hydroxyvitamin D deficiency are associated with mortality in dialysis patients. Such data in the chronic kidney disease stage 2–4 population are limited. It has been suggested that high-normal serum phosphate predicts worse renal and patient outcomes. The data regarding parathyroid hormone and outcomes in this population are limited. The present study examined mineral metabolism and its association with the development of end-stage renal disease and mortality in stage 2–4 chronic kidney disease patients. Methods: This is a prospective cohort study that included 466 non-dialysis chronic kidney disease stage 2–4 patients. Mineral parameters were obtained at the time of enrollment and the patients were followed prospectively for 25 (1–44) months or until they reached the endpoints of end-stage renal disease or mortality. Results: Hyperparathyroidism and 25-hydroxyvitamin D deficiency began to occur in the early stages of chronic kidney disease, whereas significant hyperphosphatemia only developed in the later stages. High-normal and mildly elevated serum phosphate (>4.2 mg/dL) predicted the composite outcome of end-stage renal disease or mortality after adjustments for cardiovascular risk factors, chronic kidney disease stage and other mineral parameters. Parathyroid hormone levels above the upper limit of normal (>65 pg/mL) predicted the future development of end-stage renal disease and the composite outcome of end-stage renal disease or mortality after adjustments. 25-hydroxyvitamin D deficiency (<15 ng/mL) was also associated with worse outcomes. Conclusions: In chronic kidney disease, hyperparathyroidism developed prior to significant hyperphosphatemia, confirming the presence of phosphate retention early in the course of chronic kidney disease. High-normal serum phosphate and mildly elevated parathyroid hormone levels predicted worse renal and patient outcomes. These data emphasize the need for early intervention in the care of chronic kidney disease stage 2–4 patients.


Clinical Journal of The American Society of Nephrology | 2012

Renal Phosphate Loss in Long-Term Kidney Transplantation

Supinda Sirilak; Kamonwan Chatsrisak; Atiporn Ingsathit; Surasak Kantachuvesiri; V. Sumethkul; Wasana Stitchantrakul; Piyanuch Radinahamed; Sinee Disthabanchong

BACKGROUND AND OBJECTIVES Renal phosphate wasting occurs early post-kidney transplantation as a result of an accumulation of parathyroid hormone and fibroblast growth factor 23 from the CKD period. Serum phosphate, parathyroid hormone, and fibroblast growth factor 23 return to baseline 1 year post-kidney transplantation. What happens beyond this period is unknown. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS Mineral parameters were obtained from 229 kidney transplant recipients at least 1 year posttransplantation; 46 normal subjects and 202 CKD patients with similar GFR served as controls. Factors associated with phosphate metabolism were analyzed. RESULTS Despite the reduced graft function, most kidney transplant recipients had lower serum phosphate than normal subjects, accompanied by renal phosphate loss. Fibroblast growth factor 23 was mostly lower than or comparable with normal subjects, whereas parathyroid hormone was elevated in most patients. Hyperparathyroidism was also more common among kidney transplant recipients than among CKD patients. Both parathyroid hormone and fibroblast growth factor 23 showed relationships with renal phosphate excretion, but only parathyroid hormone displayed an independent association. Parathyroid hormone showed the highest area under the curve in predicting renal phosphate leak. When patients were categorized according to parathyroid hormone and fibroblast growth factor 23 levels, only the subset of patients with high parathyroid hormone had increased renal phosphate excretion. CONCLUSIONS Relatively low serum phosphate from renal phosphate leak persisted in long-term kidney transplantation. Both parathyroid hormone and fibroblast growth factor 23 participated in renal tubular phosphate handling, but persistent hyperparathyroidism seemed to have a greater influence in this setting.


American Journal of Nephrology | 2010

Oral Sodium Bicarbonate Improves Thyroid Function in Predialysis Chronic Kidney Disease

Sinee Disthabanchong; Akarapong Treeruttanawanich

Background/Aims: Metabolic acidosis (MA) in chronic kidney disease (CKD) is associated with protein energy malnutrition, osteoporosis, abnormal endocrine function and increased mortality. Oral sodium bicarbonate has been shown to improve nutritional status and preserve renal function in CKD. Depressed thyroid function has been described in CKD and was believed to be related to MA. This is a prospective randomized study that examined the effect of oral sodium bicarbonate on thyroid function in predialysis CKD with MA. Methods: Predialysis CKD patients with serum total CO2 ≤22 mM were randomized into two groups. The treatment group received increasing doses of oral sodium bicarbonate until serum total CO2 was ≥24 mM. Control patients were kept on the same medications. Thyroid function tests were measured at baseline and again after 8–12 weeks. Results: All patients had a glomerular filtration rate <35 ml/min/1.73 m2. Serum total CO2 increased significantly in the treatment group and was unchanged in the control group. At baseline, over half of the patients had T3 below the lower limit of normal. At study completion, free T3 had declined further in the control group, whereas free T3, total T3, free T4 and TSH rose significantly in the treatment group. Percentage changes of total CO2 from baseline were strongly associated with the changes in T3 parameters. Glomerular filtration rate was maintained in the treatment group but declined significantly in the control group. Conclusion: Oral sodium bicarbonate, through correction of MA, improved thyroid function in predialysis CKD.


Transplantation Proceedings | 2012

Impact of Early Ureteric Stent Removal and Cost-Benefit Analysis in Kidney Transplant Recipients: Results of a Randomized Controlled Study

W. Parapiboon; Atiporn Ingsathit; Sinee Disthabanchong; A. Nongnuch; A. Jearanaipreprem; C. Charoenthanakit; S. Jirasiritham; V. Sumethkul

INTRODUCTION The duration of ureteric stent retention in kidney transplantation is still controversial. Our study aimed to compare healthcare expenditures in kidney transplant recipients with early or routine ureteric stent removal. METHODS This study was a post hoc analysis of data from a single-center parallel randomized controlled open-label study. Ninety patients who underwent kidney transplantation at a university-based hospital in Thailand from April 2010 to January 2011 were enrolled. Patients were randomized to early ureteric stent removal (8 days) or routine ureteric stent removal (15 days) after kidney transplantation. The costs of direct health care associated with kidney transplantation, urologic complications, and urinary tract infection (UTI) within the postoperative period were compared between the 2 groups. RESULTS Seventy-four patients (58% living donor) fulfilled the randomization criteria (early removal, n = 37; routine removal, n = 37). By intention-to-treat analysis, the incidence of UTI in the early stent removal group was lower than in the routine stent removal group (15/37, 40.5% vs 27/37, 72.9%; P = .004). Urologic complications showed no significant difference between the early and routine groups (4/37 vs 2/37; P = .39). The cost-benefit of early over routine stent removal was 2,390 United States dollars (USD) per patient (11,182 vs 8,792 USD). Presence of UTI significantly increased the hospitalization cost by 5,131 USD per patient (mean cost = 12,209 vs 7,078 USD; P < .001). CONCLUSION UTI in the early post-kidney transplantation period increases healthcare cost. Early ureteric stent removal can reduce UTI and reduce hospitalization cost. This approach shows cost-benefit in the early management of kidney transplant recipients.
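The reported cost-benefit figures can be checked by simple subtraction. A quick sketch (all figures in USD from the abstract; assigning the higher mean cost to the routine-removal arm follows the abstract's conclusion that early removal reduced cost):

```python
# Per-patient mean direct healthcare costs, in USD, as reported above.
cost_routine_removal = 11_182  # assumed routine-removal arm (higher-cost arm)
cost_early_removal = 8_792     # assumed early-removal arm

# Cost-benefit of early over routine removal.
print(cost_routine_removal - cost_early_removal)  # 2390 USD saved per patient

# Added hospitalization cost attributable to UTI.
mean_cost_with_uti = 12_209
mean_cost_without_uti = 7_078
print(mean_cost_with_uti - mean_cost_without_uti)  # 5131 USD per patient
```

Both differences match the abstract's stated 2,390 USD cost-benefit and 5,131 USD UTI-associated cost increase.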


Nephrology Dialysis Transplantation | 2011

Metabolic acidosis lowers circulating adiponectin through inhibition of adiponectin gene transcription

Sinee Disthabanchong; Kannika Niticharoenpong; Piyanuch Radinahamed; Wasana Stitchantrakul; Boonsong Ongphiphadhanakul; Suradej Hongeng

BACKGROUND Metabolic acidosis (MA) adversely affects protein and lipid metabolism as well as endocrine function. Adipose tissue communicates with the rest of the body through the synthesis and release of adipokines, such as leptin, adiponectin and TNF-alpha. Adiponectin enhances insulin sensitivity and possesses anti-atherogenic and anti-inflammatory properties. Circulating adiponectin correlates inversely with cardiovascular events. It is possible that MA negatively regulates adiponectin, contributing to poor patient outcomes. The present study investigates the effect of MA on adiponectin in vivo and in vitro. METHODS Twenty healthy female volunteers underwent a 7-day course of oral ammonium chloride (NH4Cl)-induced acidosis. Serum adiponectin was determined before and after NH4Cl ingestion. Adipocytes were differentiated from their precursors, human mesenchymal stem cells (hMSCs), in culture. Concentrated HCl was added to the media to lower pH. Adiponectin mRNA and protein were determined at 48 and 96 h by real-time RT-PCR and ELISA, respectively. RESULTS After the 7-day course of NH4Cl, serum bicarbonate decreased significantly, in association with increases in urine ammonium and titratable acid. Adiponectin decreased significantly from 10,623 to 9,723 pg/mL (P<0.005). MA suppressed adiponectin mRNA in hMSC-derived adipocytes at 48 and 96 h (P<0.01). The amount of adiponectin released into the culture media declined in correspondence with the mRNA levels (P<0.001). MA did not affect adipocyte triglyceride or protein content. CONCLUSIONS MA lowered circulating adiponectin through inhibition of adiponectin gene transcription in adipocytes.


World Journal of Transplantation | 2015

Mineral and bone disorder after kidney transplantation

Pahnwat T. Taweesedt; Sinee Disthabanchong

After successful kidney transplantation, accumulated waste products and electrolytes are excreted and regulatory hormones return to normal levels. Despite the improvement in mineral metabolites and mineral-regulating hormones after kidney transplantation, abnormal bone and mineral metabolism continues to be present in most patients. During the first 3 mo, fibroblast growth factor-23 (FGF-23) and parathyroid hormone levels decrease rapidly in association with an increase in 1,25-dihydroxyvitamin D production. Renal phosphate excretion resumes, and serum calcium, if elevated before, returns toward normal levels. FGF-23 excess during the first 3-12 mo results in exaggerated renal phosphate loss, and hypophosphatemia occurs in some patients. After 1 year, FGF-23 and serum phosphate return to normal levels, but persistent hyperparathyroidism remains in some patients. The progression of vascular calcification also attenuates. High-dose corticosteroid and persistent hyperparathyroidism are the most important factors influencing abnormal bone and mineral metabolism in long-term kidney transplant (KT) recipients. Bone loss occurs at the highest rate during the first 6-12 mo after transplantation. Measurement of bone mineral density is recommended in patients with an estimated glomerular filtration rate > 30 mL/min. The use of active vitamin D with or without bisphosphonate is effective in preventing early post-transplant bone loss. A steroid withdrawal regimen is also beneficial in preserving bone mass in the long term. Calcimimetics are an alternative to parathyroidectomy in KT recipients with persistent hyperparathyroidism. If parathyroidectomy is required, subtotal to near-total parathyroidectomy is recommended. Performing parathyroidectomy during the waiting period prior to transplantation is also preferred in patients with severe hyperparathyroidism associated with hypercalcemia.


Blood Purification | 2014

Low Hip Bone Mineral Density Predicts Mortality in Maintenance Hemodialysis Patients: A Five-Year Follow-Up Study

Sinee Disthabanchong; Sutipong Jongjirasiri; Surawat Adirekkiat; V. Sumethkul; Atiporn Ingsathit; Somnuek Domrongkitchaiporn; Bunyong Phakdeekitcharoen; Surasak Kantachuvesiri; Chagriya Kitiyakara

Background: Bone loss is common among hemodialysis patients and contributes to mortality. The association between bone loss and vascular calcification may explain the increased mortality risk. Studies on the association between decreased bone mass and mortality in maintenance hemodialysis patients are limited. Methods: Eighty-three hemodialysis patients underwent bone mineral density (BMD) and coronary artery calcification (CAC) measurements. The relationship between BMD and mortality was analyzed after a 5-year follow-up period. Results: Eighty percent of the patients had reduced hip BMD. In univariate Cox regression analyses, age, cardiovascular disease, dyslipidemia, increased CAC score, increased comorbidity score and decreased hip BMD were associated with mortality. Low hip BMD remained independently associated with mortality after adjustments for cardiovascular risk factors, comorbidity score and CAC score. Patients with BMD in the lowest tertile had the worst survival. Conclusion: Low hip BMD predicted mortality in maintenance hemodialysis patients independent of CAC.


International Journal of Molecular Sciences | 2010

Proteomic Profiles of Mesenchymal Stem Cells Induced by a Liver Differentiation Protocol

Kawin Leelawat; Siriluck Narong; Suthidarak Chaijan; Khanit Sa-ngiamsuntorn; Sinee Disthabanchong; Adisak Wongkajornsilp; Suradej Hongeng

The replacement of diseased hepatocytes and the stimulation of endogenous or exogenous regeneration by human mesenchymal stem cells (MSCs) are promising candidates for liver-directed cell therapy. In this study, we isolated MSCs from adult bone marrow by plastic adhesion and induced differentiation with a liver differentiation protocol. Western blot analyses were used to assess the expression of liver-specific markers. Next, MSC-specific proteins were analyzed with two-dimensional (2D) gel electrophoresis and peptide mass fingerprinting by matrix-assisted laser desorption/ionization (MALDI)-time of flight (TOF)-mass spectrometry (MS). To confirm the results from the proteomic study, semi-quantitative reverse transcription-polymerase chain reaction (RT-PCR) analyses were performed. We demonstrated that MSCs treated with the liver differentiation protocol expressed significantly more albumin, CK19 and CK20 than did undifferentiated cells. In addition, the proteomic study demonstrated increased expression of FEM1B, PSMC2 and disulfide-isomerase A3 in MSCs treated with the liver differentiation protocol. These results from proteomic profiling will not only provide insight into the global responses of MSCs to hepatocyte differentiation, but will also lead to in-depth studies on the mechanisms of proteomic changes in MSCs.
