D. Patrzałek
Wrocław Medical University
Publications
Featured research published by D. Patrzałek.
Transplantation | 2007
Dorota Kamińska; Bronislaw Tyran; Oktawia Mazanowska; Jerzy Rabczyński; Piotr Szyber; D. Patrzałek; Paweł Chudoba; Wojciech G. Polak; Marian Klinger
Background. This study focuses on cytokine gene expression after brain death, ischemia-reperfusion injury, and during allograft rejection. Methods. A total of 49 needle core biopsies from kidney transplant recipients, performed before and during the transplantation procedure, were studied. The first biopsy was taken during procurement of the organ, the second after cold ischemia, and the third after approximately 30 min of reperfusion. We also assessed 34 allograft biopsies obtained during acute rejection. Tubular and glomerular expression of interferon (IFN)-γ, transforming growth factor (TGF)-β1, platelet-derived growth factor-B (PDGF-B), interleukin (IL)-2, IL-6, and IL-10 mRNA was analyzed with the in situ reverse-transcription polymerase chain reaction (RT-PCR) technique, which allows detection of a few copies of the target gene without destroying the tissue architecture. Results. Compared with normal kidney tissue from living donors, high gene expression of IFN-γ, TGF-β1, PDGF-B, IL-2, IL-6, and IL-10 was detected in all procurement specimens. After reperfusion, gene expression of IL-2, IL-6, and IL-10 was significantly upregulated in renal tubules compared with biopsies taken after cold ischemia. The gene expression of IFN-γ, TGF-β1, and PDGF-B remained stable after organ procurement, during cold ischemia, and after reperfusion. Gene expression of IFN-γ, IL-2, IL-6, IL-10, and PDGF-B in procurement biopsies, as well as in those taken after cold ischemia and reperfusion, was significantly higher than during acute rejection. Conclusion. The data presented herein strongly point to the importance of the immunological and morphological injury that occurs before and during transplantation. The increase in the inflammatory response after brain death is important for further stimulation of the immune response and for long-term kidney survival.
Transplantation Proceedings | 2003
M. Kuriata-Kordek; M. Boratyńska; K. Falkiewicz; T Porażko; J Urbaniak; M Wozniak; D. Patrzałek; P. Szyber; Marian Klinger
Despite the fact that concentrations of mycophenolic acid (MPA) are not routinely measured, accumulating data suggest the usefulness of such monitoring to optimize therapy. The aim of this study was to assess the influence of CsA and tacrolimus on MPA pharmacokinetics. Concentrations of MPA were measured using HPLC. Assays were performed before the dose (C0), as well as at 40 minutes and 1, 2, 4, 6, 8, 10, and 12 hours after administration of mycophenolate mofetil (MMF). MPA profiles were assessed in 51 patients receiving tacrolimus, MMF at a dose of 1.0 g/d, and prednisone, as well as in 97 patients receiving CsA, MMF at 2.0 g/d, and prednisone. Significant correlations of MPA levels with serum albumin and GFR were observed in both groups. Women presented with higher levels of MPA than men. C0 MPA levels in the tacrolimus group were significantly higher than those in the CsA group: 3.18 ± 2.21 µg/mL versus 1.68 ± 1.03 µg/mL (P ≤ .001). MPA AUC(0-12) in the tacrolimus group was nonsignificantly higher than that in the CsA group. There was no second peak of the MPA level in the patients receiving CsA. We developed a limited sampling strategy to estimate MPA AUC(0-12) in both the tacrolimus and CsA groups. We observed a correlation between C0 MPA and C0 CsA (r = .35; P ≤ .001), as well as between tacrolimus dose and MPA C40 and MPA Cmax (r = .24, P ≤ .05; r = .27, P ≤ .05, respectively). No relationship between MPA pharmacokinetics and tacrolimus blood concentrations was noted. Tacrolimus and CsA both affect the pharmacokinetics of MPA; high MPA concentrations in patients treated with tacrolimus justify MMF dose reduction in this group. Alterations of CsA concentrations must be taken into account when guiding MMF dose adjustments.
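The AUC(0-12) exposure metric above is derived from the timed concentration measurements. A minimal sketch of how such a value is computed with the linear trapezoidal rule; the sampling times match the schedule in the abstract, but the concentration values are illustrative placeholders, not study data:

```python
# Linear trapezoidal estimate of AUC(0-12) for an MPA concentration-time profile.
# Sampling times (h) follow the schedule in the abstract (40 min = 0.67 h);
# the concentrations (microg/mL) are made-up illustrative values.

def auc_trapezoid(times, concs):
    """Area under the concentration-time curve via the linear trapezoidal rule."""
    if len(times) != len(concs):
        raise ValueError("times and concs must have the same length")
    return sum(
        (t2 - t1) * (c1 + c2) / 2.0
        for t1, t2, c1, c2 in zip(times, times[1:], concs, concs[1:])
    )

times = [0, 0.67, 1, 2, 4, 6, 8, 10, 12]                 # hours post-dose
concs = [1.7, 8.0, 10.5, 6.0, 3.0, 2.2, 1.9, 1.8, 1.7]   # microg/mL, illustrative

auc_0_12 = auc_trapezoid(times, concs)  # microg*h/mL
```

A limited sampling strategy, as developed in the study, would then regress the full AUC(0-12) against a small subset of these time points so that only two or three blood draws are needed in routine practice.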
Transplantation Proceedings | 2013
M. Banasik; M. Boratyńska; K. Kościelska-Kasprzak; Oktawia Mazanowska; Dorota Bartoszek; M. Żabińska; Marta Myszka; B. Nowakowska; Agnieszka Halon; P. Szyber; D. Patrzałek; Marian Klinger
BACKGROUND Detection of antibody-mediated injury is becoming increasingly important in post-transplant patient care. The role of donor-specific anti-human leukocyte antigen (HLA) antibodies in kidney transplant damage is known, whereas the significance of non-HLA antibodies remains an unresolved concern. The aim of the study was to determine the presence, and the influence on renal function, of non-HLA and anti-HLA antibodies in stable patients at 5 years after kidney transplantation. METHODS We evaluated the antibodies in 35 consecutive patients with stable renal function at 5 years after transplantation. RESULTS Pretransplant screening for donor-specific antibodies by CDC cross-matches was negative in all patients. Anti-endothelial cell antibodies (AECA), anti-angiotensin II type 1 receptor antibodies (anti-AT1R), and anti-endothelin receptor antibodies (anti-ETAR) were assayed as non-HLA antibodies. Non-HLA antibodies were observed in 12 (34%) patients, including AECA (n = 5; 14%), anti-AT1R (n = 6; 17%), anti-ETAR (n = 4; 11%), and both anti-AT1R and anti-ETAR (n = 3). Among 13 (37%) patients with anti-HLA antibodies, 7 also had non-HLA antibodies: AECA (n = 1), anti-AT1R (n = 3), and anti-ETAR (n = 3). The antibody-negative group (n = 13) showed significantly better renal function than the antibody-positive group (non-HLA and/or anti-HLA; n = 22). Biopsy-proven acute rejection had occurred in 2 of 13 (15%) antibody-negative versus 8 of 22 (36%) antibody-positive patients. These preliminary data revealed a high prevalence of autoantibody and alloantibody production among stable patients at 5 years after kidney transplantation. CONCLUSION Simultaneous production of these antibodies and their association with reduced renal function suggest that active humoral immune responses are poorly controlled by immunosuppression.
Experimental and Toxicologic Pathology | 2010
Jan Magdalan; Alina Ostrowska; Aleksandra Piotrowska; Agnieszka Gomulkiewicz; Marzena Podhorska-Okolow; D. Patrzałek; Adam Szeląg; Piotr Dziegiel
Fatalities due to mushroom poisonings are increasing worldwide, with a high mortality rate resulting from ingestion of amanitin-producing species. Intoxications caused by amanitin-containing mushrooms represent an unresolved problem in clinical toxicology, since no specific and fully efficient antidote is available. The objective of this study was a comparative evaluation of benzylpenicillin (BPCN), acetylcysteine (ACC), and silibinin (SIL) as antidotes in human hepatocytes intoxicated with alpha-amanitin (alpha-AMA). All experiments were performed on cultured human hepatocytes. Cytotoxicity was evaluated by MTT assay and measurement of lactate dehydrogenase (LDH) activity at 12, 24, and 48 h of exposure to alpha-AMA and/or the antidotes. A significant decline in cell viability and a significant increase in LDH activity were observed in all experimental hepatocyte cultures after 12, 24, and 48 h of exposure to alpha-AMA at a concentration of 2 µM. Exposure of the cells to alpha-AMA also resulted in a significant reduction of cell spreading and attachment. However, addition of the tested antidotes to the experimental cultures significantly stimulated cell proliferation and attachment. In cell cultures exposed simultaneously to alpha-AMA and the tested antidotes, the cytotoxicity parameters (MTT and LDH) were not significantly different from control values. The cytoprotective effect of the antidotes was not dose-related, which reflects a high efficacy of all these substances. Administration of the studied antidotes was not associated with any adverse effects in hepatocytes. The administration of ACC, BPCN, or SIL to human hepatocyte cultures showed a similarly strong protective effect against cell damage in alpha-AMA toxicity.
Transplantation Proceedings | 2003
Sławomir Zmonarski; M. Boratyńska; Katarzyna Madziarska; Marian Klinger; M Kusztel; D. Patrzałek; P. Szyber
Estimation of anti-CMV-IgG and anti-CMV-IgM is considered a relatively inexpensive screening tool for CMV status. The aim of the study was to estimate how the immunosuppressive protocol influences serum anti-CMV-IgG and IgM concentrations in renal graft recipients, and to assess the adequacy of the anti-CMV-IgG concentration and the anti-CMV-IgM index as screening parameters for active CMV disease in patients receiving different immunosuppression. The study group consisted of 33 patients with clinical signs of CMV disease who received one of three types of immunosuppression: (1) azathioprine (Aza) + cyclosporine (CyA) + prednisone (Pr), 20 patients; (2) mycophenolate mofetil (MMF) + CyA + Pr, eight patients; (3) tacrolimus (Tac) + MMF, five patients. Patients were enrolled when the pp65 antigen (pp65) of PBL was positive within 1 to 5 months after transplant (75 patients tested). The IgM-i in the Aza + CyA + Pr group was higher than in the MMF + CyA + Pr group (2.73 +/- 1.8 vs 1.08 +/- 1.07; P = .021). The IgM-i in the Aza + CyA + Pr group was also higher than in the Tac + MMF group (2.73 +/- 1.8 vs 0.78 +/- 0.69; P = .014). There was no difference in IgM-i between MMF + CyA + Pr and Tac + MMF. There was no difference in the relative increase of IgG-c among the groups, but there was a difference in the relative increase of IgM-i between the Aza + CyA + Pr and MMF + CyA + Pr groups (6.7 +/- 9.4 vs 2.3 +/- 5.9; P = .007) and between the Aza + CyA + Pr and MMF + Tac groups (6.7 +/- 9.4 vs 0.6 +/- 0.54; P = .003). Immunosuppressive protocols including MMF exert an inhibitory influence on the B-cell response and the synthesis of anti-CMV-IgM. This makes the anti-CMV-IgM index an inadequate screening parameter for active CMV disease.
Transplantation Proceedings | 2009
K. Falkiewicz; M. Boratyńska; B. Speichert-Bidzińska; M. Magott-Procelewska; Przemysław Biecek; D. Patrzałek; Marian Klinger
OBJECTIVE To assess 1,25-dihydroxyvitamin D status and the effect of vitamin concentration on transplantation outcome in renal allograft recipients. PATIENTS AND METHODS Ninety patients underwent renal transplantation between 2002 and 2005. All received alfacalcidol supplementation before surgery. The 1,25-dihydroxyvitamin D concentration was determined on day 3 posttransplantation and at 1-, 6-, 12-, 18-, and 24-month follow-up. RESULTS Severe 1,25-dihydroxyvitamin D deficiency was noted in 83% of patients immediately posttransplantation. From 1 to 12 months thereafter, concentrations increased almost 3-fold, and remained constant to 24 months. In 50% of patients, the 1,25-dihydroxyvitamin D concentration exceeded 30 pg/mL, similar to that in healthy volunteers; in the other 50%, it reached only 17.2 pg/mL. A high incidence of delayed graft function was observed in patients with 1,25-dihydroxyvitamin D deficiency (44% vs 6%). There was a negative correlation between the initial 1,25-dihydroxyvitamin D concentration and serum creatinine concentrations at day 3 and month 6 (P < .03). Similarly, the 1,25-dihydroxyvitamin D concentration at 1 month was negatively correlated with the creatinine concentration at months 1 through 24 (P < .01). Poor outcomes were observed primarily in patients with 1,25-dihydroxyvitamin D deficiency: 2 patients developed cancer, 5 grafts were lost, and 4 patients died of cardiovascular events. CONCLUSIONS 1,25-Dihydroxyvitamin D deficiency is highly prevalent in renal allograft recipients. Patients with 1,25-dihydroxyvitamin D deficiency are at greater risk of delayed graft function, and the graft is more likely to be lost. These findings suggest the necessity of adequate vitamin D supplementation both before and after transplantation.
Nephrology Dialysis Transplantation | 2011
Katarzyna Madziarska; Wacław Weyde; Magdalena Krajewska; D. Patrzałek; Dariusz Janczak; Mariusz Kusztal; Hanna Augustyniak-Bartosik; P. Szyber; Cyprian Kozyra; Marian Klinger
BACKGROUND Post-transplant diabetes mellitus (PTDM) is a common metabolic complication in kidney allograft recipients, contributing significantly to the elevated cardiovascular morbidity after renal transplantation and to the increased risk of chronic transplant dysfunction. The aim of the present investigation was to evaluate the factors influencing PTDM development, with particular consideration of factors existing before transplantation, especially the modality of dialysis treatment, i.e. haemodialysis (HD) versus peritoneal dialysis (PD). METHODS Three hundred and seventy-seven consecutive outpatients who underwent renal transplantation (RTx) in our institution between January 2003 and December 2005 were analysed. PTDM was diagnosed according to the current American Diabetes Association/World Health Organization criteria. Statistical inference was conducted by means of univariate methods (one factor versus PTDM) and multivariate methods within the framework of a generalized linear model. RESULTS In the study group, 72 patients (23.4%) developed PTDM after RTx (55 HD and 17 PD patients). The PTDM incidence at 3, 6 and 12 months was 15.9%, 22.1% and 23.4%, respectively. The mean interval from transplantation to the onset of PTDM was 3.08 ± 2.73 months. In univariate analysis, the factors associated with an elevated risk of PTDM were older recipient age, a positive family history of diabetes, hypertensive nephropathy as the cause of end-stage renal disease, a higher body mass index at transplantation, treatment by PD, and a graft from an older donor. In multivariate verification, statistical significance remained for older recipient age (P < 0.001), a positive family history of diabetes (P = 0.002), and treatment by PD (P = 0.007). CONCLUSIONS Treatment by PD appears to be a possible novel factor, not yet reported, which may increase the risk of PTDM development.
Transplantation Proceedings | 2011
Dorota Kamińska; K. Kościelska-Kasprzak; D. Drulis-Fajdasz; Agnieszka Halon; W.G. Polak; P. Chudoba; Dariusz Janczak; Oktawia Mazanowska; D. Patrzałek; Marian Klinger
The results of deceased donor kidney transplantation largely depend on the extent of organ injury induced by brain death and by the transplantation procedure. In this study, we analyzed the preprocurement intragraft expression of 29 genes involved in apoptosis, tissue injury, and immune cell migration and activation, and assessed their influence on allograft function. We obtained 50 core biopsies of deceased donor kidneys immediately after organ retrieval, before flushing with cold solution. The control group included 18 biopsies obtained from living donors. Gene expression was analyzed with low-density arrays (TaqMan). LCN2/lipocalin-2 is considered a biomarker of kidney epithelial ischemic injury with a renoprotective function; HAVCR1/KIM-1 is associated with acute tubular injury. Comparison of deceased donor kidneys with control organs revealed significantly higher expression of LCN2 (8.0-fold, P = .0006) and HAVCR1 (4.7-fold, P < .0001). Their expression correlated positively with serum creatinine concentrations at 6 months after transplantation: LCN2 (r = .65, P < .0001), HAVCR1 (r = .44, P = .006). Kidneys displaying delayed graft function and/or an acute rejection episode in the first 6 months after transplantation showed higher LCN2 expression compared with event-free ones (1.7-fold, P = .027). Significantly higher expression of TLR2 (5.2-fold), interleukin (IL) 18 (4.6-fold), HMGB1 (4.1-fold), GUSB (2.4-fold), CASP3 (2.0-fold), FAS (1.8-fold), and TP53 (1.6-fold) was also observed in deceased donor kidneys compared with the control group. Their expression levels were not related to clinical outcomes; however, they showed significant correlations with one another (r > .6, P < .0001). We also observed a slightly reduced expression of IL10 (0.6-fold, P = .004). Our data suggest that the increased LCN2 and HAVCR1 expression observed in kidneys after donor brain death is a hallmark of the organ injury process. The LCN2 expression level in retrieved kidneys can predict kidney transplantation outcomes.
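The fold-change values quoted above compare transcript abundance in deceased-donor kidneys to living-donor controls. The abstract does not state the exact calculation, but TaqMan array data are commonly reduced to fold changes with the 2^-ΔΔCt method; a minimal sketch with made-up Ct values, offered only as an illustration of that standard approach:

```python
def fold_change_ddct(ct_target_sample, ct_ref_sample,
                     ct_target_control, ct_ref_control):
    """Relative expression (fold change) via the 2^-ddCt method.

    ct_* are raw qPCR cycle-threshold values; 'ref' is a housekeeping
    gene used for normalization. Lower Ct means more transcript.
    """
    d_ct_sample = ct_target_sample - ct_ref_sample      # normalize sample
    d_ct_control = ct_target_control - ct_ref_control   # normalize control
    return 2.0 ** -(d_ct_sample - d_ct_control)

# Illustrative Ct values only (not study data): the target gene crosses
# threshold 3 cycles earlier in the sample after normalization,
# i.e. roughly 8-fold higher expression.
fc = fold_change_ddct(22.0, 18.0, 25.0, 18.0)  # -> 8.0
```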
Transplantation Proceedings | 2009
J. Jablecki; L. Kaczmarzyk; D. Patrzałek; A. Domanasiewicz; A. Chełmoński
OBJECTIVES The functional outcome after midforearm transplantation (HT) is believed to be similar to the outcome after replantation. However, the few existing reports comparing functional outcomes are based on amputations at the level of the distal forearm. This report provides a comparative analysis of the functional results after midforearm replantation (HR) versus HT. MATERIALS AND METHODS Transplantation of a dominant right forearm performed in a 32-year-old man was compared with the outcomes of five dominant (right) forearm replantations in patients (four men and one woman) ranging from 22 to 38 years of age. Cold ischemia time ranged from 6 to 12.5 hours in all cases. We used a similar operative technique and rehabilitation protocol. At 26 (+/-2) months after replantation/transplantation, we recorded bony union (x-ray), arterial flow (ultrasonography), range of motion, grip strength, sensation (two-point discrimination [2PD], Semmes-Weinstein filaments), quality of life (DASH, 30-150 points), and general evaluation of function according to Chen's criteria or the IRHCTT scoring system. RESULTS Wound infection was observed as a complication in one HR patient; marginal skin necrosis accompanied by prolonged wound healing occurred in the HT patient. Bony union was achieved faster after forearm replantation than after transplantation. Grip strength was 17% greater after replantation, but ranges of motion were comparable in both groups. Sensitivity was superior after forearm transplantation (2PD 15 mm), and overall patient satisfaction was comparable (90 points on the DASH questionnaire for HR versus 108 points for HT). None of the patients returned to their previous occupations. CONCLUSION The functional outcome after HT was comparable, and in some respects superior, to the outcome after replantation performed at the midforearm level.
Transplantation Proceedings | 2003
K. Falkiewicz; W. Nahaczewska; M. Boratyńska; H. Owczarek; Marian Klinger; Dorota Kamińska; M Wozniak; T Szepietowski; D. Patrzałek
The aim of the study was to elucidate whether cyclosporine- and tacrolimus-based immunosuppression impairs tubular reabsorption of phosphate after kidney transplantation. Sixty cadaveric allograft recipients were included in the study. Forty patients receiving triple immunosuppression with cyclosporine, azathioprine, and prednisone were studied at 1, 6, and 12 months (groups A1 and A2, 20 patients) and at 24, 30, and 36 months (groups B1 and B2, 20 patients) after transplantation. Twenty patients who received tacrolimus with steroid withdrawal after 3 months were also included (group C). Recipients from groups A2 and B2 were additionally treated with vitamin D and calcium carbonate. Serum iPTH, 25-OHD, and 1,25(OH)(2)D concentrations were determined, and TRP and TmP/GFR (mmol/L) were calculated using the Walton-Bijvoet nomogram. Higher total calcium serum concentrations were detected in group A. Lower inorganic phosphate serum concentrations were detected in groups A and C, in contrast to group B, where they remained within normal values. TmP/GFR values were significantly higher in group C at the first and third examinations in comparison with group A. Moreover, the TRP index values were significantly higher than the analogous values of groups A and B. Tacrolimus-treated patients exhibited significantly faster recovery from impaired tubular phosphate reabsorption compared with cyclosporine-treated recipients. No correlation between iPTH, 25-OHD, or 1,25(OH)(2)D concentrations and the tubular dysfunction parameters was observed. Amelioration of phosphate handling, in spite of hyperparathyroidism intensity, can follow early steroid avoidance.
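TRP and TmP/GFR as used above are standard derived indices of tubular phosphate handling, computed from paired serum and urine samples. A minimal sketch of the TRP formula and a TmP/GFR estimate; note the study itself read TmP/GFR from the Walton-Bijvoet nomogram, while the high-TRP branch below uses Payne's algebraic approximation of that nomogram (an assumption here), and all numeric inputs are illustrative:

```python
def trp(serum_phos, urine_phos, serum_creat, urine_creat):
    """Tubular reabsorption of phosphate as a fraction:
    TRP = 1 - (Up * Scr) / (Sp * Ucr).
    Units must be consistent within each analyte (phosphate pair,
    creatinine pair)."""
    return 1.0 - (urine_phos * serum_creat) / (serum_phos * urine_creat)

def tmp_gfr(serum_phos, trp_value):
    """Approximate TmP/GFR, in the same units as serum phosphate.
    Linear for TRP <= 0.86; above that, Payne's approximation of the
    Walton-Bijvoet nomogram (assumption - the study used the nomogram
    directly)."""
    if trp_value <= 0.86:
        return serum_phos * trp_value
    return serum_phos * 0.3 * trp_value / (1.0 - 0.8 * trp_value)

# Illustrative values only (phosphate in mmol/L, consistent creatinine units):
t = trp(1.0, 15.0, 0.08, 10.0)   # fractional reabsorption, here 0.88
tm = tmp_gfr(1.0, t)             # mmol/L
```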