
Publication


Featured research published by María José Pérez-Sáez.


Transplant Immunology | 2013

Clinical relevance of pretransplant anti-HLA donor-specific antibodies: does C1q-fixation matter?

Marta Crespo; Alberto Torio; Virginia Mas; Dolores Redondo; María José Pérez-Sáez; Marisa Mir; Anna Faura; Rita Guerra; Olga Montes-Ares; María Dolores Checa; Julio Pascual

Background: Anti-HLA donor-specific antibodies (DSA) identified by single antigen bead array (SAB) have been questioned for their excess sensitivity and lack of event prediction after transplantation.

Population and methods: We retrospectively evaluated specific types of preformed DSA (class I, class II or C1q-fixing) and their impact on graft survival. Kidney transplantations performed across a negative CDC crossmatch were included (n=355). Anti-HLA antibodies were tested using SAB to identify DSA and their capacity to fix C1q.

Results: Twenty-eight patients with pretransplant DSA and MFI>2000 were selected to assess C1q fixation. DSA were C1q-positive in 15 patients and C1q-negative in 13, without significant differences in demographics, acute rejection, graft loss or renal function. The maximum MFI of DSA was significantly higher in patients with C1q-fixing DSA (p=0.008). Patients with class I DSA suffered more antibody-mediated rejection (AMR) and had worse graft survival than those with class II DSA. The capacity of class I DSA to fix C1q did not correlate with rejection, graft function or graft loss.

Conclusions: C1q testing in pretransplant sera with DSA was unable to predict acute antibody-mediated rejection or early graft loss, but the presence of class I DSA (compared with class II DSA only) did. Despite not fixing complement in vitro, pretransplant C1q-negative class I DSA can mediate rejection and graft loss.


Transplantation Reviews | 2012

Chronic renal allograft injury: early detection, accurate diagnosis and management

Julio Pascual; María José Pérez-Sáez; Marisa Mir; Marta Crespo

Chronic renal allograft injury (CRAI) is a multifactorial clinical and pathological entity characterised by a progressive decrease in glomerular filtration rate, generally associated with proteinuria and arterial hypertension. Classical views tried to distinguish between immunological factors (sensitization, low HLA compatibility, acute rejection episodes) and non-immunological factors (donor age, delayed graft function, calcineurin inhibitor [CNI] toxicity, arterial hypertension, infections) contributing to its development. Defining it as a generic idiopathic entity has precluded more comprehensive attempts at therapeutic options. It is therefore necessary to reinforce the diagnostic work-up so that an etiopathogenetic diagnosis is added in every case of graft dysfunction, especially transplant vasculopathy and transplant glomerulopathy, reserving the term interstitial fibrosis and tubular atrophy (IFTA) for cases of CRAI that are unspecific, with no clear contributing factors or specific etiology identifiable at diagnosis. Earlier detection of and intervention in CRAI remain key challenges for transplant physicians. Changes in serum creatinine levels and proteinuria often occur late in disease progression and may not accurately reflect the underlying renal damage. Deterioration of renal function over time, determined through slope analysis, is a more accurate indicator of CRAI, and earlier identification of renal deterioration may prompt earlier changes in immunosuppressive therapy. The crucial point is probably to distinguish between non-immunological or toxic CRAI and immunologically derived CRAI. Conversion to non-nephrotoxic immunosuppressants, such as mTOR inhibitors, holds promise in reducing the impact of toxic CRAI, both by avoiding or reducing exposure to CNIs and by reducing smooth muscle cell proliferation in the kidney. CRAI due to chronic antibody-mediated rejection is an important and increasingly well-defined entity that carries a poor prognosis and is associated with graft loss.
The best prevention is adequate immunosuppression and tight patient monitoring from the clinical, analytical and histological standpoints. While clinical trial evidence is needed for early detection and intervention in patients with CRAI, this review represents the current knowledge upon which clinicians can base their strategies. New prospective, ideally well-controlled trials are needed to establish the usefulness of different potentially therapeutic regimens. Such evidence should demonstrate benefit before the extended, uncontrolled use of drugs such as rituximab, bortezomib or eculizumab, which are expensive and frequently iatrogenic.


American Journal of Transplantation | 2015

Circulating NK‐Cell Subsets in Renal Allograft Recipients With Anti‐HLA Donor‐Specific Antibodies

Marta Crespo; José Yélamos; D. Redondo; Aura Muntasell; María José Pérez-Sáez; María López-Montañés; C. García; A. Torio; Marisa Mir; J. J. Hernández; Miguel López-Botet; Julio Pascual

Detection of posttransplant donor-specific anti-HLA antibodies (DSA) constitutes a risk factor for kidney allograft loss. Together with complement activation, NK-cell antibody-dependent cell-mediated cytotoxicity (ADCC) has been proposed to contribute to the microvascular damage associated with humoral rejection. In this observational exploratory study, we examined the relationship of circulating donor-specific and nondonor-specific anti-HLA antibodies (DSA and HLA non-DSA) with peripheral blood NK-cell subsets and clinical features in 393 renal allograft recipients. Multivariate analysis indicated that retransplantation and pretransplant sensitization were associated with detection of posttransplant DSA. Recipient female gender, DR mismatch and acute rejection were significantly associated with posttransplant DSA compared with HLA non-DSA. In contrast with patients without detectable anti-HLA antibodies, DSA and HLA non-DSA patients displayed lower proportions of NK cells, together with increased CD56bright and NKG2A+ subsets, the latter being more marked in DSA cases. These differences appeared unrelated to retransplantation, previous acute rejection or immunosuppressive therapy. Although preliminary and observational in nature, our results suggest that assessment of the NK-cell immunophenotype may help define signatures of alloreactive humoral responses in renal allograft recipients.


Bone | 2015

Increased hip fracture and mortality in chronic kidney disease individuals: The importance of competing risks

María José Pérez-Sáez; Daniel Prieto-Alhambra; Clara Barrios; Marta Crespo; Dolores Redondo; Xavier Nogués; A Diez-Perez; Julio Pascual

Background: Many studies have shown an association between chronic kidney disease (CKD) and fracture. However, the increased mortality of CKD patients creates a competing-risk scenario not accounted for in previous studies. Our aim was to investigate the true impact of CKD on hip fracture after accounting for the competing risk of death.

Methods: We conducted a population-based cohort study of the impact of CKD on hip fractures in individuals aged ≥50 years registered in the SIDIAP(Q) database (representative of 1.9 million people in Catalonia, Spain). Cox regression was used to estimate hazard ratios (HR) for death and hip fracture according to CKD status. A competing-risk (Fine and Gray) model was fitted to estimate sub-HRs for hip fracture in CKD and CKD-free patients, accounting for differential mortality.

Results: A total of 873,073 patients (32,934 (3.8%) with CKD) were observed for 3 years. During follow-up, 4,823 (14.6%) CKD and 36,328 (4.3%) CKD-free participants died (HR, 1.83 [95% CI, 1.78-1.89]), whilst 522 (1.59%) and 6,292 (0.75%), respectively, sustained hip fractures. Adjusted Cox models showed a significantly increased risk of hip fracture in the CKD group (HR, 1.16 [1.06-1.27]), but this association was attenuated in competing-risk models accounting for mortality (SHR, 1.14 [1.03-1.27]).

Conclusions: Both death and hip fracture rates are increased (by 83% and 16%, respectively) in CKD patients. However, the association between CKD and hip fracture is attenuated when the excess mortality is taken into account. The competing risk of death must be considered in future analyses of the association between CKD and any health outcome.
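The competing-risk point this abstract makes can be illustrated with a small simulation: treating death as if it merely censored follow-up overstates the probability of hip fracture, because some patients die before they can fracture. A minimal sketch in Python, with hypothetical hazard rates that are not the study's data:

```python
import random

random.seed(0)

N = 100_000        # simulated patients
RATE_FX = 0.05     # hypothetical hip-fracture hazard (events per year)
RATE_DEATH = 0.20  # hypothetical death hazard (events per year)
HORIZON = 5.0      # years of follow-up

naive = 0   # fractures counted as if death never intervened
actual = 0  # fractures that actually occur before death
for _ in range(N):
    t_fx = random.expovariate(RATE_FX)        # latent fracture time
    t_death = random.expovariate(RATE_DEATH)  # latent death time
    if t_fx <= HORIZON:
        naive += 1                 # naive analysis ignores the competing death
        if t_fx < t_death:
            actual += 1            # real-world event: fracture pre-empts death

print(f"naive 5-year fracture risk:  {naive / N:.3f}")   # ~0.22
print(f"actual cumulative incidence: {actual / N:.3f}")  # lower: death competes
```

The gap between the two numbers is the attenuation a Fine and Gray model captures relative to an analysis that simply censors patients at death.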


American Journal of Transplantation | 2016

Survival Benefit From Kidney Transplantation Using Kidneys From Deceased Donors Aged ≥75 Years: A Time-Dependent Analysis.

María José Pérez-Sáez; E. Arcos; J. Comas; Marta Crespo; J. Lloveras; Julio Pascual

Patients with end-stage renal disease have longer survival after kidney transplantation than they would by remaining on dialysis; however, outcomes with kidneys from donors aged ≥75 years, and the survival of recipients of these organs compared with dialysis counterparts with the same probability of obtaining an organ, are unknown. In a longitudinal mortality study, 2040 patients on dialysis were placed on a waiting list, and 389 of them received a first transplant from a deceased donor aged ≥75 years. The adjusted risk of death and survival were calculated by non-proportional hazards analysis with transplantation as a time-dependent effect. Projected years of life since placement on the waiting list were almost twofold higher for transplanted patients. The nonproportional adjusted risk of death after transplantation was 0.44 (95% confidence interval [CI] 0.32-0.61; p < 0.001) compared with those who remained on dialysis. Stratifying by age, adjusted hazard ratios for death were 0.17 (95% CI 0.06-0.47; p = 0.001) for those aged <65 years, 0.56 (95% CI 0.34-0.92; p = 0.022) for those aged 65-69 years and 0.82 (95% CI 0.52-1.28; p = 0.389) for those aged ≥70 years. Although kidney transplantation from elderly deceased donors is associated with reduced graft survival, transplanted patients have lower mortality than those remaining on dialysis.
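Treating transplantation as a time-dependent effect, as this study does, guards against immortal time bias: all person-time before the transplant must count as dialysis exposure, not transplant exposure. A minimal person-time sketch with invented hazards (not the study's data), classifying exposure time-dependently and comparing crude death rates:

```python
import random

random.seed(1)

H_DIALYSIS = 0.30  # hypothetical death hazard on dialysis (per year)
H_TX = 0.06        # hypothetical death hazard after transplant (per year)
CENSOR = 10.0      # administrative censoring (years)
N = 5_000

pt_d = pt_t = 0.0        # person-years on dialysis / post-transplant
deaths_d = deaths_t = 0  # deaths attributed to each exposure state
for i in range(N):
    # half the cohort is transplanted at a uniformly random time in years 0-5
    tx_time = random.uniform(0.0, 5.0) if i % 2 == 0 else None
    t = random.expovariate(H_DIALYSIS)  # death time under the dialysis hazard
    if tx_time is None or t < tx_time:
        # dies (or is censored) while still on dialysis
        pt_d += min(t, CENSOR)
        deaths_d += t <= CENSOR
    else:
        # survives to transplant: pre-transplant time still counts as dialysis
        pt_d += tx_time
        post = random.expovariate(H_TX)  # residual lifetime post-transplant
        pt_t += min(post, CENSOR - tx_time)
        deaths_t += post <= CENSOR - tx_time

rate_d = deaths_d / pt_d  # recovers ~0.30
rate_t = deaths_t / pt_t  # recovers ~0.06
print(f"death rate on dialysis: {rate_d:.3f}, post-transplant: {rate_t:.3f}")
print(f"crude rate ratio: {rate_t / rate_d:.2f}")  # well below 1
```

Misclassifying the pre-transplant waiting time as transplant exposure would credit guaranteed survival time to the transplant group and exaggerate its benefit; the time-dependent bookkeeping above avoids that.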


Journal of Immunology | 2017

Adaptive NKG2C+ NK Cell Response and the Risk of Cytomegalovirus Infection in Kidney Transplant Recipients

Dolores Redondo-Pachón; Marta Crespo; José Yélamos; Aura Muntasell; María José Pérez-Sáez; Silvia Pérez-Fernández; Joan Vila; Carlos Vilches; Julio Pascual; Miguel López-Botet

CMV infection in kidney transplant recipients (KTRs) has been associated with an increased risk of graft loss and reduced host survival. CMV promotes persistent expansions of NK cells expressing the CD94/NKG2C receptor. The NKG2C (KLRC2) gene is frequently deleted, and copy number influences the adaptive response of NKG2C+ NK cells. The distribution of NKG2C+ NK cells and NKG2C genotypes (NKG2C+/+, NKG2C+/del, NKG2Cdel/del) was studied in cross-sectional (n = 253) and prospective (n = 122) KTR cohorts. Assessment of CMV viremia was restricted to symptomatic cases in the retrospective study but was regularly monitored in the prospective cohort. Overall, the proportions of NKG2C+ NK cells were significantly higher in KTRs who had suffered posttransplant symptomatic CMV infection in the cross-sectional study. Yet, during prospective follow-up (3, 6, 12, and 24 months), posttransplant NKG2C+ NK cell expansions were not observed in every patient with detectable viremia who received preemptive antiviral therapy, suggesting that the adaptive NK cell response may be inversely related to the degree of CMV control. Remarkably, the incidence of posttransplant viremia was reduced among cases with high pretransplant levels of NKG2C+ NK cells. The NKG2C genotype distribution was comparable in KTRs and healthy controls, and greater proportions of NKG2C+ cells were detected in NKG2C+/+ than in NKG2C+/del patients. Yet, a trend toward increased NKG2C+/del and reduced NKG2C+/+ frequencies associated with symptomatic infection was observed in both cohorts. Altogether, our results indirectly support the involvement of adaptive NKG2C+ NK cells in the control of CMV in KTRs.


Transplantation | 2017

Bone Density, Microarchitecture, and Tissue Quality Long-term After Kidney Transplant.

María José Pérez-Sáez; Sabina Herrera; Daniel Prieto-Alhambra; Xavier Nogués; María Vera; Dolores Redondo-Pachón; Marisa Mir; Roberto Güerri; Marta Crespo; A Diez-Perez; Julio Pascual

Background: Bone mineral density (BMD) measured by dual-energy x-ray absorptiometry (DXA) is used to assess bone health in kidney transplant recipients (KTR). Trabecular bone score and in vivo microindentation are novel techniques that directly measure trabecular microarchitecture and the mechanical properties of bone at the tissue level, and independently predict fracture risk. We tested the bone status of long-term KTR using all 3 techniques.

Methods: Cross-sectional study including 40 KTR with more than 10 years of follow-up and 94 healthy nontransplanted subjects as controls. BMD was measured at the lumbar spine and the hip. Trabecular bone score was derived with specific software from the lumbar spine DXA scans of 39 KTR and 77 controls. Microindentation was performed on the anterior face of the tibia with a reference-point indenter device. Bone measurements were standardized as a percentage of a reference value and expressed as bone material strength index (BMSi) units. Multivariable (age-, sex- and body mass index-adjusted) linear regression models were fitted to study the association between KTR status and BMD/BMSi/trabecular bone score.

Results: BMD was lower at the lumbar spine (0.925 ± 0.15 vs 0.982 ± 0.14; P = 0.025), total hip (0.792 ± 0.14 vs 0.902 ± 0.13; P < 0.001) and femoral neck (0.667 ± 0.13 vs 0.775 ± 0.12; P < 0.001) in KTR than in controls. BMSi was also lower in KTR (79.1 ± 7.7 vs 82.9 ± 7.8; P = 0.012), although this difference disappeared after adjustment (P = 0.145). Trabecular bone score was borderline lower in KTR (1.21 ± 0.14 vs 1.3 ± 0.15; adjusted P = 0.072).

Conclusions: Despite a persistent decrease in BMD, trabecular microarchitecture and tissue quality remain normal in long-term KTR, suggesting substantial recovery of bone health.


Transplantation | 2017

Strategies for an Expanded Use of Kidneys From Elderly Donors

María José Pérez-Sáez; Nuria Montero; Dolores Redondo-Pachón; Marta Crespo; Julio Pascual

The old-for-old allocation policy used in kidney transplantation (KT) has a confirmed survival benefit compared with remaining listed on dialysis. The shortage of standard donors has stimulated the development of strategies aimed at expanding acceptance criteria, particularly for kidneys from elderly donors. We systematically reviewed the literature on these strategies. In addition to reviewing outcomes of expanded criteria donor or advanced-age kidneys, we assessed the value of the Kidney Donor Profile Index policy, preimplantation biopsy, dual KT, machine perfusion and special immunosuppressive protocols. Survival and functional outcomes achieved with expanded criteria donor, high Kidney Donor Profile Index or advanced-age kidneys are poorer than those with standard kidneys. Outcomes with advanced-age brain-dead and cardiac-dead donor kidneys are similar. Preimplantation biopsies and related scores have been useful for predicting function, but their applicability to the decision to transplant or discard a kidney graft has probably been overestimated. Machine perfusion techniques have decreased delayed graft function and could improve graft survival. Investing 2 kidneys in 1 recipient does not make sense when a single KT would suffice, particularly in elderly recipients. Tailored immunosuppression when transplanting an old kidney may be useful, but no formal trials are available. Old donors constitute an enormous source of useful kidneys, but in many countries their retrieval is infrequent. Accepting the limited but precious functional expectancy of an old kidney, together with a substantial reduction in discard rates, should be generalized to mitigate these limitations.


Transplant Immunology | 2016

Impact of preformed and de novo anti-HLA DP antibodies in renal allograft survival.

Dolores Redondo-Pachón; Julio Pascual; María José Pérez-Sáez; Carmen García; Juan José Hernández; Javier Gimeno; Marisa Mir; Marta Crespo

The influence of antibodies against HLA-DP antigens detected with solid-phase assays on graft survival after kidney transplantation (KT) is uncertain. We evaluated with Luminex® the prevalence of pre- and posttransplant DP antibodies in 440 KT patients and their impact on graft survival. Among 291 patients with available pretransplant samples, DP antibodies were present in 39.7% of those with pretransplant HLA antibodies and 47.7% of those with DSA. Graft survival of KT patients with pretransplant class II DSA was worse than that of patients with non-DSA antibodies (p=0.01). DP antibodies did not influence graft survival. Of 346 patients monitored after KT, 17.1% had HLA class II antibodies, 56% of whom had DP antibodies. Class II DSA were detected in 39%, and 60.9% of these patients had DP antibodies. Graft survival was worse in patients with class II DSA (p=0.022); DP antibodies did not change these results. Isolated DP antibodies were a rare event both pre- and posttransplantation (1.03% and 0.86%). The presence of pretransplant or posttransplant DSA has a negative impact on graft survival; however, the presence of DP antibodies does not significantly modify this impact.


Nefrologia | 2016

El Kidney Donor Profile Index: ¿se puede extrapolar a nuestro entorno?

Julio Pascual; María José Pérez-Sáez

We frequently face the dilemma of whether or not to accept an apparently suboptimal kidney for a patient who has spent a certain amount of time on dialysis and wishes to be transplanted. The evaluation of kidney "quality" remains highly controversial. The simplest concept is age. Donor age is a factor that limits kidney survival and, although we know that the older the donor, the worse the survival [1], we also know that older kidneys can be very advantageous for patients compared with remaining on dialysis without a transplant [2,3]. In the early 2000s, the concept of the "expanded criteria donor" (ECD) was developed in the United States; in addition to age, it included three other clinical variables: pre-retrieval serum creatinine, cause of death (cerebrovascular or not) and previous history of arterial hypertension [4]. An ECD kidney had a survival between 70% and 168% worse than a kidney from a standard criteria donor (SCD). This distinction has been used worldwide for more than a decade, although in Spain it never went as far as developing a specific informed consent for this type of kidney, which in many programs constitutes more than half of those available. In our setting, the use of the ECD-SCD distinction has been limited to scientific questions, with no real impact at the clinical or operational level. Interestingly, in both the United States and Spain, the most frequent reason for not using a retrieved kidney is the histological study of the preimplantation biopsy, which does not provide

Collaboration


Dive into María José Pérez-Sáez's collaborations.

Top co-authors:
- Julio Pascual (University of Wisconsin-Madison)
- Marisa Mir (Autonomous University of Barcelona)
- Dolores Redondo-Pachón (Autonomous University of Barcelona)
- Carla Burballa (Autonomous University of Barcelona)
- Carlos Arias-Cabrales (Autonomous University of Barcelona)
- Javier Gimeno (Autonomous University of Barcelona)
- José Portolés (Instituto de Salud Carlos III)
- Sheila Bermejo (Autonomous University of Barcelona)
- A Diez-Perez (Autonomous University of Barcelona)