Friday, August 16, 2019

Incidence and Risks for Non Alcoholic Fatty Liver Disease and Steatohepatitis Post Liver Transplant: Systematic Review and Meta-analysis
Background: The true incidence and unique risk factors for recurrent and de novo nonalcoholic fatty liver disease (NAFLD) and nonalcoholic steatohepatitis (NASH) post liver transplant (LT) remain poorly characterized. We aimed to identify the incidence and risk factors for recurrent and de novo NAFLD/NASH post-LT.

Methods: MEDLINE via PubMed, Embase, Scopus and CINAHL were searched for studies from 2000-2018. Risk of bias was adjudicated using the Newcastle-Ottawa Scale.

Results: Seventeen studies representing 2,378 patients were included. All were retrospective analyses of patients with post-LT liver biopsies, with the exception of two studies that used imaging for outcome assessment. Seven studies evaluated recurrent disease, three de novo disease, and seven included both. In studies at generally high or moderate risk of bias, mean 1-, 3- and ≥5-year incidence rates may be 59%, 57%, and 82% for recurrent NAFLD and 67%, 40%, and 78% for de novo NAFLD. Mean 1-, 3- and ≥5-year incidence rates may be 53%, 57.4%, and 38% for recurrent NASH and 13%, 16%, and 17% for de novo NASH. Multivariate analysis demonstrated that post-LT BMI (summarized OR 1.27) and hyperlipidemia were the most consistent predictors of outcomes.

Conclusions: There is low confidence in the incidence of recurrent and de novo NAFLD and NASH after LT due to study heterogeneity. Recurrent and de novo NAFLD may occur in over half of recipients as soon as 1 year after LT. NASH recurs in most patients after LT, while de novo NASH occurs rarely. NAFLD/NASH after LT are associated with metabolic risk factors.

Disclosure: The authors declare no conflicts of interest.
Funding: AASLD Clinical Translational Outcomes and Research Award (MT)
Corresponding Author: Monica A. Tincopa, MD MSc; University of Michigan Health System, Division of Gastroenterology and Hepatology; 3912 Taubman Center, 1500 East Medical Center Drive, SPC 5362; Ann Arbor, Michigan, 48109, USA. Email: tincopa@med.umich.edu
Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
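The abstract does not state how the per-study effect estimates were combined into the "summarized OR"; purely as an illustration, the sketch below pools hypothetical log-odds ratios with a DerSimonian-Laird random-effects model, one common approach in meta-analysis. The study values are made up and are not the studies in this review.

```python
import numpy as np

def pool_random_effects(log_or, var):
    """DerSimonian-Laird random-effects pooling of log odds ratios."""
    log_or = np.asarray(log_or, dtype=float)
    var = np.asarray(var, dtype=float)
    w = 1.0 / var                                   # fixed-effect weights
    y_fixed = np.sum(w * log_or) / np.sum(w)        # fixed-effect estimate
    q = np.sum(w * (log_or - y_fixed) ** 2)         # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(log_or) - 1)) / c)    # between-study variance
    w_star = 1.0 / (var + tau2)                     # random-effects weights
    pooled = np.sum(w_star * log_or) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)

# Hypothetical per-study ORs for post-LT BMI (illustrative values only)
ors = [1.20, 1.35, 1.18, 1.40]
ses = [0.10, 0.15, 0.12, 0.20]                      # standard errors of log(OR)
pooled_or, lo, hi = pool_random_effects(np.log(ors), np.square(ses))
print(f"Pooled OR {pooled_or:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```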
mTOR Inhibitor Therapy Diminishes Circulating CD8+ CD28- Effector Memory T Cells and Improves Allograft Inflammation in Belatacept-Refractory Renal Allograft Rejection
Background: Renal allograft rejection is more frequent under belatacept-based than under tacrolimus-based immunosuppression. We studied kidney transplant recipients experiencing rejection under belatacept-based early corticosteroid withdrawal following T cell-depleting induction in a recent randomized trial (BEST Trial, clinicaltrials.gov #NCT01729494) to determine mechanisms of rejection and treatment.

Methods: Peripheral mononuclear cells, serum creatinine levels, and renal biopsies were collected from 8 patients undergoing belatacept-refractory rejection. We used flow cytometry, histology, and immunofluorescence to characterize CD8+ effector memory T cell (TEM) populations in the periphery and graft before and after mammalian target of rapamycin (mTOR) inhibition.

Results: We found that patients with belatacept-refractory rejection (BRR) did not respond to standard antirejection therapy and had a substantial increase in alloreactive CD8+ T cells with a CD28low/DRhi/CD38hi/CD45RO+ TEM phenotype. These cells had increased activation of the mTOR pathway, as assessed by phosphorylated ribosomal protein S6 (p-RPS6) expression. Notably, everolimus (an mTOR inhibitor) treatment of patients with BRR halted the in vivo proliferation of TEM cells and their ex vivo alloreactivity, and resulted in their significant reduction in the peripheral blood. The frequency of circulating FoxP3+ regulatory T cells was not altered. Importantly, everolimus led to rapid resolution of rejection as confirmed by histology.

Conclusions: While prior work has shown that concomitant belatacept + mTOR inhibitor therapy is effective for maintenance immunosuppression, our preliminary data suggest that everolimus may provide an available means of “rescue” therapy for rejections occurring under belatacept that are refractory to traditional antirejection therapy with corticosteroids and polyclonal antilymphocyte globulin.

* These two authors contributed equally.
DISCLOSURE: The authors of this manuscript have no conflict of interest to disclose.
FUNDING: This research was supported by an Academic Research Center grant for Cincinnati Children’s Hospital, a collaborative pilot grant from the Institutional Clinical and Translational Science Award from the NIH (UL1TR000077), and NIH grant R21 AI142264 (D.A.H.).
Corresponding author emails: David A. Hildeman (David.Hildeman@cchmc.org) and E. Steve Woodle (woodles@ucmail.uc.edu)
Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
The First Asian Kidney Transplantation Prediction Models for Long-Term Patient and Allograft Survival
Introduction: Several kidney transplantation (KT) prediction models for patient and graft outcomes have been developed based on Caucasian populations. However, KT in Asian countries differs in patient characteristics and practices. To date, no equation has been developed for predicting outcomes among Asian KT recipients.

Methods: We developed equations for predicting 5- and 10-year patient survival (PS) and death-censored graft survival (DCGS) based on 6,662 patients in the Thai Transplant Registry. The cohort was divided into training and validation datasets. We identified factors significantly associated with outcomes by Cox regression. In the validation dataset, we also compared our models with another model based on KT in the United States (US).

Results: Variables included in the DCGS and PS models were recipient and donor age, background kidney disease, dialysis vintage, donor hepatitis C virus status, cardiovascular diseases, panel reactive antibody, donor type, donor creatinine, ischemic time, and immunosuppression regimen. The C-statistics of our models in the validation dataset were 0.69 (0.66-0.71) for DCGS and 0.64 (0.59-0.68) for PS. Our models performed better than a model based on US patients. Compared with tacrolimus, KT recipients aged ≤44 years receiving cyclosporine A (CsA) had a higher risk of graft loss (adjusted HR 1.26, p=0.046). The risk of death was higher in recipients aged >44 years taking CsA (adjusted HR 1.44, p=0.011).

Conclusions: Our prediction model, the first based on an Asian population, can be used immediately after transplantation. The model can be accessed at www.nephrochula.com/ktmodels.

Conflict of interest: All authors declare no conflict of interest.
Funding: This study received a grant from the Thai Transplantation Society (approval number 41/2561).
Corresponding author: Natavudh Townamchai, MD, Division of Nephrology, Department of Medicine, Faculty of Medicine, Chulalongkorn University and King Chulalongkorn Memorial Hospital, Bangkok, Thailand 10330. Tel: +66817149750, Email: ntownamchai@gmail.com
Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
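For readers unfamiliar with how such equations are built and validated, the sketch below fits a Cox proportional hazards model on a training split and reports Harrell's C-statistic on a held-out split using the lifelines library. The data and column names are hypothetical stand-ins for a few of the registry variables listed in the abstract, not the authors' actual model or dataset.

```python
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

# Hypothetical registry extract: follow-up time (years), graft-loss flag, and two
# of the predictors named in the abstract (recipient age, donor age).
df = pd.DataFrame({
    "time_years":    [1.2, 5.0, 10.0, 3.4, 8.7, 0.9, 6.5, 10.0, 2.2, 7.1, 4.8, 9.3],
    "graft_loss":    [1,   0,   0,    1,   0,   1,   1,   0,    1,   0,   1,   0],
    "recipient_age": [44,  31,  52,   60,  28,  47,  39,  55,   62,  35,  58,  41],
    "donor_age":     [50,  29,  61,   45,  33,  58,  40,  62,   66,  30,  57,  38],
})

train, valid = df.iloc[:8], df.iloc[8:]

# A small ridge penalty keeps the fit stable on this tiny illustrative sample.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(train, duration_col="time_years", event_col="graft_loss")

# Harrell's C-statistic on the held-out rows: a higher predicted hazard should
# correspond to earlier graft loss, hence the negated score.
c_stat = concordance_index(valid["time_years"],
                           -cph.predict_partial_hazard(valid),
                           valid["graft_loss"])
print(f"Validation C-statistic: {c_stat:.2f}")
```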
The TOMATO study (TacrOlimus MetabolizAtion in kidney TransplantatiOn): impact of the concentration-dose ratio on death-censored graft survival
Background: Tacrolimus trough concentrations (mean and variability), as well as the concentration-to-dose ratio (CDr), affect kidney allograft outcomes. We investigated the link between the CDr and death-censored kidney graft survival (DCGS).

Methods: We performed a retrospective study of 1029 kidney transplant patients (2004-2016) meeting the following criteria: tacrolimus-based immunosuppression, >1-year graft survival, no initial use of everolimus, and available anti-HLA antibody data. We analyzed the impact of the time-varying CDr on DCGS. Fast metabolizers were defined by a CDr <1.05. We also investigated the effect of an early (Month 3 to Month 6 posttransplantation) CDr below 1.05. Cox survival analyses were performed, adjusting for potential confounders (tacrolimus trough level, variability of the tacrolimus trough level, de novo DSA development, CYP3A5 genotype, pregraft sensitization, and Month 3 GFR).

Results: Time-varying CDr was significantly associated with DCGS (HR 2.35, p<0.001) in a univariate model on the full analysis set of 1029 patients. In the multivariate time-varying model, based on 666 patients with available CYP3A5 genotypes, the effect of the CDr remained significant (HR 2.26, p=0.015), even when Month 3 GFR <30 mL/min/1.73 m2 (HR 2.61, p=0.011), dnDSA development (HR 4.09, p<0.001), and continued steroid prescription (HR 2.08, p=0.014) were taken into account (other covariates, including tacrolimus trough concentrations, were nonsignificant). In the same multivariate model, early CDr (median of the Month 3 and Month 6 values) remained significantly associated with DCGS (HR 2.25, p=0.041).

Conclusion: CDr is an independent and early predictor of DCGS. Identification of fast metabolizers could be a strategy to improve graft survival, e.g., by optimizing the tacrolimus formulation. Mechanistic studies to understand the CDr effect are required.

Disclosure: The authors declare no conflict of interest.
Corresponding author: Lionel Rostaing, MD, PhD, Service de Néphrologie, dialyse, aphérèses et transplantation rénale, CHU Grenoble Alpes, CS 10217, 38043 Grenoble Cedex 09, France. Tel: +33 4 76 76 54 60, Fax: +33 4 76 76 52 63, Email: lrostaing@chu-grenoble.fr
Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
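As a point of reference, the concentration-to-dose ratio is the tacrolimus trough concentration divided by the total daily dose. The sketch below computes it and applies the study's 1.05 cutoff for fast metabolizers; the unit convention, (ng/mL) per mg/day, is the one commonly used in this literature and is an assumption here, as is the example patient.

```python
def concentration_dose_ratio(trough_ng_ml: float, daily_dose_mg: float) -> float:
    """Tacrolimus concentration-to-dose ratio (CDr), in (ng/mL) / (mg/day)."""
    if daily_dose_mg <= 0:
        raise ValueError("daily dose must be positive")
    return trough_ng_ml / daily_dose_mg

def is_fast_metabolizer(cdr: float, cutoff: float = 1.05) -> bool:
    """Fast metabolizer as defined in the study: CDr below 1.05."""
    return cdr < cutoff

# Hypothetical patient: trough 6 ng/mL on 8 mg/day -> CDr 0.75 -> fast metabolizer
cdr = concentration_dose_ratio(trough_ng_ml=6.0, daily_dose_mg=8.0)
print(f"CDr = {cdr:.2f}, fast metabolizer: {is_fast_metabolizer(cdr)}")
```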
Poor Outcomes with the Use of Checkpoint Inhibitors in Kidney Transplant Recipients.
Background: Checkpoint inhibitors are now frequently used for oncologic conditions. The impact of these therapies in solid organ transplant recipients was not assessed in clinical trials. Subsequent case reports highlight the major detrimental interactions of checkpoint inhibitors and the high risk of allograft rejection with their use, although patient outcomes have not yet been assessed in long-term follow-up.

Methods: We conducted a retrospective review of kidney transplant recipients with metastatic cancer who received checkpoint inhibitors at a single center between April 2015 and May 2018.

Results: Six kidney transplant recipients with metastatic cancers that were not responding to first-line treatments met study criteria: two with squamous cell cancers, two with melanoma, one with renal cell cancer, and one with adenocarcinoma of the lung. Four patients received an anti-programmed cell death protein-1 (PD-1) antibody and two received a combination of anti-cytotoxic T-lymphocyte-associated protein 4 (CTLA-4) and anti-PD-1 antibodies. Three of the six patients developed acute kidney injury. Two were biopsy-proven acute rejections with subsequent graft failures. The third was attributed to rejection but improved after discontinuing the checkpoint inhibitor. Five of the six patients had cancer progression and only one patient had remission.

Conclusion: Providers and patients need to be aware of the high risk of rejection and the poor remission rate with the use of checkpoint inhibitors in kidney transplant patients. More research is warranted to assess the optimal maintenance immunosuppression during checkpoint inhibitor therapy that would not diminish the chances of remission.

DISCLOSURE: The authors declare no conflicts of interest.
Corresponding Author: Tarek Alhamad, MD, MS, Division of Nephrology, Department of Internal Medicine, Transplant Epidemiology Research Collaboration (TERC), Institute of Public Health, Washington University School of Medicine in St. Louis, St. Louis, MO. Email: talhamad@wustl.edu
Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
Donor-recipient sex differences do not affect survival outcomes after kidney transplantation: a population cohort study
Background. Donor factors can influence decision-making about organ utilisation for potential kidney transplant candidates. Prior studies exploring the effect of donor-recipient sex matching on kidney transplant outcomes have reported heterogeneous and conflicting results. The aim of this contemporary population-cohort analysis was to explore the effect of donor-recipient sex matching on kidney transplant outcomes in the United Kingdom.

Methods. In this retrospective, observational study we analysed all patients receiving kidney-alone transplants between 2003 and 2018 using UK Transplant Registry data. Stratified by recipient sex, outcomes were compared between male and female donors with univariable and multivariable analyses.

Results. Data were analysed for 25,140 recipients. Of these, 13,414 (53.4%) kidneys were from male donors and 15,690 (62.4%) recipients were male. The odds of initial graft dysfunction (delayed graft function/primary nonfunction) were significantly lower for female donor kidneys transplanted into both male (adjusted OR 0.89, 95% CI 0.80-0.98, p=0.019) and female (adjusted OR 0.81, 95% CI 0.71-0.93, p=0.003) recipients. Male recipients of female donor kidneys had creatinine levels at one year that were 6.3% higher (95% CI 4.8%-7.7%, p<0.001) than male recipients of male donor kidneys, with a similar sex difference of 4.1% (95% CI 2.1%-6.1%, p<0.001) observed within female recipients. However, neither patient nor graft survival differed significantly by donor sex on either univariable or multivariable analysis.

Conclusions. Our study provides contemporary data on sex mismatch for recipient counselling and offers reassurance regarding equivalent long-term clinical outcomes irrespective of donor sex.

Disclosures: The authors have no relevant disclosures to make.
Funding: No funding to disclose.
Corresponding author: Dr Adnan Sharif, Department of Nephrology and Transplantation, Queen Elizabeth Hospital, Edgbaston, Birmingham, B15 2WB, United Kingdom. Phone: 0121 371 5861, Fax: 0121 472 4942, Email: adnan.sharif@uhb.nhs.uk, Twitter: @AdnanSharif1979
Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
Tacrolimus Intra-Patient Variability, Time in Therapeutic Range, and Risk of De Novo Donor-Specific Antibodies
Background: Tacrolimus (TAC) is the most important agent for maintenance immunosuppression and prevention of immunologic injury to the renal allograft, yet there remains no consensus on how best to monitor drug therapy. Both high TAC intra-patient variability and low TAC time in therapeutic range (TTR) have been associated with risk of de novo donor-specific antibodies (dnDSA). In this study, we hypothesized that the risk associated with a high TAC coefficient of variation (CV) is a result of low TAC TTR rather than the variability itself.

Methods: We analyzed the risk of de novo donor-specific antibodies, acute rejection, and death-censored graft loss by non-dose-corrected TAC CV and TAC TTR during the first posttransplant year in a cohort of 538 patients with a median follow-up of 4.1 years.

Results: Patients with CV >44.2% and TTR <40% (high intra-patient variability and low time in therapeutic range) had a higher risk of dnDSA (aOR 4.93, 95% CI 2.02-12.06, p<0.001) and death-censored graft loss by 5 years (aHR 4.00, 95% CI 1.31-12.24, p=0.015) than patients with CV >44.2% and TTR >40% (high intra-patient variability and optimal time in therapeutic range), while the latter patients had similar risk to patients with CV <44.2% (lower intra-patient variability).

Conclusion: These data suggest that the previously reported immunologic risk associated with high TAC intra-patient variability is due to time outside of the therapeutic range rather than variability in and of itself, when absolute non-dose-corrected TAC levels are evaluated irrespective of reason or indication.

Discussion: There is growing recognition that intermediate- and long-term allograft loss is largely due to immunologic injury, highlighting the need to better understand immunosuppression dosing and methods of monitoring drug therapy to improve graft survival. Many publications in the past several years have evaluated the risk of tacrolimus intra-patient variability, often expressed as the tacrolimus coefficient of variation, on adverse outcomes. In this manuscript, we offer evidence that the risks previously associated with tacrolimus intra-patient variability are due to low tacrolimus time in therapeutic range and not the variability itself, and we demonstrate that time in therapeutic range may be a clinically practical tool for monitoring therapy.

Disclosure: The authors of this manuscript have no potential conflicts of interest to disclose.
Funding: The authors of this manuscript have no funding for this work to disclose.
Corresponding Author: Scott Davis, MD, scott.davis@ucdenver.edu, Mail Stop F749, Anschutz Outpatient Pavilion, 1635 Aurora Court, 7th Floor, Aurora, CO 80045
Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
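To make the two exposure metrics concrete, the sketch below computes the coefficient of variation of tacrolimus trough levels and a Rosendaal-style, linearly interpolated time in therapeutic range. The 6-10 ng/mL target range, the interpolation method, and the trough series are illustrative assumptions, since the abstract does not specify them; the 44.2% and 40% cutoffs are those reported in the study.

```python
import numpy as np

def coefficient_of_variation(troughs):
    """CV (%) of non-dose-corrected tacrolimus trough levels."""
    troughs = np.asarray(troughs, dtype=float)
    return 100.0 * troughs.std(ddof=1) / troughs.mean()

def time_in_therapeutic_range(days, troughs, low, high):
    """Approximate Rosendaal-style TTR (%): linearly interpolate a daily grid
    between consecutive troughs and count the fraction inside [low, high]."""
    days = np.asarray(days, dtype=float)
    troughs = np.asarray(troughs, dtype=float)
    in_range, total = 0.0, 0.0
    for i in range(len(days) - 1):
        span = np.linspace(troughs[i], troughs[i + 1],
                           int(days[i + 1] - days[i]) + 1)
        in_range += np.sum((span >= low) & (span <= high))
        total += len(span)
    return 100.0 * in_range / total

# Hypothetical first-year trough series (day, ng/mL); target range 6-10 ng/mL assumed
days    = [14, 30, 60, 90, 150, 210, 270, 365]
troughs = [12.1, 9.0, 5.2, 7.8, 11.5, 6.4, 4.9, 8.2]

cv = coefficient_of_variation(troughs)
ttr = time_in_therapeutic_range(days, troughs, low=6.0, high=10.0)
print(f"CV = {cv:.1f}% (high if > 44.2%), TTR = {ttr:.1f}% (low if < 40%)")
```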
Delayed graft function in simultaneous liver kidney transplantation
Background: Delayed graft function (DGF) is associated with inferior posttransplant outcomes in kidney transplantation. Given these adverse outcomes, we sought to determine the incidence, unique risk factors, and posttransplant outcomes for simultaneous liver kidney (SLK) transplant recipients developing DGF.

Methods: We studied 6214 adult SLK recipients from March 2002 to February 2017 using the Scientific Registry of Transplant Recipients. We determined associations between risk factors and DGF using Poisson multivariate regression, and between DGF and graft failure and mortality using Cox proportional hazards analysis.

Results: The overall rate of DGF was 21.8%. Risk factors for DGF in the HCV-negative recipient population included pretransplant dialysis (aIRR 3.26, p=0.004), donor BMI (aIRR 1.25 per 5 kg/m2, p=0.01), and transplantation with a donation after circulatory death (DCD) organ (aIRR 5.38, p=0.001) or an imported donor organ (regional share aIRR 1.69, p=0.03; national share aIRR 4.82, p<0.001). DGF was associated with a 2.6-fold increase in kidney graft failure (aHR 2.63, p<0.001), a 1.6-fold increase in liver graft failure (aHR 1.62, p<0.001), and a 1.6-fold increase in mortality (aHR 1.62, p<0.001).

Conclusions: In HCV-negative SLK recipients, pretransplant dialysis and components of kidney graft quality are significant risk factors for DGF. Regardless of HCV status, DGF is associated with inferior posttransplant outcomes. Understanding these risk factors during clinical decision making may improve prevention of DGF and may represent an opportunity to improve posttransplant outcomes.

ACKNOWLEDGMENTS: Funding for this study was provided by the National Institute of Diabetes and Digestive and Kidney Diseases and the National Institute on Aging: grant numbers K24DK101828 (PI: Dorry Segev), K23DK115908 (PI: Jacqueline Garonzik Wang), and F32AG053025 (PI: Christine Haugen). The analyses described here are the responsibility of the authors alone and do not necessarily reflect the views or policies of the Department of Health and Human Services, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. Government. The data reported here have been supplied by the Minneapolis Medical Research Foundation (MMRF) as the contractor for the Scientific Registry of Transplant Recipients (SRTR). The interpretation and reporting of these data are the responsibility of the author(s) and in no way should be seen as an official policy of or interpretation by the SRTR or the U.S. Government.

DISCLOSURE: The authors of this manuscript have no conflicts of interest to disclose.
Corresponding author: Jacqueline Garonzik Wang, M.D., Ph.D., Assistant Professor, Department of Surgery, Johns Hopkins Medical Institutions, 720 Rutland Ave, Ross 771, Baltimore, MD 21205. Tel: 410-502-5198, Fax: 410-510-1514, Email: jgaronz1@jhmi.edu
Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
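The adjusted incidence rate ratios quoted above come from multivariate Poisson regression on a binary outcome (DGF). The sketch below shows how such a "modified Poisson" model with robust standard errors can be fit in statsmodels; the simulated recipient-level data and coefficient values are hypothetical, not the SRTR extract or the study's model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Hypothetical SLK-style covariates; dgf is the binary outcome of interest
df = pd.DataFrame({
    "pretransplant_dialysis": rng.integers(0, 2, n),
    "donor_bmi":              rng.normal(27, 5, n),
    "dcd_donor":              rng.integers(0, 2, n),
})
linpred = (-1.5 + 0.9 * df["pretransplant_dialysis"]
           + 0.03 * (df["donor_bmi"] - 27) + 1.0 * df["dcd_donor"])
df["dgf"] = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

# Modified Poisson regression: log link with robust (sandwich) variance
X = sm.add_constant(df[["pretransplant_dialysis", "donor_bmi", "dcd_donor"]])
model = sm.GLM(df["dgf"], X, family=sm.families.Poisson()).fit(cov_type="HC1")

irr = np.exp(model.params)       # exponentiated coefficients = adjusted rate ratios
print(irr.round(2))
```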
Cold pulsatile machine perfusion versus static cold storage for kidneys donated after circulatory death: a multicenter randomized controlled trial
Background: The benefits of cold pulsatile machine perfusion for the storage and transportation of kidneys donated after circulatory death (DCD) are disputed. We conducted a UK-based multicenter, randomized controlled trial to compare outcomes of kidneys stored with machine perfusion (MP) versus static cold storage (CS).

Methods: 51 pairs of DCD donor kidneys were randomly allocated to static cold storage or cold pulsatile machine perfusion. The primary endpoint, delayed graft function (DGF), was analyzed on an intention-to-treat basis.

Results: There was no difference in the incidence of DGF between CS and MP (32/51 [62.8%] vs 30/51 [58.8%], p=0.69), although the trial stopped early because of difficulty with recruitment. There was no difference in the incidence of acute rejection, or in graft or patient survival, between the CS and MP groups. Median eGFR at 3 months following transplantation was significantly lower in the CS group than in the MP group (CS 34 mL/min, IQR 26-44 vs MP 45 mL/min, IQR 36-60; p=0.006), although there was no significant difference in eGFR between CS and MP at 12 months posttransplant.

Conclusion: This study is underpowered, which limits definitive conclusions about the use of machine perfusion as an alternative to static cold storage. It did not demonstrate that the use of machine perfusion reduces the incidence of delayed graft function in donation after circulatory death kidney transplantation.

ISRCTN Trial Number: ISRCTN50082383
Conflicts of Interest: LVR is a current employee of Organox Ltd. CJEW has served on an advisory board for GSK and has received hospitality from TEVA and Organox.
Funding: This trial was supported by a PhD grant (UKT07-02) from NHS Blood and Transplant. Additional support was provided by NIHR BRC Cambridge, and the study was on the NIHR Portfolio.
Corresponding Author: Dr Dominic Summers, Department of Surgery, Box 202, Addenbrooke’s Hospital, Cambridge, CB2 0QQ. Email: dms39@cam.ac.uk
Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
Seeing the Forest for the Trees: Random Forest Models for Predicting Survival in Kidney Transplant Recipients
No abstract available
