Sunday, August 11, 2019

Delayed graft function in simultaneous liver kidney transplantation
Background: Delayed graft function (DGF) is associated with inferior posttransplant outcomes in kidney transplantation. Given these adverse outcomes, we sought to determine the incidence, unique risk factors, and posttransplant outcomes for simultaneous liver kidney (SLK) transplant recipients developing DGF. Methods: We studied 6214 adult SLK recipients from March 2002 to February 2017 using the Scientific Registry of Transplant Recipients. We determined associations between risk factors and DGF using Poisson multivariate regression and between DGF and graft failure and mortality using Cox proportional hazards analysis. Results: The overall rate of DGF was 21.8%. Risk factors for DGF in the HCV-negative recipient population included pretransplant dialysis (aIRR 3.26, p=0.004), donor BMI (aIRR 1.25 per 5 kg/m2, p=0.01) and transplantation with a donation after circulatory death (DCD) organ (aIRR 5.38, p=0.001) or imported donor organ (regional share aIRR 1.69, p=0.03; national share aIRR 4.82, p<0.001). DGF was associated with a 2.6-fold increase in kidney graft failure (aHR 2.63, p<0.001), 1.6-fold increase in liver graft failure (aHR 1.62, p<0.001), and 1.6-fold increase in mortality (aHR 1.62, p<0.001). Conclusions: In HCV-negative SLK recipients, recipient pretransplant dialysis and components of kidney graft quality comprise significant risk factors for DGF. Regardless of HCV status, DGF is associated with inferior posttransplant outcomes. Understanding these risk factors during clinical decision making may improve prevention of DGF and may represent an opportunity to improve posttransplant outcomes. ACKNOWLEDGMENTS: Funding for this study was provided by the National Institute of Diabetes and Digestive and Kidney Disease and the National Institute on Aging: grant numbers K24DK101828 (PI: Dorry Segev), K23DK115908 (PI: Jacqueline Garonzik Wang), and F32AG053025 (PI: Christine Haugen).
The analyses described here are the responsibility of the authors alone and do not necessarily reflect the views or policies of the Department of Health and Human Services, nor does mention of trade names, commercial products or organizations imply endorsement by the U.S. Government. The data reported here have been supplied by the Minneapolis Medical Research Foundation (MMRF) as the contractor for the Scientific Registry of Transplant Recipients (SRTR). The interpretation and reporting of these data are the responsibility of the author(s) and in no way should be seen as an official policy of or interpretation by the SRTR or the U.S. Government. DISCLOSURE: The authors of this manuscript have no conflicts of interest to disclose. Corresponding author: Jacqueline Garonzik Wang, M.D., Ph.D., Assistant Professor, Department of Surgery, Johns Hopkins Medical Institutions, 720 Rutland Ave, Ross 771, Baltimore, MD 21205, 410-502-5198 (tel) 410-510-1514 (fax), jgaronz1@jhmi.edu Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
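The adjusted incidence-rate ratios (aIRRs) above come from multivariate Poisson regression on registry data; the crude, unadjusted counterpart can be computed directly from event counts. A minimal sketch with hypothetical counts (not the SRTR data), illustrating the ratio the regression adjusts:

```python
import math

def rate_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Crude incidence-rate ratio with a 95% CI on the log scale.

    This is the unadjusted analogue of the aIRRs reported above; the
    study itself used multivariate Poisson regression to adjust for
    confounders.
    """
    rr = (events_exposed / n_exposed) / (events_unexposed / n_unexposed)
    # Standard error of log(RR) for count data
    se = math.sqrt(1 / events_exposed - 1 / n_exposed
                   + 1 / events_unexposed - 1 / n_unexposed)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical example: 30/100 DGF with pretransplant dialysis vs 10/100 without
rr, lo, hi = rate_ratio(30, 100, 10, 100)
print(f"IRR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The confidence interval is built on the log scale because the sampling distribution of a ratio is skewed; exponentiating the symmetric log-scale interval gives the familiar asymmetric CI around the IRR.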
Cold pulsatile machine perfusion versus static cold storage for kidneys donated after circulatory death: a multicenter randomized controlled trial
Background: The benefits of cold pulsatile machine perfusion for the storage and transportation of kidneys donated after circulatory death (DCD) are disputed. We conducted a UK-based multicenter, randomized controlled trial to compare outcomes of kidneys stored with machine perfusion (MP) versus static cold storage (CS). Methods: 51 pairs of DCD donor kidneys were randomly allocated to receive static cold storage or cold pulsatile machine perfusion. The primary endpoint, delayed graft function (DGF), was analyzed on an intention-to-treat basis. Results: There was no difference in the incidence of DGF between CS and MP (32/51 [62.8%] vs 30/51 [58.8%], p = 0.69), although the trial stopped early due to difficulty with recruitment. There was no difference in the incidence of acute rejection, or in graft or patient survival between the CS and MP groups. Median eGFR at 3 months following transplantation was significantly lower in the CS group compared to MP (CS 34 mL/min IQR 26-44 vs MP 45 mL/min IQR 36-60, p = 0.006), although there was no significant difference in eGFR between CS and MP at 12 months posttransplant. Conclusion: This study is underpowered, which limits definitive conclusions about the use of machine perfusion as an alternative to static cold storage. It did not demonstrate that the use of machine perfusion reduces the incidence of delayed graft function in donation after circulatory death kidney transplantation. ISRCTN Trial Number: ISRCTN50082383 Conflicts of Interest: LVR is a current employee of Organox Ltd. CJEW has served on an advisory board for GSK and has received hospitality from TEVA and Organox. Funding: This trial was supported by a PhD grant (UKT07-02) from NHS Blood and Transplant. Additional support was provided by NIHR BRC Cambridge and the study was on the NIHR Portfolio.
Corresponding Author: Dr Dominic Summers, Department of Surgery, Box 202, Addenbrooke’s Hospital, Cambridge, CB2 0QQ, dms39@cam.ac.uk Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
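The primary endpoint comparison above (DGF in 32/51 CS vs 30/51 MP kidneys) can be reproduced with a standard pooled two-proportion z-test. A sketch using only the counts reported in the abstract (the trial may have used a different test, so this is illustrative):

```python
import math

def two_proportion_test(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF via the error function
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# DGF incidence: cold storage 32/51 vs machine perfusion 30/51
z, p = two_proportion_test(32, 51, 30, 51)
print(f"z = {z:.2f}, p = {p:.2f}")  # close to the reported p = 0.69
```

With only 51 kidneys per arm, a 4-percentage-point difference in DGF gives a z-statistic near 0.4, which is why the trial's early stop for recruitment leaves it underpowered to detect modest effects.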
Seeing the Forest for the Trees: Random Forest Models for Predicting Survival in Kidney Transplant Recipients
No abstract available
Equity-efficiency trade-offs associated with alternative approaches to deceased donor kidney allocation: a patient-level simulation
Background: The number of patients waiting to receive a kidney transplant outstrips the supply of donor organs. We sought to quantify trade-offs associated with different approaches to deceased donor kidney allocation in terms of quality-adjusted life years (QALYs), costs and access to transplantation. Methods: An individual patient simulation model was developed to compare five different approaches to kidney allocation, including the 2006 UK National Kidney Allocation Scheme (NKAS) and a QALY-maximisation approach designed to maximise health gains from a limited supply of donor organs. We used various sources of patient-level data to develop multivariable regression models to predict survival, health-state utilities and costs. We simulated the allocation of kidneys from 2200 deceased donors to a waiting list of 5500 patients and produced estimates of total lifetime costs and QALYs for each allocation scheme. Results: Among patients who received a transplant, the QALY-maximisation approach generated 48,045 QALYs and cost £681 million while the 2006 NKAS generated 44,040 QALYs and cost £625 million. When also taking into consideration outcomes for patients who were not prioritised to receive a transplant, the 2006 NKAS produced higher total QALYs and costs and an incremental cost-effectiveness ratio of £110,741/QALY compared to the QALY-maximisation approach. Conclusions: Compared to the 2006 NKAS, a QALY-maximisation approach makes more efficient use of deceased donor kidneys but reduces access to transplantation for older patients and results in greater inequity in the distribution of health gains between patients who receive a transplant and patients who remain on the waiting list. Disclosure: The authors declare no conflicts of interest. Funding: This work was funded by a National Institute for Health Research (NIHR) Programme Grant for Applied Research (RP-PG-0109-10116).
CORRESPONDENCE TO: Bernadette Li, Department of Health Services Research and Policy, London School of Hygiene and Tropical Medicine, 15-17 Tavistock Place, London WC1H 9SH, UK. E-mail: bernadette.li@lshtm.ac.uk Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
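The trade-off above is summarised by the incremental cost-effectiveness ratio (ICER): the extra cost per extra QALY of one allocation scheme over another. A minimal sketch using the recipient-only figures quoted in the abstract for illustration (the £110,741/QALY figure also counts patients left on the waiting list, whose totals are not given here):

```python
def icer(cost_a, qaly_a, cost_b, qaly_b):
    """Incremental cost-effectiveness ratio of scheme A over scheme B (cost per QALY)."""
    return (cost_a - cost_b) / (qaly_a - qaly_b)

# Recipient-only totals from the abstract:
# QALY-maximisation: 48,045 QALYs at £681 million; 2006 NKAS: 44,040 QALYs at £625 million
value = icer(681e6, 48045, 625e6, 44040)
print(f"£{value:,.0f} per QALY gained")  # QALY-max vs NKAS, recipients only
```

Note the sign convention matters: among recipients only, QALY-maximisation is the more costly and more effective scheme, whereas over the whole simulated population the abstract reports the opposite ordering, which is why the full-population ICER is attributed to the 2006 NKAS.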
Patient survival after kidney transplantation: important role of graft-sustaining factors as determined by predictive modeling using random survival forest analysis
Background. Identification of the relevant factors for death can improve patients' individual risk assessment and decision making. A well-documented patient cohort (n=892) in a renal transplant program with protocol biopsies was used to establish multivariable models for risk assessment at 3 and 12 months posttransplantation by random survival forest analysis. Methods. Patients transplanted between 2000 and 2007 were observed up to 11 years. Loss to follow-up was negligible (n=15). 2251 protocol biopsies and 1214 biopsies for cause were performed. All rejections and clinical borderline rejections in protocol biopsies were treated. Results. 10-year patient survival was 78%, with inferior survival of patients with graft loss. Using all pre- and posttransplant variables until 3 and 12 months (n=65), the obtained models showed good performance in predicting death (concordance index: 0.77-0.78). Validation with a separate cohort of patients (n=349) showed a concordance index of 0.76 and good discrimination of risks by the models, despite substantial differences in clinical variables. Random survival forest analysis produced robust models over a wide range of parameter settings. Besides well-established risk factors such as age, cardiovascular disease, type 2 diabetes, and graft function, posttransplant urinary tract infection and rejection treatment were important factors. Urinary tract infection and rejection treatment were not specifically associated with death due to infection or malignancy but correlated strongly with inferior graft function and graft loss. Conclusions. The established models indicate the important areas that need special attention in the care of renal transplant patients, particularly modifiable factors like graft rejection and urinary tract infection. Disclosure: The authors declare no conflicts of interest. Funding: This work was supported by intramural funding.
Address for correspondence: Irina Scheffner, Nephrology, Hannover Medical School, Carl-Neuberg-Str. 1, 30625 Hannover, Germany. Phone: +49 511 532-6320, Fax: +49 511 552366, email: scheffner.irina@mh-hannover.de Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
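The concordance index used to evaluate the random survival forest models above measures how often, across comparable patient pairs, the model assigns the higher risk to the patient who dies sooner. A minimal pure-Python sketch of Harrell's C for right-censored data (tied survival times are skipped for brevity; production implementations such as scikit-survival's handle ties explicitly):

```python
from itertools import combinations

def concordance_index(times, events, risk_scores):
    """Harrell's concordance index for right-censored survival data.

    times: observed follow-up times
    events: 1 if death observed, 0 if censored at that time
    risk_scores: higher score = higher predicted risk
    """
    concordant = 0.0
    comparable = 0
    for i, j in combinations(range(len(times)), 2):
        if times[i] == times[j]:
            continue  # simplified sketch: skip tied times
        first, later = (i, j) if times[i] < times[j] else (j, i)
        if not events[first]:
            continue  # earlier subject censored: pair not comparable
        comparable += 1
        if risk_scores[first] > risk_scores[later]:
            concordant += 1.0
        elif risk_scores[first] == risk_scores[later]:
            concordant += 0.5  # tied predictions count half
    return concordant / comparable

# Perfectly ranked toy data: the highest-risk patient dies first
c = concordance_index([1, 2, 3, 4], [1, 1, 1, 0], [4.0, 3.0, 2.0, 1.0])
print(c)  # 1.0
```

A C of 0.5 corresponds to random ranking and 1.0 to perfect ranking, so the 0.76-0.78 reported above indicates a usefully discriminating, though not deterministic, model.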
NUMBER OF ANTIBODY-VERIFIED EPLETS IN THE HLA-C LOCUS AS AN INDEPENDENT FACTOR FOR T CELL-MEDIATED REJECTION AFTER LIVER TRANSPLANTATION
Background: HLA mismatching is a risk factor for graft rejection in solid organ transplantation. Its definition is being rethought with the introduction of eplets into organ allocation. Eplets are highly polymorphic regions of the HLA molecule that help to explain cross-reactivity of HLA antigens. The effect of eplet mismatch is well documented in renal and lung transplantation, but there is no clear evidence in liver transplantation. Methods: Donor/recipient pairs from 43 consecutive liver transplants performed at our center from 2016 to 2018 were HLA typed. Antibody-verified eplet (VerEp) mismatches were quantified with HLAMatchmaker version 2.1. Results: A total of 9 patients suffered an episode of T cell-mediated rejection (TCMR). No significant differences were observed in the number of A, B, DRB, DQA and DQB VerEp mismatches. However, the mean number of VerEp mismatches at the C locus (VerEpC) was significantly higher in patients with TCMR: 3.89 (1.36) vs 2.32 (1.82), p=0.021. The 22 patients with a high VerEpC load (>2) had an increased risk of TCMR (p=0.008). TCMR-free time after liver transplant was significantly shorter in the high-load VerEpC group (log-rank test p=0.019). Multivariate analysis demonstrated that a high VerEpC load was independently associated with TCMR (p=0.038). Conclusions: Patients with no or one eplet mismatch at the C locus are less likely to suffer TCMR after liver transplantation. DISCLOSURE STATEMENT: The authors declare no conflicts of interest. FUNDING: The work was partially supported by REDINREN RD16/0009/0027 from ISCIII-FEDER. Corresponding author: David San Segundo, Mail address: Avd Valdecilla s/n. Servicio de Inmunología. Hospital Universitario Marqués de Valdecilla., Torre B, planta -1, 39008 Santander (Cantabria) SPAIN. Email: david.sansegundo@scsalud.es Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
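Eplet mismatch load, as computed by HLAMatchmaker, is conceptually a set difference: the eplets carried by the donor's HLA molecules that are absent from the recipient's own repertoire. A toy sketch of the idea with the abstract's >2 VerEpC cutoff (the eplet names below are hypothetical placeholders, not real HLAMatchmaker tables):

```python
def eplet_mismatch_load(donor_eplets, recipient_eplets):
    """Number of donor eplets absent from the recipient's own repertoire."""
    return len(set(donor_eplets) - set(recipient_eplets))

def high_verepc_load(donor_c_eplets, recipient_c_eplets, threshold=2):
    """High HLA-C antibody-verified eplet load, per the abstract's >2 cutoff."""
    return eplet_mismatch_load(donor_c_eplets, recipient_c_eplets) > threshold

# Hypothetical HLA-C eplet lists for one donor/recipient pair
donor = ["ep1", "ep2", "ep3", "ep4"]
recipient = ["ep1", "ep5"]
load = eplet_mismatch_load(donor, recipient)
print(load, high_verepc_load(donor, recipient))  # 3 True
```

The direction matters: recipient eplets absent from the donor are immunologically irrelevant here, which is why the difference is taken donor-minus-recipient rather than symmetrically.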
Bone metabolism impairment in heart transplant: results from a prospective cohort study
Background: Data on the prevention of fractures after heart transplant (HTx) are controversial in the literature. Understanding the effects of HTx on bone may guide appropriate treatments in this high-risk population. Methods: Seventy adult HTx patients were followed for 12 months. Clinical and laboratory parameters, bone mineral density (BMD), microarchitecture and vertebral fractures were assessed at baseline (after intensive care unit discharge) and at 6 and 12 months. Patients received recommendations regarding calcium intake and vitamin D supplementation after HTx. Results: At baseline, 27% of patients had osteoporosis, associated with the length of hospitalization before HTx (p=0.001). BMD decreased in the first 6 months, with partial recovery later. Bone microarchitecture deteriorated, mainly in the trabecular bone in the first 6 months and cortical bone in the subsequent 6 months. At baseline, 92.9% of patients had vitamin D levels <30 ng/mL and 20.0% <10 ng/mL. Patients also had calcium at the lower limit of normal, high alkaline phosphatase, and a high bone resorption biomarker level. These abnormalities were suggestive of impaired bone mineralization and normalized at 6 months with correction of vitamin D deficiency. The majority of vertebral fractures were identified at baseline (23% of patients). After multivariate analyses, only lower fat mass persisted as a risk factor for vertebral fractures (OR 1.23, 95% CI 1.04-1.47, p=0.012). Conclusions: High frequencies of densitometric osteoporosis, vitamin D deficiency, bone marker abnormalities and vertebral fractures were observed shortly after HTx. Calcium and vitamin D supplementation should be the first step in correcting bone mineralization impairment before specific osteoporosis treatment. DISCLOSURE STATEMENT: The authors declare no conflicts of interest. FUNDING: This study was not sponsored by any pharmaceutical company.
It was supported by grants from Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP 2014/21239-4 to FGMB) and Conselho Nacional do Desenvolvimento Científico e Tecnológico (CNPq 305556/2017-7 to RMRP). Address for correspondence: Luis Fernando Bernal da Costa Seguro, Av. Dr. Enéas Carvalho de Aguiar, 44 – Núcleo de Transplantes, 2º andar InCor, São Paulo, Brazil, Zip code: 05403-000, luisseguro@yahoo.com.br Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
Intraoperative hypertension and thrombocytopenia associated with intracranial hemorrhage after liver transplantation
Background: Intracranial hemorrhage (ICH) is a devastating complication. Although hypertension and thrombocytopenia are well-known risk factors for ICH in the general population, their roles in ICH after liver transplantation (LT) have not been well established. Methods: We performed a retrospective study and hypothesized that intraoperative hypertension and thrombocytopenia were associated with posttransplant ICH. New onset of spontaneous hemorrhage in the central nervous system within 30 days after LT was identified by reviewing radiologic reports and medical records. Risk factors were identified by multivariate logistic regression. Receiver operating characteristic analysis and the Youden index were used to find the cutoff value with optimal sensitivity and specificity. Results: Of 1836 adult patients undergoing LT at UCLA, 36 (2.0%) developed ICH within 30 days after LT. Multivariate logistic regression demonstrated that intraoperative mean arterial pressure (MAP) ≥ 105 mmHg (≥ 10 minutes) (odds ratio (OR) 6.5, 95% CI 2.7-7.7, p<0.001) and platelet counts ≤ 30 × 10^9/L (OR 3.3, 95% CI 1.4-7.7, p=0.006) were associated with increased risk of postoperative ICH. Preoperative total bilirubin ≥ 7 mg/dL was also a risk factor. Thirty-day mortality in ICH patients was 48.3%, significantly higher compared with the non-ICH group (3.0%, p<0.001). Patients with all three risk factors had a 16% chance of developing ICH. Conclusions: In the current study, postoperative ICH was uncommon, but associated with high mortality. Prolonged intraoperative hypertension and severe thrombocytopenia were associated with postoperative ICH. More studies are warranted to confirm our findings and develop a strategy to prevent this devastating posttransplant complication. The authors declare no conflicts of interest. Financial disclosure: None. Corresponding author: Victor W.
Xia, M.D., Department of Anesthesiology, Ronald Reagan UCLA Medical Center, David Geffen School of Medicine, University of California, Los Angeles, 757 Westwood Plaza, Suite 3325, Los Angeles, CA, 90095 United States of America, Email: vxia@mednet.ucla.edu Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
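The cutoffs in the abstract above (MAP ≥ 105 mmHg, platelets ≤ 30 × 10^9/L) were chosen by maximizing the Youden index J = sensitivity + specificity − 1 over an ROC analysis. A minimal sketch on toy data, assuming higher values predict the outcome (for platelet counts the direction would be inverted):

```python
def youden_optimal_cutoff(values, labels):
    """Return (threshold, J) maximizing J = sensitivity + specificity - 1,
    where a case is called positive when value >= threshold."""
    best_t, best_j = None, -1.0
    positives = sum(labels)
    negatives = len(labels) - positives
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= t and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < t and y == 0)
        j = tp / positives + tn / negatives - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Toy MAP-like data: outcome cases cluster at high values
values = [90, 95, 100, 104, 106, 110, 115, 120]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
t, j = youden_optimal_cutoff(values, labels)
print(t, j)  # 106 1.0
```

J ranges from 0 (a cutoff no better than chance) to 1 (perfect separation), so maximizing it picks the threshold that best balances missed cases against false alarms.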
FRAILTY AS A PREDICTOR OF MORTALITY IN PATIENTS WITH INTERSTITIAL LUNG DISEASE REFERRED FOR LUNG TRANSPLANTATION
Background: Frailty is a clinically recognized syndrome of decreased physiological reserve and a key contributor to suboptimal clinical outcomes in various lung disease groups. Interstitial lung disease (ILD) is fast approaching chronic obstructive pulmonary disease (COPD) as the number one indication for lung transplantation worldwide. Our aim was to assess whether frailty is a predictor of mortality in patients with interstitial lung disease referred for lung transplantation in an Australian cohort. Methods: Consecutive patients with ILD referred or on the waiting list for lung transplantation from May 2013 to December 2017 underwent frailty assessment using the modified Fried Frailty Phenotype (FFP). Frailty was defined as a positive response to three or more of the following five components: weak grip strength, slowed walking speed, poor appetite, physical inactivity and exhaustion. Results: 100 patients (82M:18F; age 59 ± 7 years, range 30-70) underwent frailty assessment. 24/100 (24%) were assessed as frail. Frailty was associated with anemia, hypoalbuminemia, low creatinine and the use of supplemental oxygen (all p<0.05). Frailty was independent of age, gender, measures of pulmonary dysfunction (PaO2, FVC % predicted, TLC, TLC % predicted, DLCO or DLCO % predicted), cognitive impairment or depression. Frailty and DLCO % predicted were independent predictors of increased all-cause mortality: one-year actuarial survival was 86% ± 4% in the nonfrail group compared with 58% ± 10% for the frail group (p = 0.002). Conclusions: Frailty is common among patients referred for lung transplant with a diagnosis of ILD and is associated with a marked increase in mortality. Disclosure statement: The authors declare no conflicts of interest. Funding: No funding was received for this work. Address for Correspondence: Ms Elyn Montgomery, Faculty of Health, University of Technology Sydney, PO Box 123, Broadway, NSW 2007, AUSTRALIA.
Phone: (Mobile) +61420 390 984, Email: elynmont@gmail.com; Elyn.Montgomery@uts.edu.au Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.
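The modified Fried Frailty Phenotype described above classifies a patient as frail when at least three of the five components are positive. A direct sketch of that scoring rule (field names are illustrative, not the study's data dictionary):

```python
# The five FFP components named in the abstract
FFP_COMPONENTS = ("weak_grip", "slow_walk", "poor_appetite",
                  "physical_inactivity", "exhaustion")

def ffp_score(assessment):
    """Count positive components of the modified Fried Frailty Phenotype."""
    return sum(1 for c in FFP_COMPONENTS if assessment.get(c, False))

def is_frail(assessment):
    """Frail = three or more positive components, per the definition above."""
    return ffp_score(assessment) >= 3

patient = {"weak_grip": True, "slow_walk": True,
           "poor_appetite": False, "physical_inactivity": True,
           "exhaustion": False}
print(ffp_score(patient), is_frail(patient))  # 3 True
```

Using `dict.get` with a `False` default means missing components count as negative, which is one reasonable convention; a study protocol might instead require all five components to be assessed before scoring.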
Recipient HLA-C7 and Protection from Polyoma Virus Nephropathy
No abstract available
