Monday, 21 October 2019

Regional prescription surveillance of phosphate binders in the western Saitama area: the substantial role of ferric citrate hydrate in improving serum phosphorus levels and erythropoiesis
In the original publication, under Table 1, the number of participants in April was incorrectly published as 1373. The corrected table is given below.

ADPKD and metformin: from bench to bedside

Risk factors and incidence of malignant neoplasms after kidney transplantation at a single institution in Japan

Abstract

Background

The risk of malignant neoplasms increases in kidney transplantation (KT) recipients (KTRs). However, no Japanese registry studies have been reported since 2000.

Methods

We retrospectively reviewed the medical records of 346 patients who underwent KT at Gifu University Hospital, Japan, since 2000. Patients were divided into two groups based on whether or not they developed malignancy after KT. The incidence, types of malignancy, risk factors, and prognosis of malignancy were examined.

Results

In this study, 22 de novo malignant neoplasms were identified in 20 KTRs (7.3%), with a median follow-up of 8.2 years. The cumulative incidence of any malignant neoplasm was 1.1% within 1 year and 4.4% within 5 years. The 5-year overall survival (OS) rate was 71.8% in KTRs with malignant neoplasms and 98.6% in KTRs without malignant neoplasms. Uni- and multivariate analyses revealed that age at KT and an acute rejection (AR) episode were significant predictors of malignancy.

Conclusions

The development of malignant neoplasms was associated with poor OS and graft survival. We consider appropriate screening and investigation of symptoms to be important for KTRs, particularly for those who were older at transplantation and those with an AR episode.

Development and validation of a new prediction model for graft function using preoperative marginal factors in living-donor kidney transplantation

Abstract

Background

Recently, living-donor kidney transplantation from marginal donors has been increasing. However, simple prediction models for graft function that incorporate preoperative marginal factors are limited. Here, we developed and validated a new prediction model for graft function using preoperative marginal factors in living-donor kidney transplantation.

Methods

We retrospectively investigated 343 patients who underwent living-donor kidney transplantation at Kyushu University Hospital (derivation cohort). Low graft function was defined as an estimated glomerular filtration rate of < 45 mL/min/1.73 m2 at 1 year. A prediction model was developed using a multivariable logistic regression model, and verified using data from 232 patients who underwent living-donor kidney transplantation at Tokyo Women's Medical University Hospital (validation cohort).

Results

In the derivation cohort, 89 patients (25.9%) had low graft function at 1 year. Donor age, donor estimated glomerular filtration rate, donor hypertension, and donor/recipient body weight ratio were selected as predictive factors. This model demonstrated modest discrimination (c-statistic = 0.77) and calibration (Hosmer–Lemeshow test, P = 0.83), and in the validation cohort it demonstrated good discrimination (c-statistic = 0.76) and calibration (Hosmer–Lemeshow test, P = 0.54). Furthermore, donor age, donor estimated glomerular filtration rate, and donor hypertension were strongly associated with glomerulosclerosis and atherosclerotic vascular changes in the “zero-time” biopsy.
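The abstract does not reproduce the model's coefficients, but the workflow it describes (a multivariable logistic model on four preoperative donor variables, judged by its c-statistic, i.e., the area under the ROC curve) can be sketched roughly as follows; the file and column names are hypothetical:

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical column names for the four preoperative predictors
predictors = ["donor_age", "donor_egfr",
              "donor_hypertension", "donor_recipient_bw_ratio"]

df = pd.read_csv("derivation_cohort.csv")        # hypothetical data file
X = df[predictors]
y = df["low_graft_function"]                     # 1 = eGFR < 45 at 1 year

model = LogisticRegression(max_iter=1000).fit(X, y)

# c-statistic = area under the ROC curve of the predicted probabilities
c_stat = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"c-statistic: {c_stat:.2f}")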

Conclusions

This model, which uses four preoperative variables, will be a simple but useful guide for estimating graft function at 1 year after kidney transplantation in the clinical setting, especially with marginal donors.

New measures against chronic kidney diseases in Japan since 2018

Abstract

In 2008, the Japanese government started measures against chronic kidney disease (CKD), and steady progress has been achieved, including a decrease in the age-adjusted rate of new dialysis introductions. However, the total number of dialysis patients has not declined because of population aging. Therefore, the Japanese government has implemented more concrete measures since 2018. These aim to prevent CKD exacerbation mainly through early referrals to nephrologists using the “criteria for referral from a primary care physician to a kidney specialist/specialized medical institution”. In addition, key performance indicators have been set to reduce the number of new dialysis patients from 39,000 in 2016 to ≤ 35,000 by 2028. We hope that these measures will serve as a reference for CKD programs worldwide. This report was originally issued in Japanese by the Ministry of Health, Labour and Welfare; this is its English version.

Variations in actual practice patterns and their deviations from the clinical practice guidelines for nephrotic syndrome in Japan: certified nephrologists’ questionnaire survey

Abstract

Background

Few good-quality clinical trials on adults with nephrotic syndrome exist. Thus, there are discrepancies between real-world practice and clinical practice guidelines. We conducted a questionnaire-based survey to investigate potential discrepancies and the factors associated with variations in clinical practice.

Methods

A questionnaire was administered electronically to all board-certified nephrologists in Japan. To examine clinical practice variations in relation to physician characteristics, we estimated the ratio of the mean duration of steroid therapy using a generalized linear model, and the odds ratios for higher-level ordinal variables using an ordered logistic regression model.
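As a minimal sketch of the first of these analyses, a log-link GLM can be used so that the exponentiated coefficient of a respondent characteristic is a ratio of mean durations (the quantity reported in the Results). The toy data, and the choice of a Gamma family with a log link, are our assumptions; the survey data are not reproduced here:

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical toy data: steroid duration (months) and whether the
# respondent sees >= 30 nephrotic-syndrome patients per month.
df = pd.DataFrame({
    "duration_months": [6, 9, 12, 18, 8, 24, 10, 14],
    "high_volume":     [0, 0, 0, 1, 0, 1, 0, 1],
})

# Gamma GLM with log link (an assumption; the paper only says "GLM"):
# exp(coefficient) of 'high_volume' estimates the ratio of mean durations.
fit = smf.glm("duration_months ~ high_volume", data=df,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(np.exp(fit.params["high_volume"]))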

Results

Responses from the 116 participants showed some variation for the majority of questions. Most participants (94.8%) indicated that screening for malignant tumors was “Conducted for almost all patients”. The duration of steroid therapy was longer among physicians seeing ≥ 30 patients with nephrotic syndrome per month, both for minimal-change disease (ratio of mean durations 1.69; 95% CI 1.07–2.66) and membranous nephropathy (ratio of mean durations 1.71; 95% CI 1.09–2.69).

Conclusions

We identified practice patterns for nephrotic syndrome and discrepancies between clinical practice guidelines and actual practice. Defining the standard therapy for nephrotic syndrome may be necessary to generate high-quality evidence and develop clinical guidelines.

Increased community-acquired upper urinary tract infections caused by extended-spectrum beta-lactamase-producing Escherichia coli in children and the efficacy of flomoxef and cefmetazole

Abstract

Background

Urinary tract infections caused by extended-spectrum beta-lactamase-producing bacteria are increasing worldwide. At our hospital, the number of pediatric patients hospitalized because of an upper urinary tract infection has increased dramatically since 2016. In total, 60.5% of these urinary tract infections were caused by extended-spectrum beta-lactamase-producing Escherichia coli. Such a high prevalence of extended-spectrum beta-lactamase-producing E. coli has not been detected previously in Japan. Therefore, we evaluated the clinical and bacteriologic characteristics of, and the efficacy of antibiotics against, upper urinary tract infections caused by E. coli in children.

Methods

This retrospective study surveyed 152 patients who were hospitalized in the pediatric department of Shimane Prefectural Central Hospital because of upper urinary tract infections caused by E. coli. Medical records were reviewed to examine patient characteristics. O antigens, antibiotic susceptibility, gene typing, and pulsed-field gel electrophoresis were studied at the Shimane Prefectural Institute of Public Health and Environmental Science.

Results

Urine sample analyses revealed extended-spectrum beta-lactamase types such as CTX-M-9 and multiple virulence genes. We changed the primary antibiotic treatment to flomoxef or cefmetazole to treat upper urinary tract infections caused by Gram-negative bacilli. After changing treatment, the time to fever alleviation was significantly shortened.

Conclusion

Extended-spectrum beta-lactamase-producing E. coli should be suspected in community-acquired upper urinary tract infections. Therefore, when treating patients, it is necessary to focus on antibiotic susceptibility and the prevalence of extended-spectrum beta-lactamase-producing bacteria found in each area. Flomoxef and cefmetazole are useful primary treatments for upper urinary tract infections caused by extended-spectrum beta-lactamase-producing E. coli.

Nondipping heart rate and associated factors in patients with chronic kidney disease

Abstract

Background

Nondipping heart rate (NHR) has recently been reported to be associated with cardiovascular events and cardiovascular mortality. We aimed to investigate whether the NHR pattern differs between hypertensive patients with and without chronic kidney disease (CKD), and to identify the factors associated with NHR in patients with CKD.

Methods

The study included 133 hypertensive patients with normal kidney function, 97 hypertensive patients with predialysis CKD, and 31 hypertensive hemodialysis patients. Heart rate, blood pressure, and pulse wave velocity (PWV) were measured by 24-h ambulatory blood pressure monitoring. NHR was defined as a decrease of less than 10% in the mean nighttime heart rate compared with daytime values.
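As a minimal illustration of this definition (the function and variable names are ours):

def is_nondipping_heart_rate(daytime_mean_hr, nighttime_mean_hr):
    """Nondipping heart rate: nocturnal decline of less than 10%
    relative to the daytime mean heart rate (definition above)."""
    nocturnal_decline = (daytime_mean_hr - nighttime_mean_hr) / daytime_mean_hr
    return nocturnal_decline < 0.10

# Example: 80 bpm by day, 74 bpm at night -> 7.5% decline -> nondipping
print(is_nondipping_heart_rate(80, 74))   # True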

Results

The prevalence of the NHR pattern was 26.3% in the non-CKD hypertensive group, 43.3% in the predialysis group, and 77.4% in the dialysis group. Among patients with CKD, compared with the dipper heart rate group, the NHR group was older, had a higher prevalence of diabetes mellitus and a higher proportion of women, and had significantly higher urea, creatinine, phosphorus, intact parathyroid hormone, and PWV values, whereas hemoglobin, albumin, and calcium values were significantly lower. In multivariate analysis, hemoglobin [odds ratio (OR) 0.661; 95% CI 0.541–0.806; p < 0.001] and PWV (OR 1.433; 95% CI 1.107–1.853; p = 0.006) were established as independent determinants of the NHR pattern.

Conclusions

The NHR pattern is seen significantly more frequently in hypertensive CKD patients than in hypertensive patients without CKD. Anemia and increased arterial stiffness are independently associated with NHR in CKD patients.

Prevalence of chronic kidney disease among HIV-1-infected patients receiving a combination antiretroviral therapy

Abstract

Background

Chronic kidney disease (CKD) has become one of the most frequent non-infectious comorbidities in the aging HIV-infected population on long-standing combination antiretroviral therapy (cART).

Methods

We conducted a retrospective, cross-sectional study including HIV-infected adult patients attending our HIV outpatient clinic during 2017 and 2018 to assess the prevalence of CKD and its associated risk factors. The estimated glomerular filtration rate (eGFR) was calculated with the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation. CKD was diagnosed and classified according to the National Kidney Foundation guidelines. Logistic regression was employed to identify factors associated with CKD.
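For reference, a minimal sketch of the 2009 CKD-EPI creatinine equation mentioned above (serum creatinine in mg/dL, result in mL/min/1.73 m2); this is the standard published equation, not code from the study:

def ckd_epi_2009(scr_mg_dl, age_years, female, black=False):
    """eGFR by the 2009 CKD-EPI creatinine equation (mL/min/1.73 m2)."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141.0
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age_years)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Example: 45-year-old man with serum creatinine 1.0 mg/dL -> roughly 90
print(round(ckd_epi_2009(1.0, 45, female=False)))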

Results

We enrolled 2339 HIV-infected patients (91% Caucasian) with a mean age of 45.3 years and a mean current CD4 lymphocyte count of 531 cells/mm3. CKD was diagnosed in 311 subjects (13.3%). Overall, 294 (12.6%) patients had albuminuria, 108 (4.6%) had eGFR < 60 mL/min/1.73 m2, and 78 (3.3%) had albuminuria plus eGFR < 60 mL/min/1.73 m2. Stages 4–5 of CKD were documented in 23 (1%) cases. Age greater than 50 years, male gender, hypertension, diabetes mellitus, high triglycerides, nadir CD4 cell count < 200 cells/mm3, current use of tenofovir disoproxil fumarate (TDF), and current use of TDF plus a ritonavir-boosted protease inhibitor were independently associated with CKD, while current use of abacavir plus an integrase inhibitor was associated with a reduced risk of CKD.

Conclusion

There is a significant prevalence of CKD among HIV-infected persons, in association with both traditional and HIV-specific risk factors, requiring careful periodic monitoring of renal function in these patients.

Acute effect of a peritoneal dialysis exchange on electrolyte concentration and QT interval in uraemic patients

Abstract

Background

Hemodialysis (HD) sessions induce changes in plasma electrolytes that lead to modifications of the QT interval, potentially associated with dangerous arrhythmias. It is not known whether such a phenomenon also occurs during peritoneal dialysis (PD). The aim of the study was to analyze the relationship between dialysate and plasma electrolyte modifications and the QT interval during a PD exchange.

Methods

In 15 patients, two manual 4-h PD exchanges were performed, using two isotonic solutions with different calcium concentrations (Ca++ 1.25 and Ca++ 1.75 mmol/L). Dialysate and plasma electrolyte concentrations and the QT interval (ECG Holter recording) were monitored hourly. A computational model simulating the ventricular action potential during the exchange was also run.

Results

The dialysis exchange induced a significant plasma alkalizing effect (p < 0.001). Plasma K+ significantly decreased at the third hour (p < 0.05). Plasma Na+ significantly decreased (p < 0.001), while plasma Ca++ slightly increased only when using the Ca++ 1.75 mmol/L solution (p < 0.01). The PD exchange did not induce clinically relevant modifications of the QT interval, while a significant decrease in heart rate (p < 0.001) was observed. Changes in plasma K+ values were significantly inversely correlated with QT interval modifications (p < 0.001), indicating that even small decreases in K+ were consistently paralleled by small QT prolongations. These results were fully confirmed by the computational model.

Conclusions

The PD exchange guarantees greater cardiac electrical stability than an HD session and should be preferred in patients at higher arrhythmic risk. Moreover, our study shows that ventricular repolarization is extremely sensitive to plasma K+ changes, even within the normal range.
