Racial disparity in kidney transplant survival relates to late rejection and is independent of steroid withdrawal


Black kidney transplant recipients have more acute rejection (AR) and inferior graft survival. We sought to determine whether early steroid withdrawal (ESW) had an impact on AR and death‐censored graft loss (DCGL) in blacks. From 2006 to 2012, AR and graft survival were analyzed in 483 kidney recipients (208 black and 275 non‐black). Rates of ESW were similar between blacks (65%) and non‐blacks (67%). AR was defined as early (≤3 months) or late (>3 months). The impact of black race, early AR, and late AR on death‐censored graft failure was analyzed using univariate and multivariate Cox models. Blacks had greater dialysis vintage, more deceased donor transplants, and less HLA matching, yet rates of early AR were comparable between blacks and non‐blacks. However, black race was a risk factor for late AR (HR: 3.48; 95% CI: 1.87‐6.47). Blacks had a greater rate of DCGL, partially driven by late AR (HR with late AR: 5.6; 95% CI: 3.3‐9.3). ESW had no significant interaction with black race for risk of early AR, late AR, or DCGL. Independent of ESW, black kidney recipients had a higher rate of late AR after kidney transplantation. Late AR was highly predictive of DCGL and contributed to inferior graft survival in blacks.


Technology in Geriatrics


Recently, the interest of industry, government agencies, and healthcare professionals in technology for aging people has increased. The challenge is whether technology can play a role in enhancing independence and quality of life and in reducing the individual and societal costs of caring. Information and communication technologies (tools for communicating and informing), assistive technologies designed to maintain older people's independence and increase safety, and human–computer interaction technologies (such as humanoid robots, exoskeletons, rehabilitation robots, service robots, and companion-type robots) that support older people with motility and cognitive impairments are interdisciplinary topics in both research and clinical practice. The most promising clinical applications of these technologies are housing and safety, to enable older people to remain in their own homes and communities; mobility and rehabilitation, to improve mobility and gait; and communication and quality of life, by reducing isolation and improving the management of medications and transportation. Many factors impair a broad use of technology in older age, including psychosocial and ethical issues, costs, and fear of losing human interaction. A substantial lack of appropriate clinical trials establishing the clinical role of technologies in improving the physical or cognitive performance and/or quality of life of subjects and their caregivers suggests that the classical biomedical research model may not be the optimal choice for evaluating technologies in older people. In conclusion, successful technology development requires a great effort in interdisciplinary collaboration to integrate technologies into existing health and social service systems so that they fit into older adults' everyday life.

Changes in exhaled 13CO2/12CO2 breath delta value as an early indicator of infection in intensive care unit patients

BACKGROUND We have developed a new, noninvasive predictive marker for onset of infection in surgical intensive care unit (ICU) patients. The exhaled 13CO2/12CO2 ratio, or breath delta value (BDV), has been shown to be an early marker for infection in a proof of concept human study and in animal models of bacterial peritonitis. In these studies, the BDV changes during onset and progression of infection, and these changes precede physiological changes associated with infection. Earlier diagnosis and treatment will significantly reduce morbidity, mortality, hospitalization costs, and length of stay. The objective of this prospective, observational, multicenter study was to determine the predictive value of the BDV as an early diagnostic marker of infection.
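The BDV follows the standard stable-isotope delta notation: the sample's 13C/12C ratio is expressed in per mil (‰) relative to a reference ratio (conventionally the Vienna Pee Dee Belemnite, VPDB, standard). As a minimal sketch of that definition only (the reference constant and sample ratios below are illustrative, not values from this study):

```python
# Standard delta notation for a 13C/12C ratio, expressed in per mil
# relative to a reference standard. The VPDB constant is the conventional
# reference; the sample ratios used below are purely illustrative.
VPDB_RATIO = 0.0112372  # 13C/12C of the Vienna Pee Dee Belemnite standard

def delta_13c(sample_ratio: float, standard_ratio: float = VPDB_RATIO) -> float:
    """Return delta-13C in per mil relative to the standard."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

# A ~1 per-mil rise in BDV corresponds to a ~0.1% relative increase in 13C/12C:
baseline = delta_13c(0.0110500)
later = delta_13c(0.0110500 * 1.001)
print(round(later - baseline, 2))  # change in BDV, per mil
```

This is why shifts of 1‰ to 1.7‰, as reported below, reflect very small but measurable changes in the exhaled isotope ratio.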

METHODS Critically ill adults after trauma or acute care surgery with an expected length of stay longer than 5 days were enrolled. The BDV was obtained every 4 hours for 7 days and correlated with clinical infection diagnosis, serum C-reactive protein, and procalcitonin levels. Clinical infection diagnosis was made by an independent endpoint committee. This trial was registered with the US National Institutes of Health (ClinicalTrials.gov, NCT02327130).

RESULTS Groups were demographically similar (n = 20). Clinical infection diagnosis was confirmed on day 3.9 ± 0.63. Clinical suspicion of infection (defined by SIRS criteria and/or new antibiotic therapy) occurred on day 2.1 ± 0.5 in all infected patients. However, 5 (56%) of 9 noninfected subjects also met clinical suspicion criteria. The BDV significantly increased by 1‰ to 1.7‰ on day 2.1 after enrollment (p < 0.05) in subjects who developed infections, while it remained at baseline (± 0.5‰) in subjects without infections.

CONCLUSION A BDV greater than 1.4‰ accurately differentiates subjects who develop infections from those who do not and predicts the presence of infection up to 48 hours before clinical confirmation. The BDV may predict the onset of infection and aid in distinguishing SIRS from infection, which could prompt earlier diagnosis, earlier appropriate treatment, and improve outcomes.

LEVEL OF EVIDENCE Diagnostic test, level III.

Optimal Prevention of Dysplasia Requires Complete Ablation of All Intestinal Metaplasia

This meta-analysis demonstrated that the rate of recurrent dysplasia was doubled when residual Barrett epithelium remained after endoscopic ablation.

Current guidelines suggest that endoscopic ablation be offered to patients with confirmed dysplasia in a segment of Barrett esophagus (BE). The authors of this meta-analysis examined long-term outcomes (almost 13,000 patient follow-up years) after ablation of dysplastic BE, comparing the 86% of patients who had complete remission of intestinal metaplasia with the 14% of patients who had eradication of dysplasia but with persistent metaplasia.

Dysplasia recurred in 5% of those with complete ablation of all metaplasia versus 12% of those who had ablation of dysplasia with persistent metaplasia. The development of high-grade dysplasia or cancer was also twice as likely when metaplasia persisted (3% vs. 6%).


This important insight into the management of BE patients after ablation makes it clear that the goal should be complete eradication of all Barrett metaplasia, which decreases the risk for recurrent dysplasia and, more importantly, for developing high-grade dysplasia or cancer. Careful follow-up and retreatment of any persistent metaplasia is part of the eradication process. BE can also recur after ablation, which likewise increases the risk for an adverse outcome. Thus, regular surveillance is mandatory. Finally, earlier study findings suggest that high-dose proton-pump inhibitor therapy is another important component in preventing BE recurrence. This detailed process must be followed if optimal outcomes are to be achieved.


Physical Activity and Incidence of Heart Failure in Postmenopausal Women


Objectives This study prospectively examined physical activity levels and the incidence of heart failure (HF) in 137,303 women, ages 50 to 79 years, and separately examined HF with preserved ejection fraction (HFpEF) and HF with reduced ejection fraction (HFrEF) in a subset of 35,272 women for whom HF subtype could be determined.


Background The role of physical activity in HF risk among older women is unclear, particularly for incidence of HFpEF or HFrEF.


Methods Women were free of HF and reported the ability to walk at least 1 block without assistance at baseline. Recreational physical activity was self-reported. The study documented 2,523 cases of total HF, and 451 and 734 cases of HFrEF and HFpEF, respectively, during a mean 14-year follow-up.


Results After controlling for age, race, education, income, smoking, alcohol, hormone therapy, and hysterectomy status, compared with women who reported no physical activity (reference group), inverse associations were observed across incremental tertiles of total physical activity for overall HF (hazard ratio [HR]: Tertile 1 = 0.89, Tertile 2 = 0.74, Tertile 3 = 0.65; trend p < 0.001), HFpEF (HR: 0.93, 0.70, 0.68; p < 0.001), and HFrEF (HR: 0.81, 0.59, 0.68; p = 0.01). Additional control for potential mediating factors, including time-varying coronary heart disease (CHD) diagnosis (nonfatal myocardial infarction, coronary revascularization), attenuated but did not eliminate the inverse associations. Walking, the most common form of physical activity in older women, was also inversely associated with HF risk (overall: 1.00, 0.98, 0.93, 0.72; p < 0.001; HFpEF: 1.00, 0.98, 0.87, 0.67; p < 0.001; HFrEF: 1.00, 0.75, 0.78, 0.67; p = 0.01). Associations between total physical activity and HF were consistent across subgroups defined by age, body mass index, diabetes, hypertension, physical function, and CHD diagnosis. Analysis of physical activity as a time-varying exposure yielded findings comparable to those for baseline physical activity.


Conclusions Higher levels of recreational physical activity, including walking, are associated with significantly reduced HF risk in community-dwelling older women.

Retrospective evaluation of the efficacy and safety of belatacept with thymoglobulin induction and maintenance everolimus: A single‐center clinical experience


Belatacept use has been constrained by higher rates of acute rejection. We hypothesized that belatacept with low‐dose rabbit antithymocyte globulin (rATG) and initial mycophenolate maintenance with conversion to everolimus at 1 month post‐transplant ± corticosteroids would improve efficacy and maintain safety. We performed a retrospective single‐center analysis of the first 44 low immunologic risk kidney transplant recipients treated with this regimen. The cohort was 59% male, with a mean age at transplant of 57 years. Diabetes was the most common cause of ESRD (39%). The mean 1‐year eGFR was 61.4 (SD 18.4) mL/min/1.73 m2. There were five acute cellular rejections (11.4%), all of which occurred in patients who had changed from everolimus to mycophenolate mofetil due to side effects. Thirty‐two percent developed BK viremia and 12% developed CMV viremia. There were no cases of PTLD. A novel belatacept regimen with rATG induction and maintenance everolimus demonstrated a low acute rejection rate and maintained an excellent 1‐year eGFR.

Kidney allograft failure in the steroid‐free immunosuppression era: A matched case‐control study


We studied the causes and predictors of death‐censored kidney allograft failure among 1670 kidney recipients transplanted at our center in the corticosteroid‐free maintenance immunosuppression era. As of January 1, 2012, we identified 137 recipients with allograft failure; 130 of them (cases) were matched 1:1 for recipient age, calendar year of transplant, and donor type with 130 recipients with functioning grafts (controls). Median time to allograft failure was 29 months (interquartile range: 18‐51). Physician‐validated and biopsy‐confirmed categories of allograft failure were as follows: acute rejection (21%), glomerular disease (19%), transplant glomerulopathy (13%), interstitial fibrosis/tubular atrophy (10%), and polyomavirus‐associated nephropathy (7%). Graft failures were attributed to medical conditions in 21% and remained unresolved in 9%. Donor race, donor age, human leukocyte antigen mismatches, serum creatinine, urinary protein, acute cellular rejection, acute antibody‐mediated rejection, BK viremia, and CMV viremia were associated with allograft failure. Independent predictors of allograft failure were acute cellular rejection (odds ratio: 18.31, 95% confidence interval: 5.28‐63.45) and urine protein ≥1 g/d within the first year post‐transplantation (5.85, 2.37‐14.45). Serum creatinine ≤1.5 mg/dL within the first year post‐transplantation reduced the odds (0.29, 0.13‐0.64) of allograft failure. Our study has identified modifiable risk factors to reduce the burden of allograft failure.
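The odds ratios and confidence intervals reported above come from the study's matched case-control analysis. As a generic illustration of how a crude odds ratio and its Wald 95% confidence interval are derived from a 2×2 exposure table (this is not the study's matched model, and the counts below are hypothetical):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed cases,    b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, for illustration only:
or_, lo, hi = odds_ratio_ci(40, 90, 10, 120)
print(f"OR = {or_:.2f} (95% CI: {lo:.2f}-{hi:.2f})")
```

A CI that excludes 1.0, as with acute cellular rejection above, indicates a statistically significant association at the 5% level.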

Comparison of outcomes of kidney transplantation from donation after brain death, donation after circulatory death, and donation after brain death followed by circulatory death donors



There are three categories of deceased kidney donors in China: donation after brain death (DBD), donation after circulatory death (DCD), and donation after brain death followed by circulatory death (DBCD) donors. The aim of this study was to compare the outcomes of kidney transplantation from these three categories of deceased donors.


We retrospectively reviewed 469 recipients who received deceased kidney transplantation in our hospital from February 2007 to June 2015. The recipients were divided into three groups according to the source of their donor kidneys: DBD, DCD, or DBCD. The primary endpoints were delayed graft function (DGF), graft loss, and patient death.


The warm ischemia time was much longer in the DCD group than in the DBCD group (18.4 minutes vs 12.9 minutes, P < .001). The DGF rate was higher in the DCD group than in the DBD and DBCD groups (22.5% vs 10.2% and 13.8%, respectively, P = .021). The urinary leakage rate was also higher in the DCD group (P = .049). Kaplan‐Meier analysis showed that 1‐, 2‐, and 3‐year patient survival was comparable among the three groups.
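The survival comparison above relies on Kaplan-Meier estimation, which steps the survival probability down at each observed event time while correctly handling censored follow-up. A minimal pure-Python sketch of the estimator (the follow-up data below are hypothetical, not the study's):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times:  follow-up time for each subject
    events: 1 if the event (e.g. graft loss or death) occurred, 0 if censored
    Returns a list of (time, survival probability) at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        n_t = n_at_risk
        # advance past every subject leaving the risk set at time t
        while i < len(data) and data[i][0] == t:
            i += 1
            n_at_risk -= 1
        if deaths:
            surv *= 1 - deaths / n_t  # step down only at event times
            curve.append((t, surv))
    return curve

# Hypothetical data: follow-up in months, 0 = censored observation
print(kaplan_meier([6, 12, 12, 18, 24, 30], [1, 1, 0, 0, 1, 0]))
```

Comparing such curves between donor groups (e.g. with a log-rank test) is the standard way to test whether survival differs, which is how the "comparable among the three groups" conclusion is reached.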


DBCD kidney transplantation has lower incidences of DGF and urinary leakage than DCD kidney transplantation. However, overall patient and graft survival were comparable among DBD, DCD, and DBCD kidney transplantation.

Outcome of kidney transplant in primary, repeat, and kidney‐after‐nonrenal solid‐organ transplantation: 15‐year analysis of recent UNOS database


The number of nonrenal solid‐organ transplants has increased substantially in the last few decades. Many of these patients develop renal failure and receive kidney transplantation. The aim of this study was to evaluate patient and kidney allograft survival in primary, repeat, and kidney‐after‐nonrenal organ transplantation using national data reported to the United Network for Organ Sharing (UNOS) from January 2000 through December 2014. Patients were stratified into the following groups: Group A (comparison group), recipients of a primary kidney transplant (178,947 patients); Group B, recipients of a repeat kidney transplant (17,819 patients); and Group C, recipients of a kidney transplant performed after either a liver, heart, or lung transplant (2,365 patients). We compared survival using the log‐rank test. Compared to primary or repeat kidney transplant, patient and renal allograft survival was significantly lower in those with a previous nonrenal organ transplant. Renal allograft and patient survival after liver, heart, or lung transplants were comparable. Death was the main cause of graft loss in patients who had a prior nonrenal organ transplant.

Consistent success in life-supporting porcine cardiac xenotransplantation


Heart transplantation is the only cure for patients with terminal cardiac failure, but the supply of allogeneic donor organs falls far short of the clinical need.

Xenotransplantation of genetically modified pig hearts has been discussed as a potential alternative [4]. Genetically multi-modified pig hearts that lack galactose-α1,3-galactose epitopes (α1,3-galactosyltransferase knockout) and express a human membrane cofactor protein (CD46) and human thrombomodulin have survived for up to 945 days after heterotopic abdominal transplantation in baboons [5]. This model demonstrated long-term acceptance of discordant xenografts with safe immunosuppression but did not predict their life-supporting function. Despite 25 years of extensive research, the maximum survival of a baboon after heart replacement with a porcine xenograft was only 57 days and this was achieved, to our knowledge, only once [6]. Here we show that α1,3-galactosyltransferase-knockout pig hearts that express human CD46 and thrombomodulin require non-ischaemic preservation with continuous perfusion and control of post-transplantation growth to ensure long-term orthotopic function of the xenograft in baboons, the most stringent preclinical xenotransplantation model. Consistent life-supporting function of xenografted hearts for up to 195 days is a milestone on the way to clinical cardiac xenotransplantation [7].
