Home

Friday, July 26, 2013

One-year results of a prospective, randomized trial comparing two machine perfusion devices used for kidney preservation

Transplant International

Studies have shown beneficial effects of machine perfusion (MP) on early kidney function and long-term graft survival. The aim of this study was to investigate whether the type of perfusion device could affect the outcome of transplantation of deceased donor kidneys. A total of 50 kidneys retrieved from 25 donors were randomized to machine perfusion using a flow-driven (FD) device (RM3; Waters Medical Inc) or a pressure-driven (PD) device (LifePort; Organ Recovery Systems). Twenty-four of these kidneys (n=12 pairs; 48%) were procured from expanded criteria donors (ECD). The primary endpoints were kidney function after transplantation, defined by the incidence of delayed graft function (DGF), the number of hemodialysis sessions required, graft function at 12 months, and biopsy analyses. DGF was similar in both groups (32%; 8/25). Patients with DGF in the FD group required a mean of 4.66 hemodialysis sessions versus 2.65 in the PD group (p=0.005). Overall 1-year graft survival was 80% (20/25) in the FD group versus 96% (24/25) in the PD group. One-year graft survival of ECD kidneys was 66% (8/12) in the FD group versus 92% (11/12) in the PD group. Interstitial fibrosis and tubular atrophy were significantly more common in the FD group: 45% (5/11) versus 0% (0/9) in the PD group (p=0.03). There were no differences in creatinine levels between the groups. Machine perfusion using a pressure-driven device generating lower pulse stress is superior to a flow-driven device with higher pulse stress for preserving kidney function.


http://onlinelibrary.wiley.com/resolve/doi?DOI=10.1111%2Ftri.12169

Sent with Reeder


Thursday, July 25, 2013

The Survival Benefit of Kidney Transplantation in Obese Patients

AJT - Early

Obese patients have a decreased risk of death on dialysis but an increased risk of death after transplantation, and may derive a lower survival benefit from transplantation. Using data from the United States between 1995 and 2007 and multivariate nonproportional hazards analyses, we determined the relative risk of death in transplant recipients grouped by body mass index (BMI) compared with wait-listed candidates with the same BMI (n = 208,498). One year after transplantation, the survival benefit of transplantation varied by BMI: standard criteria donor transplantation was associated with a 48% reduction in the risk of death in patients with BMI ≥ 40 kg/m2 but a ≥66% reduction in patients with BMI < 40 kg/m2. Living donor transplantation was associated with a ≥66% reduction in the risk of death in all BMI groups. In subgroup analyses, transplantation from any donor source was associated with a survival benefit in obese patients ≥50 years and in diabetic patients, but a survival benefit was not demonstrated in Black patients with BMI ≥ 40 kg/m2. Although most obese patients selected for transplantation derive a survival benefit, the benefit is lower when BMI is ≥40 kg/m2 and uncertain in Black patients with BMI ≥ 40 kg/m2.
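The BMI grouping in this study reduces to a simple calculation; a minimal sketch, where only the 40 kg/m2 threshold comes from the abstract and the function names are illustrative:

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def bmi_group(value):
    """Split recipients at the 40 kg/m^2 threshold used in the study."""
    return "BMI >= 40" if value >= 40 else "BMI < 40"

# A 120 kg, 1.70 m candidate falls in the high-BMI group:
# bmi(120, 1.70) -> ~41.5 kg/m^2
```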


http://onlinelibrary.wiley.com/resolve/doi?DOI=10.1111%2Fajt.12331


Wednesday, July 24, 2013

The effect of donor-recipient gender mismatch on short- and long-term graft survival in kidney transplantation: a systematic review and meta-analysis

Clinical Transplantation

Background: There is no limitation of gender matching in renal transplantation. This study was intended to evaluate its effect on short- and long-term graft survival.

Methods: PubMed, the Web of Knowledge, Medline, the Cochrane Library, and two additional Chinese databases were searched. The data were then abstracted and meta-analyzed.

Results: Fourteen studies involving 445,279 patients were included. Each study reported data on the four gender matches (male donor-male recipient, MDMR; male donor-female recipient, MDFR; female donor-male recipient, FDMR; female donor-female recipient, FDFR). The pooled risk ratios (RRs) for 0.5-, 1-, 2-, 3-, 5-, and 10-yr graft survival rates showed that the FDMR group had the worst outcomes; when recipients were female, short-term graft survival was worse but long-term graft survival was better. The differences between groups changed with time.

Conclusions: FDMR patients showed poor graft survival. Female recipients had worse short-term graft survival but the best long-term graft survival. This study introduces an important consideration into donor-recipient matching in renal transplantation.
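The pooled risk ratios reported here can be illustrated with generic fixed-effect inverse-variance pooling on the log scale; this is a textbook sketch, not necessarily the model the study authors used:

```python
import math

def pooled_rr(rrs, variances):
    """Fixed-effect inverse-variance pooling of risk ratios.

    rrs       : per-study risk ratios
    variances : per-study variances of log(RR)
    Pooling is done on the log scale, weighting each study by the
    inverse of its variance. (Illustrative only; a random-effects
    model may have been used in the actual meta-analysis.)
    """
    weights = [1.0 / v for v in variances]
    log_pool = sum(w * math.log(r) for w, r in zip(weights, rrs)) / sum(weights)
    return math.exp(log_pool)
```

With equal variances, the pooled estimate is the geometric mean of the study RRs, e.g. `pooled_rr([1.0, 4.0], [0.2, 0.2])` gives 2.0.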


http://onlinelibrary.wiley.com/resolve/doi?DOI=10.1111%2Fctr.12191



Analysis of Anti-HLA Antibodies in Sensitized Kidney Transplant Candidates Subjected to Desensitization with Intravenous Immunoglobulin and Rituximab

Transplantation - Most Popular Articles


Background: Preexisting donor-specific antibodies against human leukocyte antigens are major risk factors for acute antibody-mediated and chronic rejection of kidney transplant grafts. Immunomodulation (desensitization) protocols may reduce antibody concentration and improve the success of transplant. We investigated the effect of desensitization with intravenous immunoglobulin and rituximab on the antibody profile in highly sensitized kidney transplant candidates.

Methods: In 31 transplant candidates (calculated panel-reactive antibody [cPRA], 34%–99%), desensitization included intravenous immunoglobulin on days 0 and 30 and a single dose of rituximab on day 15. Anti–human leukocyte antigen antibodies were analyzed before and after desensitization.

Results: Reduction of cPRA by 25% to 50% was noted for anti–class I (5 patients, within 20–60 days) and anti–class II (3 patients, within 10–20 days) antibodies. After the initial reduction, cPRA increased again within 120 days. In 24 patients, a decrease in the mean fluorescence intensity (MFI) of antibodies by more than 50% was noted at follow-up, but without reduction of cPRA. Rebound occurred in 65% of patients, for anti–class I antibodies at 350 days and for anti–class II antibodies at 101 to 200 days. The probability of a rebound effect was higher in patients with an MFI of more than 10,700 before desensitization, anti–class II antibodies, and a history of previous transplant.

Conclusions: The desensitization protocol had limited efficacy in highly sensitized kidney transplant candidates because of the short duration of antibody reduction and the high frequency of rebound.
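The three rebound predictors named in the Results can be expressed as a simple checklist; the scoring below is purely illustrative, with only the pre-desensitization MFI cutoff of 10,700 taken from the abstract:

```python
def rebound_risk_factors(pre_mfi, has_class_ii, prior_transplant):
    """List which of the abstract's three rebound risk factors apply.

    Factors (all from the Results section): mean fluorescence
    intensity > 10,700 before desensitization, anti-class II
    antibodies, and a history of previous transplant. Counting
    them like this is an illustration, not the authors' model.
    """
    factors = []
    if pre_mfi > 10700:
        factors.append("high pre-desensitization MFI")
    if has_class_ii:
        factors.append("anti-class II antibodies")
    if prior_transplant:
        factors.append("previous transplant")
    return factors
```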




http://journals.lww.com/transplantjournal/Fulltext/2013/07270/Analysis_of_Anti_HLA_Antibodies_in_Sensitized.12.aspx


Tuesday, July 23, 2013

Donor-Derived Trypanosoma cruzi Infection in Solid Organ Recipients in the United States, 2001-2011.

AJT

Although Trypanosoma cruzi, the parasite that causes Chagas disease, can be transmitted via organ transplantation, liver and kidney transplantation from infected donors may be feasible. We describe the outcomes of 32 transplant recipients who received organs from 14 T. cruzi seropositive donors in the United States from 2001 to 2011. Transmission was confirmed in 9 recipients from 6 donors, including 3 of 4 (75%) heart transplant recipients, 2 of 10 (20%) liver recipients, and 2 of 15 (13%) kidney recipients. Recommended posttransplant monitoring consisted of regular testing by PCR, hemoculture, and serology. Thirteen recipients had no or incomplete monitoring; transmission was confirmed in five of these recipients. Four of the five had symptomatic disease, and all four died, although death was directly related to Chagas disease in only one. Nineteen recipients had partial or complete monitoring for T. cruzi infection with weekly testing by PCR, hemoculture, and serology; transmission was confirmed in 4 of 19 recipients, with no cases of symptomatic disease. Our results suggest that liver and kidney transplantation from T. cruzi seropositive donors may be feasible when the recommended monitoring schedule for T. cruzi infection is followed and prompt therapy with benznidazole can be administered.



http://www.unboundmedicine.com/medline/citation/23837488/Donor_Derived_Trypanosoma_cruzi_Infection_in_Solid_Organ_Recipients_in_the_United_States_2001_2011_


Friday, July 19, 2013

Kidney Allograft Survival After Acute Rejection, the Value of Follow-Up Biopsies

AJT - Early

Kidney allografts are frequently lost due to alloimmunity, yet the impact of early acute rejection (AR) on long-term graft survival is debated. We examined this relationship, focusing on graft histology after AR and assessing specific causes of graft loss. Included were 797 recipients without anti-donor antibodies (DSA) at transplant who had 1-year protocol biopsies. AR was diagnosed by protocol or clinical biopsies in 15.2% of recipients. Compared with no AR, all histologic types of AR led to abnormal histology on 1- and 2-year protocol biopsies, including more fibrosis plus inflammation (6.3% vs. 21.9%), moderate/severe fibrosis (7.7% vs. 13.5%), and transplant glomerulopathy (1.4% vs. 8.3%; all p < 0.0001). AR was associated with reduced graft survival (HR = 3.07 (1.92–4.94), p < 0.0001). However, only those AR episodes followed by abnormal histology led to reduced graft survival. Early AR was related to more late alloimmune-mediated graft losses, particularly transplant glomerulopathy (31% of losses). Related to this outcome, recipients with AR were more likely to have new class II DSA 1 year posttransplant (no AR, 11.1%; AR, 21.2%; p = 0.039). In DSA-negative recipients, early AR often leads to persistent graft inflammation and increases the risk of new class II DSA production. Both of these post-AR events are associated with increased risk of graft loss.


http://onlinelibrary.wiley.com/resolve/doi?DOI=10.1111%2Fajt.12370


Association of Metabolic Syndrome With Kidney Function and Histology in Living Kidney Donors

AJT - Early

The selection of living kidney donors is based on a formal evaluation of the state of health. However, this spectrum of health includes subtle metabolic derangements that can cluster as metabolic syndrome. We studied the association of metabolic syndrome with kidney function and histology in 410 donors from 2005 to 2012, of whom 178 donors were systematically followed after donation since 2009. Metabolic syndrome was defined as per the NCEP ATPIII criteria, but using a BMI > 25 kg/m2 instead of waist circumference. Following donation, donors received counseling on lifestyle modification. Metabolic syndrome was present in 50 (12.2%) donors. Donors with metabolic syndrome were more likely to have chronic histological changes on implant biopsies than donors with no metabolic syndrome (29.0% vs. 9.3%, p < 0.001). This finding was associated with impaired kidney function recovery following donation. At last follow-up, reversal of metabolic syndrome was observed in 57.1% of donors with predonation metabolic syndrome, while only 10.8% of donors developed de novo metabolic syndrome (p < 0.001). In conclusion, metabolic syndrome in donors is associated with chronic histological changes, and nephrectomy in these donors was associated with subsequent protracted recovery of kidney function. Importantly, weight loss led to improvement of most abnormalities that define metabolic syndrome.
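The modified definition used here (BMI > 25 kg/m2 in place of waist circumference, any 3 of 5 criteria) can be sketched as follows; note that the numeric cutoffs other than BMI are the standard NCEP ATP III values and are assumed, not stated in the abstract:

```python
def has_metabolic_syndrome(bmi, triglycerides, hdl, sex,
                           systolic, diastolic, fasting_glucose):
    """Modified NCEP ATP III definition from the abstract:
    BMI > 25 kg/m^2 replaces waist circumference, and metabolic
    syndrome = any 3 of the 5 criteria. Cutoffs other than BMI
    are the standard ATP III values (mg/dL for lipids and
    glucose, mmHg for blood pressure), assumed for illustration.
    """
    criteria = [
        bmi > 25,                              # replaces waist circumference
        triglycerides >= 150,                  # elevated triglycerides
        hdl < (40 if sex == "M" else 50),      # low HDL cholesterol
        systolic >= 130 or diastolic >= 85,    # elevated blood pressure
        fasting_glucose >= 100,                # impaired fasting glucose
    ]
    return sum(criteria) >= 3
```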


http://onlinelibrary.wiley.com/resolve/doi?DOI=10.1111%2Fajt.12369

Conversion From Twice-Daily Tacrolimus Capsules to Once-Daily Extended-Release Tacrolimus (LCPT): A Phase 2 Trial of Stable Renal Transplant Recipients

Transplantation - Current Issue

Background: LCP-Tacro is an extended-release formulation of tacrolimus designed for once-daily dosing. Phase 1 studies demonstrated greater bioavailability than twice-daily tacrolimus capsules and no new safety concerns.

Methods: In this phase 2 study, stable adult kidney transplant patients on twice-daily tacrolimus capsules (Prograf) were converted to once-daily tacrolimus tablets (LCP-Tacro); patients continued on LCP-Tacro once-daily for days 8 to 21; trough levels were to be maintained between 5 and 15 ng/mL; 24-hr pharmacokinetic assessments were done on days 7 (baseline, pre-switch), 14, and 21.

Results: Forty-seven patients completed LCP-Tacro dosing per protocol. The mean conversion ratio was 0.71. Pharmacokinetic data demonstrated consistent exposure (AUC) at the lower conversion dose. Cmax (P=0.0001), Cmax/Cmin ratio (P<0.001), percent fluctuation (P<0.0001), and swing (P=0.0004) were significantly lower, and Tmax significantly longer (P<0.001), for LCP-Tacro versus Prograf. AUC24 and Cmin correlation coefficients after 7 and 14 days of therapy were 0.86 or more, demonstrating a robust correlation between LCP-Tacro tacrolimus exposure and trough levels. There were three serious adverse events; none were related to study drug and all resolved.

Conclusions: Stable kidney transplant patients can be safely converted from twice-daily Prograf to once-daily LCP-Tacro. The greater bioavailability of LCP-Tacro allows once-daily dosing and similar exposure (AUC) at a dose approximately 30% less than the total daily dose of Prograf. LCP-Tacro displays flatter kinetics characterized by significantly lower peak-trough fluctuations.
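The reported mean conversion ratio of 0.71 corresponds to roughly a 30% dose reduction; a minimal sketch of that arithmetic (illustrative only, since in the trial individual dosing was titrated to trough levels of 5–15 ng/mL):

```python
def lcpt_daily_dose(prograf_total_daily_mg, conversion_ratio=0.71):
    """Once-daily LCP-Tacro starting dose from the total daily
    Prograf dose, using the mean conversion ratio of 0.71 reported
    in the abstract. This is a starting point, not a dosing rule:
    actual doses are adjusted against measured trough levels.
    """
    return prograf_total_daily_mg * conversion_ratio

# e.g. a patient on 5 mg Prograf twice daily (10 mg/day total)
# would start on about 7.1 mg LCP-Tacro once daily.
```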


http://journals.lww.com/transplantjournal/Fulltext/2013/07270/Conversion_From_Twice_Daily_Tacrolimus_Capsules_to.13.aspx


Thursday, July 18, 2013

Cost-Effectiveness of Hand-Assisted Retroperitoneoscopic Versus Standard Laparoscopic Donor Nephrectomy: A Randomized Study

Transplantation - Current Issue

Background: Live kidney donation has a clear economic benefit over dialysis and deceased-donor transplantation. Compared with mini-incision open donor nephrectomy, laparoscopic donor nephrectomy (LDN) is considered cost-effective. However, little is known about the cost-effectiveness of hand-assisted retroperitoneoscopic donor nephrectomy (HARP). This study evaluated the cost-effectiveness of HARP versus LDN.

Methods: Alongside a randomized controlled trial, the cost-effectiveness of HARP versus LDN was assessed. Eighty-six donors were included in the LDN group and 82 in the HARP group. All in-hospital costs were recorded. During follow-up, return to work and other societal costs were documented up to 1 year. The EuroQol-5D questionnaire was administered up to 1 year postoperatively to calculate quality-adjusted life years (QALYs).

Results: Mean total costs from a healthcare perspective were $8,935 for HARP and $8,650 for LDN (P=0.25). Mean total costs from a societal perspective were $16,357 for HARP and $16,286 for LDN (P=0.79). On average, donors completely resumed their daytime jobs on day 54 in the HARP group and on day 52 in the LDN group (P=0.65). LDN resulted in a gain of 0.005 QALYs.

Conclusions: Absolute costs of both procedures are very low, and the differences in costs and QALYs between LDN and HARP are very small. Other arguments, such as donor safety and pain, should determine the choice between HARP and LDN.
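Because LDN was both slightly cheaper and slightly more effective (+0.005 QALYs), it dominates HARP in a standard incremental cost-effectiveness calculation; a sketch using the abstract's societal-perspective figures (the function itself is a generic textbook formula, not from the paper):

```python
def icer(cost_a, qaly_a, cost_b, qaly_b):
    """Incremental cost-effectiveness ratio of intervention A vs. B.

    Returns cost per QALY gained, or the string 'dominant' when A
    is both cheaper and more effective (so no trade-off exists).
    """
    d_cost = cost_a - cost_b
    d_qaly = qaly_a - qaly_b
    if d_cost <= 0 and d_qaly > 0:
        return "dominant"
    return d_cost / d_qaly

# Societal perspective from the abstract: LDN $16,286 with +0.005
# QALYs versus HARP $16,357 -> LDN dominates HARP.
```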


http://journals.lww.com/transplantjournal/Fulltext/2013/07270/Cost_Effectiveness_of_Hand_Assisted.10.aspx


Low-Grade Proteinuria and Microalbuminuria in Renal Transplantation

Transplantation - Current Issue

Nephrotic-range proteinuria has been known for years to be associated with poor renal outcome. Newer evidence indicates that early (1–3 months after transplantation) low-grade proteinuria and microalbuminuria (1) provide information on the graft in terms of donor characteristics and ischemia/reperfusion injury, (2) may occur before the development of donor-specific antibodies, (3) predict the development of diabetes and cardiovascular events, and (4) are associated with reduced long-term graft and patient survival. Low-grade proteinuria and microalbuminuria are also predictive of diabetes, cardiovascular morbidity, and death in nontransplanted populations, which may help us to understand the pathophysiology of low-grade proteinuria or microalbuminuria in renal transplantation. The impact of immunosuppressive medications, including mammalian target of rapamycin inhibitors, on graft survival is still debated, and the effect on proteinuria is crucial to the debate. The fact that chronic allograft rejection may exist as early as 3 months after renal transplantation indicates that optimal management of low-grade proteinuria or microalbuminuria should occur very early after transplantation to improve long-term renal function and the overall outcome of renal transplant recipients. The presence of low-grade proteinuria or microalbuminuria early after transplantation must be taken into account to choose adequate immunosuppressive and antihypertensive medications. Limited information exists regarding the benefit of therapeutic interventions to reduce low-grade proteinuria or microalbuminuria. Whether renin-angiotensin blockade results in optimal nephroprotection in patients with low-grade proteinuria or microalbuminuria is not proven, especially in the absence of chronic allograft nephropathy. Observational studies and randomized clinical trials yield conflicting results. Finally, randomized clinical trials are urgently needed.


http://journals.lww.com/transplantjournal/Fulltext/2013/07270/Low_Grade_Proteinuria_and_Microalbuminuria_in.4.aspx


Monday, July 15, 2013

Immunogenicity of Quadrivalent Human Papillomavirus Vaccine in Organ Transplant Recipients.

AJT

Solid organ transplant recipients are at risk of morbidity from human papillomavirus (HPV)-related diseases. Quadrivalent HPV vaccine is recommended for posttransplant patients, but there are no data on vaccine immunogenicity. We determined the immunogenicity of HPV vaccine in a cohort of young adult transplant patients. Patients were immunized with three doses of quadrivalent HPV vaccine containing viral types 6, 11, 16, and 18. Immunogenicity was determined by type-specific virus-like particle ELISA. Four weeks after the last dose of vaccine, a vaccine response was seen in 63.2%, 68.4%, 63.2%, and 52.6% of patients for HPV 6, 11, 16, and 18, respectively. Factors associated with reduced immunogenicity were vaccination early after transplant (p = 0.019), having a lung transplant (p = 0.007), and higher tacrolimus levels (p = 0.048). At 12 months, there were significant declines in antibody titer for all HPV types, although the number of patients who remained seropositive did not differ significantly. The vaccine was safe and well tolerated. We show suboptimal immunogenicity of HPV vaccine in transplant patients, which is important for counseling patients who choose to receive this vaccine. Further studies are needed to determine the optimal HPV vaccine type and schedule for this population.



http://www.unboundmedicine.com/medline/citation/23837399/Immunogenicity_of_Quadrivalent_Human_Papillomavirus_Vaccine_in_Organ_Transplant_Recipients_


Association Between Calcineurin Inhibitor Treatment and Peripheral Nerve Dysfunction in Renal Transplant Recipients.

AJT

Neurotoxicity is a significant clinical side effect of the immunosuppressive treatment used as rejection prophylaxis in solid organ transplantation. This study aimed to provide insights into the mechanisms underlying neurotoxicity in patients receiving immunosuppressive treatment following renal transplantation. Clinical and neurophysiological assessments were undertaken in 38 patients receiving immunosuppression following renal transplantation: 19 receiving calcineurin inhibitor (CNI) therapy and 19 receiving a CNI-free regimen. Groups were matched for age, gender, time since transplant, and renal function, and compared with normal controls (n = 20). The CNI group demonstrated marked differences in nerve excitability parameters, suggestive of nerve membrane depolarization (p < 0.05). Importantly, there were no differences between the two CNIs (cyclosporine A or tacrolimus). In contrast, CNI-free patients showed no differences from normal controls. CNI-treated patients had a higher prevalence of clinical neuropathy and higher neuropathy severity scores. Longitudinal studies were undertaken in a cohort of subjects within 12 months of transplantation (n = 10); these demonstrated persistence of abnormalities in patients maintained on CNI treatment and improvement in those switched to a CNI-free regimen. The results of this study have significant implications for the selection, or continuation, of immunosuppressive therapy in renal transplant recipients, especially those with pre-existing neurological disability.



http://www.unboundmedicine.com/medline/citation/23841745/Association_Between_Calcineurin_Inhibitor_Treatment_and_Peripheral_Nerve_Dysfunction_in_Renal_Transplant_Recipients_


Proteinuria and Outcome After Renal Transplantation: Ratios or Fractions?

Transplantation - Most Popular Articles

Background: Proteinuria is associated with poorer outcomes in renal transplant recipients. Fractional excretion of total protein (FEPR) may better reflect kidney damage than the urine protein-to-creatinine ratio (PCR).

Methods: We assessed FEPR (FEPR = [serum creatinine × urine protein] / [serum protein × urine creatinine], %) and PCR ([urinary protein / urinary creatinine] × 1000, mg/mM) 1 year after first renal transplantation as predictors of transplant failure. The primary endpoints were transplant failure and death. The tests were compared by constructing receiver operating characteristic (ROC) curves and comparing the areas under the curve. Using ROC analysis, patients were stratified into high- and low-risk groups.

Results: Two hundred nineteen recipients were followed up for a median of 4.9 years. At a median of 2.7 years, 11.4% (n=25) of the transplants had failed. Eight percent (n=17) of the patients died. The area under the curve was higher for FEPR than for PCR (0.92 vs. 0.84). Patients with an FEPR of 0.019% or higher had a 3.4-fold (P=0.003) increased risk of transplant failure and a 2.3-fold (P=0.02) increased risk of death compared with those with an FEPR of less than 0.019%. Patients with a PCR of 97 mg/mM or greater had a 2.1-fold (P=0.04) increased risk of transplant failure and a 1.6-fold (P=0.04) increased risk of death compared with those with a PCR of less than 97 mg/mM. In multivariate analysis with time to transplant failure as the dependent variable, FEPR and PCR were independent predictors of transplant failure (hazard ratios, 1.07 [P=0.013] and 1.03 [P=0.03], respectively).

Conclusions: FEPR and PCR at 1 year are independent predictors of transplant failure, but FEPR may be superior.
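Both indices are defined explicitly in the abstract's Methods; a direct transcription of the two formulas (function names are illustrative, and the inputs must use mutually consistent units):

```python
def fepr(serum_creatinine, urine_protein, serum_protein, urine_creatinine):
    """Fractional excretion of total protein, in %, as defined in
    the abstract: (S_cr x U_prot) / (S_prot x U_cr) x 100.
    The study's high-risk cutoff was FEPR >= 0.019%.
    """
    return (serum_creatinine * urine_protein) / (serum_protein * urine_creatinine) * 100

def pcr(urine_protein, urine_creatinine):
    """Urine protein-to-creatinine ratio in mg/mM, as defined in
    the abstract: (urinary protein / urinary creatinine) x 1000.
    The study's high-risk cutoff was PCR >= 97 mg/mM.
    """
    return urine_protein / urine_creatinine * 1000
```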


http://journals.lww.com/transplantjournal/Fulltext/2013/07150/Proteinuria_and_Outcome_After_Renal.12.aspx