
The Transplantation Society of Australia and New Zealand Annual Scientific Meeting, Brisbane Convention Centre, 7th-9th May, 2017

doi: 10.1097/TXD.0000000000000732
Abstracts

Transplant Complications

INCIDENCE AND RISK FACTORS FOR INTRA-ABDOMINAL COMPLICATIONS FOLLOWING PAEDIATRIC RENAL TRANSPLANT

TAHER Amir1, ZHU Benjamin2, MA Sophia2, and DURKAN Anne3,4

1The Children's Hospital at Westmead, Sydney, Australia, 2Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, Australia, 3Centre for Kidney Research, University of Sydney, 4Department of Renal Medicine, The Children's Hospital at Westmead, Sydney, Australia

Introduction: Complications following renal transplantation and their subsequent management remain major sources of morbidity, yet data in the paediatric population are limited.

Methods: A retrospective review of all paediatric transplants, from 1995 to 2016 was undertaken. Intra-abdominal complications were grouped into fluid collections, gastrointestinal, vascular and urogenital categories. Donor, recipient, and transplant characteristics were evaluated using univariate and multivariate logistic regression.
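
For reference, the logistic model underlying both the univariate and multivariate analyses has the standard form (a general formulation, not specific to this study):

\log\frac{p}{1-p} = \beta_0 + \beta_1 x_1 + \dots + \beta_k x_k, \qquad \mathrm{OR}_j = e^{\beta_j}

where p is the probability of an intra-abdominal complication; the univariate models contain a single covariate, while the multivariate model adjusts each odds ratio for the remaining covariates.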

Results: There were 163 transplants performed in 154 patients. Seventeen transplants were excluded due to lack of follow-up. There were 89 (61%) males, with a mean follow-up of 4.6 ± 3.7 years. The mean weight at transplant was 31.5 ± 16.5 kg and 16% weighed less than 15 kg. Thirty-four (23%) patients had undergone previous abdominal surgery. The mean age at transplant was 9.9 ± 5.0 years. There were 32 complications identified in 27 (19%) of the included 146 transplants.

Fluid collections requiring surgical drainage developed in 9 (6.2%). Twelve (8.2%) developed gastrointestinal complications (bowel obstruction, incisional hernia, ileus and volvulus). There were 5 (3.5%) vascular complications. Urogenital complications (ureteric obstruction and stenosis, urethral obstruction and a calcified JJ stent) occurred in 6 patients (4.1%). There were 3 graft losses, all following renal vein thrombosis. Data from the univariate analysis are shown in Table 1. Risk factors in the multivariate analysis were weight <15 kg (p = 0.003) and previous abdominal surgery (p = 0.008).

Table

Conclusion: Surgical complications occur in almost 1 in 5 paediatric renal transplant recipients. Weight below 15 kg and previous abdominal surgery are risk factors for developing intra-abdominal complications.


THE SIGNIFICANCE OF BILIRUBIN. RISK FACTORS FOR THE DEVELOPMENT OF BILIARY STRICTURES POST LIVER TRANSPLANT IN QUEENSLAND, AUSTRALIA: A RETROSPECTIVE COHORT STUDY

FORREST Elizabeth1,2, REILING Janske3,4,5,6, LIPKA Geraldine1, and FAWCETT J3,1,7

1Queensland Liver Transplant Service, Princess Alexandra Hospital, Brisbane, 2Resident Medical Officer, Gold Coast Hospital and Health Service, 3School of Medicine, University of Queensland, Brisbane, 4Other, ANZDATA, 5Other, Greenslopes Private Hospital, Brisbane, Australia, 6Department of Surgery, NUTRIM - School for Nutrition and Translational Research in Metabolism. Maastricht University, Maastricht, the Netherlands, 7PA Research Foundation, Princess Alexandra Hospital, Brisbane

Aims: Biliary stricture formation post liver transplantation is a frequent cause of morbidity and mortality. The aim of this study was to identify risk factors associated with the formation of biliary strictures post liver transplantation in the state of Queensland, Australia.

Methods: Data on liver donors and recipients in Queensland between 2005 and 2014 were obtained from an electronic patient data system. In addition, intra-operative and post-operative characteristics were collected and a logistic regression analysis was performed to evaluate their association with the development of biliary strictures.

Results: Of 296 liver transplants performed, biliary strictures developed in 45 (15.2%) recipients. Anastomotic stricture formation (n=25, 48.1%) was the commonest complication, with 14 (58.3%) of these occurring within 6 months of transplant. A percutaneous approach or ERCP was used to treat 17 patients with biliary strictures. Biliary reconstruction was initially or ultimately required in 22 patients. In recipients developing biliary complications, bilirubin was significantly increased within the first post-operative week (day 7 bilirubin 74μmol/L versus 49μmol/L, p=0.012). In both univariate and multivariate regression analyses, day 7 bilirubin >55μmol/L was associated with the development of biliary strictures (Table 1). Hepatic artery thrombosis and primary sclerosing cholangitis were identified as independent risk factors.

Conclusions: This study demonstrated an overall 15% biliary stricture rate following transplantation. In addition to known risk factors for biliary strictures post liver transplant, bilirubin levels in the early post-operative period could serve as a predictive tool, raising suspicion of biliary stricture formation.

TABLE 1


TOWARDS AN ANAL CANCER SCREENING PROGRAM FOR TRANSPLANT RECIPIENTS

ROSALES Brenda1, LANGTON LOCKTON Julian2, ROBERTS J3, TABRIZI SN4, GRULICH A5, HILLMAN RJ6,7, and WEBSTER Angela1,8

1School of Public Health, University of Sydney, 2Blue Mountains and Nepean Hospitals Sexual Health and HIV Clinic, 3Department of Microbiology and Infectious Diseases, Douglas Hanly Moir Pathology, Sydney, Australia, 4Department of Microbiology and Infectious Diseases, Regional HPV Labnet, 5, 6Western Sydney Sexual Health Centre, Westmead Hospital, Sydney, 7University of Sydney, 8Centre for Kidney Research, Westmead Hospital, Sydney

Aim: Solid organ transplant recipients have anal cancer rates ten times higher than the general population. High-risk human papillomavirus (HPV) genotypes cause the majority of anal cancers. We evaluated the presence of HPV and anal cytological changes in transplant recipients.

Methods: Renal transplant recipients attending Westmead Hospital, Sydney, were recruited from October 2014. Anal liquid-based Papanicolaou (Pap) tests were obtained and tested for cytological changes and the presence of HPV genotypes.

Results: Of 80 eligible participants approached, 50 (63%) consented to join the study. The mean age was 49 years (median 47, range 20-76) and all were HIV negative. 40 (80%) of the 50 cytological samples were technically satisfactory. Of these, 5 (12.5%) were consistent with low-grade anal squamous intraepithelial lesions (LSIL), 2 (5%) with high-grade anal squamous intraepithelial lesions (HSIL) and 33 (82.5%) were negative. To date, 42 (84%) of the 50 specimens have been HPV genotyped, 39 (93%) of which were assessable. Of these, HPV DNA was detected in 4 (10%): two high-risk types (35 and 59) and two low-risk types (62 and 81). The single sample with HSIL cytology had HPV type 62. Table 1 details the results of participants in whom both HPV and cytology were assessable.

Conclusions: Transplant recipients were generally willing to undergo anal swabbing. The rate of anal cytological abnormalities was substantially higher than that typically found in cervical screening programs, and testing for HPV could potentially identify further individuals at risk of anal cancer.

TABLE 1


IDENTIFICATION OF RISK FACTORS FOR THROMBOSIS RATES IN SIMULTANEOUS PANCREAS AND KIDNEY TRANSPLANTATION

SPIKE Erin1, SHAHRESTANI Sara2,1, BLUNDELL Jian2, ANNINK Christopher E2, GIBBONS Thomas J2, YUEN Lawrence1, RYAN Brendan1, LAM Vincent1,2, ROBERTSON Paul3, PLEASS Henry2,1, and HAWTHORNE Wayne J2,1,4

1Department of Surgery, Westmead Hospital, Sydney, 2Western Clinical School, University of Sydney, 3National Pancreas Transplant Unit, Westmead Hospital, Sydney, 4Centre for Transplant and Renal Research, Westmead Institute for Medical Research

Introduction: Simultaneous pancreas and kidney (SPK) transplantation improves survival for patients with type 1 diabetes and end-stage renal failure. Pancreatic allograft thrombosis is a leading cause of graft failure, and the main surgical complication, with thrombosis rates approaching 30% in some series. Previously identified risk factors for pancreatic graft thrombosis include donor obesity, increased duration of donor brain death, cardiovascular or cerebrovascular cause of death, and some types of arterial reconstruction.

Aims: To identify potential risk factors for early (<2 weeks) pancreatic allograft thrombosis after SPK transplant.

Methods: A retrospective analysis of all SPK transplants performed at Westmead Hospital between January 2010 and December 2015 was conducted. Data were collected on potential risk factors for early pancreatic allograft thrombosis, including donor and recipient demographics and surgical and transplant-related variables.

Results: One hundred and forty-two SPK transplants were performed between January 2010 and December 2015. Early pancreatic allograft thrombosis occurred in 20 of 142 cases (14%). Graft thrombosis led to re-laparotomy in 10 of 142 cases (7%), with graft pancreatectomy required in 6 of 142 cases (4%). Average cold ischaemia time was longer in those with early graft thrombosis (mean: 732 minutes, median: 772 minutes) than in those without (mean: 683 minutes, median: 670 minutes). Graft vascular reconstruction methods were similar between groups.

Conclusions: SPK transplantation provides improved survival and quality of life in patients with type 1 diabetes and end stage renal failure. However, it carries a significant technical failure rate. Prolonged cold ischaemia time is a potential risk factor for graft thrombosis.


THE SEROPREVALENCE OF HEPATITIS E IN A RENAL TRANSPLANT POPULATION

MCGINN Stella1, NEWCOMBE James2, WONG Bruce2, DARBAR Archie2, and KOTSIOU George2

1Renal Transplant Unit, Royal North Shore Hospital, Sydney, 2Department of Infectious Diseases, Royal North Shore Hospital, Sydney

Hepatitis E virus (HEV) is a non-enveloped RNA virus of which there are 4 genotypes. HEV can be transmitted by the faecal-oral route, blood transfusion or plasma exchange. HEV infection is often asymptomatic and self-limiting; however, genotype 3 can progress to chronic infection in over 60% of immunocompromised patients. The seroprevalence of HEV in solid organ transplant recipients in Europe, North America and South America is 6-39%, with 1.0-2.3% demonstrating chronic infection. In Australia, the seroprevalence of HEV in blood donors is 6%, but the seroprevalence in renal transplant recipients is unknown.

Aim: To determine the seroprevalence of hepatitis E infection in a renal transplant unit.

Methods: A seroprevalence study of hepatitis E IgG was conducted in renal transplant recipients at Royal North Shore Hospital using ELISA on stored serum. Seropositive participants were surveyed for epidemiological risk factors and tested for chronic infection using Hepatitis E RNA PCR.

Results: The seroprevalence of hepatitis E was 8.1% in 74 patients with 1.4% having evidence of recent exposure post-transplant. No seropositive participants had evidence of chronic hepatitis E infection or significant liver disease. Genotype analysis was not available. Pork and shellfish consumption and blood transfusion were possible risk factors.

Conclusions: Renal transplant recipients demonstrate exposure to HEV, but no chronic infection was demonstrated in this study. High-risk food avoidance, pre-emptive hepatitis E testing and blood product screening may reduce the risk of hepatitis E infection in solid organ transplant recipients in Australia.


CARDIOVASCULAR DISEASE RISK FACTORS IN LIVER TRANSPLANT RECIPIENTS: HOW ARE THEY MANAGED?

MARSH Lauren1, MCDOWALL Kirsty2, and DICKINSON Kacie1

1Discipline of Nutrition and Dietetics, Flinders University, Adelaide, 2Department of Dietetics and Nutrition, Flinders Medical Centre, Adelaide

Aims: The aim of this study was to investigate liver transplant unit health care providers' practices, and barriers to practice, in the management of post-transplant weight gain and cardiovascular health.

Methods: A 27-item electronic survey comprising multiple-choice questions and free-text responses was distributed to health professionals working in liver transplant units in Australia, New Zealand, the United Kingdom and North America. The survey had three components: the first focused on respondent demographics and awareness of cardiovascular risk post-transplant, the second on patient management, guideline usage and follow-up, and the third on practitioners' opinions on management and barriers to practice.

Results: 37 completed survey responses were obtained. Evidence-based guideline usage was lacking, with only 11 (29.7%) respondents following evidence-based practice guidelines to minimise cardiovascular disease risk factors. Long-term post-transplant patient follow-up and monitoring of weight status and cardiovascular health also decreased. Few (13.5%) respondents agreed that their unit's current management was optimal to prevent post-transplant weight gain, and 6 (16.2%) agreed that practices were optimal to address post-transplant cardiovascular disease risk factors. 28 (75.7%) respondents identified time as a barrier to management, 27 (73.0%) staffing, 19 (51.4%) funding and 18 (48.6%) resources.

Conclusions: While practitioners working with liver transplant patients are aware of cardiovascular disease risk factors post-transplant, many agree that management is suboptimal due to a number of barriers. Guideline usage and patient management are limited, and long-term follow-up and monitoring practices are unclear; these should be explored further to address and reduce the long-term burden of cardiovascular disease in this patient group.


OUTCOMES OF SCREENING FOR BK VIRAEMIA AND BK NEPHROPATHY IN RENAL TRANSPLANT RECIPIENTS: A SINGLE CENTRE COHORT STUDY

JAYASINGHE Kushani, GARRY Lorraine, GRACEY David, CHADBAN Steven, WAN Susan, and WYBURN Kate

Nephrology and Renal Transplant, Royal Prince Alfred Hospital, Sydney

Aims: There are limited Australian data regarding intermediate- to long-term outcomes in patients with BK viraemia. Effects of routine screening are not well established. Our aim was to describe the burden of BK viraemia (BKV) and BK nephropathy (BKN) in a large single transplant centre.

Methods: We conducted a retrospective cohort study of 526 patients transplanted between 2008 and 2015, when routine screening (at 3 and 12 months post transplantation) was established.

Results: 71 patients (13%) developed BKV and 20 (4%) developed BKN during the study period. The median follow-up was 50.9 months (IQR 28.4-82.2), with a minimum follow-up of 12 months. More than 95% of patients were screened for BK at 3 months. All but two patients with BKN had intermediate or high levels of BKV (>1000 copies/mL). The majority of patients had basiliximab induction (73%) and maintenance tacrolimus, mycophenolate and prednisolone (94%). Race was strongly associated with BKV (p<0.001), with Asian/Indian and Aboriginal/Pacific Islander groups both at increased risk compared to Caucasians (OR for Asian/Indian versus Caucasian 2.49; 95%CI 1.20-5.16). There was a trend toward higher rates of BKV with increasing HLA mismatch (p=0.065). Acute rejection and thymoglobulin use were not associated with BKV (OR 1.16, p=0.751; OR 0.71, p=0.349, respectively).

Conclusions: BKV was detected in 13% of our cohort and Asian and Indian race was associated with a significantly increased risk. Routine screening for BKV is effective and enables optimal management of immunosuppression to minimise progression.


PRIMARY CENTRAL NERVOUS SYSTEM POST-TRANSPLANTATION LYMPHOPROLIFERATIVE DISORDER: A CASE SERIES OF RENAL TRANSPLANT RECIPIENTS

JAYASINGHE Kushani1, MANSER David1, WYBURN Kate1, GRACEY David1, TAI Edward2, ERIS Josette1, and CHADBAN Steven1

1Renal & Transplantation Unit, Royal Prince Alfred Hospital, Sydney, 2Nephrology and Renal Transplant, Wagga Wagga Base Hospital

Aim: Primary central nervous system post-transplantation lymphoproliferative disorder (PCNS-PTLD) is a rare complication of immunosuppression. Data are limited to case series. We aimed to identify our cases of PCNS-PTLD and describe their clinical and pathological features and outcomes.

Methods: We searched the unit database and ANZDATA records and reviewed case records. We included recipients transplanted between 1990 and 2016 who were followed at our unit.

Results: Ten patients with PCNS-PTLD and no evidence of systemic PTLD were identified and examined. Median age was 52 (range 31-79). Median time to diagnosis after transplant was 13 years (range 1-36). Follow-up was complete for all but one patient (lost to follow-up since 2011). Of these, four died: one was palliated after diagnosis of PCNS-PTLD due to multiple comorbidities. Another withdrew treatment three years after diagnosis, aged 82, due to frailty. A third, a nursing-home patient, died three years following PTLD from aspiration pneumonia. The fourth patient withdrew from dialysis after a prolonged admission for presumed progressive multifocal leukoencephalopathy 16 years post PTLD. Three patients remained in remission at the time of death. All six surviving patients remain in remission and four retain graft function with a median eGFR of 54ml/min/1.73m2 (range 51-63). Two patients incurred graft loss, one at the time of PTLD diagnosis and the second at twelve months due to chronic allograft nephropathy; the latter was re-transplanted 7 years after curative therapy.

Conclusions: Over half of patients with PCNS-PTLD remain in remission, most of whom have ongoing graft function. Younger patients with less co-morbidity had better survival.


BK VIRURIA LEVELS WITHIN 3 MONTHS POST RENAL TRANSPLANTATION CAN PREDICT THE DEVELOPMENT OF BK VIRAEMIA

LIOUFAS Nicole1, MASTERSON Rosemary1,2, and HUGHES Peter1,2

1Department of Nephrology, Royal Melbourne Hospital, 2Department of Medicine, University of Melbourne

Aim: To determine the utility of screening for BK viruria (BKVR) to predict the development of BK viraemia within the first year of renal transplantation.

Background: BK virus infection affects 15% of renal transplants within the first year post transplantation. Preventative strategies involve screening for the development of BK viraemia using serum PCR, with immunosuppression reduction following detection. There are few data regarding the predictive value of BKVR for the development of BK viraemia.

Method: A retrospective single centre cohort study of patients transplanted from January 2013 to October 2015 with a minimum of 12 months follow-up. Data collected included demographics, transplant characteristics and results of routine screening for BKVR and viraemia.

Results: 182 patients underwent BK viral screening. 62 patients (34%) had viruria, 40 (22%) viraemia and 30 (16%) had histological evidence of tubulitis/BK nephropathy (SV40 positive or recent rejection). Of those with BKVR, 40 patients (61%) developed viraemia within the first twelve months post transplantation (p<0.05); 37 of these (92.5%) were also positive for BKVR at 3 months, with viruria occurring at a median of 26 days (range 10-34). BKVR had greater sensitivity and specificity for the prediction of BK viraemia at 2 months (sensitivity 77%, specificity 95%, PPV 82% and NPV 93%) compared with one month (sensitivity 40%; specificity 90%; PPV 82%; NPV 86%).
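
The test characteristics reported here follow the standard 2x2 definitions, taking subsequent BK viraemia as the reference outcome (TP/FP/TN/FN = true/false positives/negatives):

\mathrm{sensitivity}=\frac{TP}{TP+FN},\quad \mathrm{specificity}=\frac{TN}{TN+FP},\quad \mathrm{PPV}=\frac{TP}{TP+FP},\quad \mathrm{NPV}=\frac{TN}{TN+FN}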

Conclusion: BKVR preceded the development of BK viraemia, and its levels could be predictive of the subsequent development of viraemia.


LEFLUNOMIDE THERAPY FOR BK VIRUS INFECTION AMONG RENAL TRANSPLANT RECIPIENTS: CAN WE MOVE FORWARD?

JAMBOTI Jagadish S1,2, IRISH Ashley1,2,3, HO Sharon1, PUTTAGUNTA Harish1, and AUNG Nyi1

1Nephrology and Renal Transplant, Fiona Stanley Hospital, WA, 2School of Medicine & Pharmacology, University of Western Australia, Perth, 3WA Liver & Kidney Transplant Service

Aims: To evaluate the efficacy and safety of leflunomide in the treatment of BK virus nephropathy (BKVN) and BK viruria among renal transplant recipients (RTR).

Methods: Retrospective single-centre analysis of data from RTR treated with leflunomide for biopsy-proven BKVN or BK viruria.

Results: 6 RTR (mean age 50 years; 2 males) received leflunomide for periods ranging from 6-60 months, along with immunosuppression minimisation, for management of BKVN (n=3) or BK viruria (n=3). All RTR with BKVN were sensitised re-transplants, had concurrent acute rejection, and received pulse methylprednisolone. IVIG was used in 2 RTR with BKVN. At diagnosis, the BKVN patients were switched from mycophenolate to leflunomide along with a decreased dose of tacrolimus or changeover to cyclosporin or sirolimus (1 patient each). The 3 RTR with BK viruria without evidence of BKVN were switched from mycophenolate to leflunomide along with a decreased dose of tacrolimus. Lower doses of leflunomide (10-20mg per day) were used in all patients and were well tolerated. Side effects were rare and consisted of leukopenia (2 patients) and hypertension (1 patient). Routine monitoring of metabolites was not done. All 6 RTR showed clearance of, or excellent reduction in, viruria/viraemia on treatment with leflunomide, with stable kidney allograft function (see table).

Conclusions: The use of leflunomide, which has immunosuppressive and anti-BK viral properties, to replace mycophenolate, alongside a global reduction in immunosuppression, was well tolerated and associated with reduction or clearance of BK viraemia and viruria, whilst allograft kidney function remained stable. Randomized controlled studies comparing leflunomide with mycophenolate in RTR are warranted.

Table


BK VIRUS-ASSOCIATED NEPHROPATHY AND HAEMORRHAGIC CYSTITIS IN A LUNG TRANSPLANT RECIPIENT

VIECELLI AK1,2, CHAMBERS D3,2, OLIVER K4, and FRANCIS R1,2

1Department of Nephrology, Princess Alexandra Hospital, Brisbane, 2School of Medicine, University of Queensland, Brisbane, 3Queensland Centre for Pulmonary Transplantation and Vascular Disease, Prince Charles Hospital, Brisbane, 4Department of Pathology, Princess Alexandra Hospital, Brisbane

Background: BK polyomavirus-associated nephropathy (BKVAN) is a common cause of renal dysfunction in kidney transplant recipients. In non-renal solid organ transplantation, impaired renal function is typically attributed to calcineurin-inhibitor nephrotoxicity and BKVAN is not commonly reported despite the higher immunosuppressive burden.

Case report: We describe a 35-year-old male who developed BKVAN in his native kidneys five years after bilateral lung transplantation for cystic fibrosis. He had received induction with basiliximab and was maintained on immunosuppressive therapy with tacrolimus (trough levels 5-12μg/L), everolimus (trough levels 10-12μg/L), prednisolone (10mg daily) and mycophenolate mofetil (MMF, 1000mg twice daily) for chronic lung rejection. Despite dialysis-dependent acute kidney injury immediately post-transplantation, his renal function had stabilised at an estimated GFR of 70 to 80ml/min/1.73m2 (creatinine 80-90μmol/L). Nineteen months following investigation of intermittent haematuria and diagnosis of BKV-associated haemorrhagic cystitis, he presented with deteriorating renal function (estimated GFR ~50ml/min/1.73m2, creatinine 130-150μmol/L) in the absence of proteinuria, haematuria, or obstruction. BK viraemia (1.4x10^5 copies/mL) was noted and a native kidney biopsy confirmed the diagnosis of BKVAN, with positive staining of tubular cells for SV40 and acute tubular necrosis. Despite stepwise reduction of his immunosuppressive therapy (MMF 500mg twice daily, prednisolone 7.5mg, and reduced target trough levels of 5-7μg/L for tacrolimus and everolimus), BK viraemia and progressive renal impairment are ongoing.

Conclusion: This rare case of haemorrhagic cystitis and BKVAN in a lung transplant recipient highlights the need to consider BK virus reactivation in patients with non-renal solid organ transplants who develop renal impairment.


IDENTIFICATION OF POTENTIAL SOURCES OF BACTERIAL CONTAMINATION OF ORGAN PERFUSION FLUID IN SIMULTANEOUS PANCREAS KIDNEY (SPK) TRANSPLANTS

SHAHRESTANI Sara1,2, GOIRE Namraj2, ROBERTSON Paul3, KABLE Kathy3, DAVIES Sussan4, PLEASS Henry5, and HAWTHORNE Wayne4,5

1Western Clinical School, University of Sydney, 2School of Medicine, University of Sydney, 3Renal Transplant Unit, Westmead Hospital, Sydney, 4Centre for Transplant and Renal Research, The Westmead Institute, 5Department of Surgery, Westmead Hospital, Sydney

Aim: The assumption of sterility in organ retrieval operations is of particular importance given that transplant recipients are heavily immunocompromised and susceptible to significant infections. The aims of the present work were to determine whether the organ donor operation contributes to contamination of the donor organs that are subsequently transplanted, and to identify means of reducing potential infection rates.

Methods: We performed a retrospective review of potential bacterial contamination of organ perfusion fluid from all kidney and pancreas transplants at Westmead Hospital, Sydney between January 2012 and June 2016. Rates of culture by standard methods and BACTEC were compared, as well as microbiological contamination in SPK versus kidney alone transplants and donation after cardiac death (DCD) versus brain death (DBD) transplants.
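
As an illustration of how positivity rates between the two culture methods can be compared, a chi-squared test on the 2x2 table of counts would suffice; the sketch below uses hypothetical counts, not the study data:

# Hedged sketch: compare culture positivity between standard agar and BACTEC.
# The counts below are illustrative placeholders, not the Westmead data.
from scipy.stats import chi2_contingency

table = [[34, 224],    # standard agar: [positive, negative] (hypothetical)
         [172, 86]]    # BACTEC: [positive, negative] (hypothetical)
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4g}")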

Results: 355 recipients of kidney, pancreas or SPK transplants were reviewed, of whom 258 had cultures performed on organ perfusion/transport media. Only 13.2% of organ transport media cultured on standard agar were positive, whereas 66.7% of those cultured on BACTEC were positive for bacterial contaminants. Sources of organisms identified by BACTEC included skin contaminants (63%), enteric (16%), respiratory/oral (14.7%) and environmental (7%). Contaminants of SPK transplants were twice as likely to be enteric (12%) compared with kidney-alone transplants (6%). DCD transplants had a higher rate of skin flora contaminants (30%) relative to DBD transplants (18%).

Conclusion: Microbiological surveillance of transplant recipients should ideally start at the first point of contact. Identification of ways to reduce any potential for infection is paramount. While enteric contamination is more likely in SPK transplants, skin flora contamination is more likely in DCD transplants. We will discuss implications and proposed strategies for reducing contamination.


CHANGE IN TASTE FUNCTION FOLLOWING KIDNEY AND LIVER TRANSPLANTATION

MURRAY Eryn1,2, HICKMAN Ingrid1,3, MCCOY Simone1, and CAMPBELL Katrina1

1Nutrition and Dietetics, Princess Alexandra Hospital, Brisbane, 2School of Medicine, University of Queensland, Brisbane, 3Mater Research Institute, University of Queensland, Brisbane

Background: Taste dysfunction is documented in end-stage disease. Improvements in taste function following transplantation may facilitate an enhanced food experience and weight gain.

Aim: To measure change in taste function following kidney and liver transplantation.

Methods: Assessments were conducted prior to and at six months post-transplant, with cross-sectional comparison to healthy controls (matched for age and BMI). Taste was assessed according to International Organization for Standardization (ISO) methodology for five basic tastes (salty, sweet, sour, bitter, umami), using the McNemar test (recognition) or paired t-test (sensitivity) for repeated measures of change, and the t-test or chi-squared test for comparison against controls.
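
A minimal sketch of the paired tests named above, using the scipy/statsmodels implementations with hypothetical data (not the authors' code or data):

# McNemar test for paired change in taste recognition (correct/incorrect).
import numpy as np
from scipy.stats import ttest_rel
from statsmodels.stats.contingency_tables import mcnemar

# Rows = pre-transplant, columns = post-transplant (hypothetical counts).
recognition = np.array([[20, 5],   # correct pre: stayed correct / became incorrect
                        [9, 0]])   # incorrect pre: became correct / stayed incorrect
print(mcnemar(recognition, exact=True).pvalue)

# Paired t-test for change in taste sensitivity scores (hypothetical values).
pre = np.array([5.1, 6.0, 4.8, 5.5])
post = np.array([5.6, 6.2, 5.0, 6.1])
print(ttest_rel(pre, post).pvalue)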

Results: 34 recipients (n=17 kidney; n=17 liver) were aged 47.2±15.8 years, 62% male, with BMI (mean±SD) 26.3±5.1kg/m2. Taste recognition was lower in advanced kidney and liver disease, with a reduced ability to correctly identify all five tastes (12% vs 30%) and a mean number correctly identified (3.2±1.1 versus 3.8±1.1, p=0.03) lower than controls. When tastes were considered separately, there was impaired recognition of sweet and bitter (Figure 1), the latter of which did not improve following transplant, and reduced sensitivity to umami (5.2±1.3 vs 6.2±2.0, p<0.05) in those with chronic liver disease. Overall, transplantation did not result in a significant improvement in taste.

Conclusion: Taste recognition is diminished in end-stage kidney and liver disease. While there does not appear to be a significant improvement in taste by six months post-transplant, other improvements in food hedonics (e.g. smell or dietary freedom) may explain reported improvements in the overall food experience.

FIGURE 1


EVOLUTION OF GLYCAEMIC CONTROL AND VARIABILITY AFTER KIDNEY TRANSPLANT

AOUAD Leyla1, CLAYTON Philip2,3, and CHADBAN Steven1,4

1Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 2Renal & Transplantation Unit, Royal Adelaide Hospital, 3University of Adelaide, 4School of Medicine, University of Sydney

Background: Glycaemic changes are common after kidney transplant but the evolution of these changes has not been described. Our aim was to prospectively examine patterns of glycaemic control and variability over time from transplantation using continuous glucose monitoring (CGM).

Methods: Kidney transplant recipients were fitted with CGM devices for 3-5 days at the time of transplant, and at months 3 and 6 post-transplant. Indices of glucose control (mean glucose, percent time in the hyperglycaemic range and GRADE score) and variability (SD, CV and MAGE) were calculated. An OGTT was also performed at month 3.
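
The indices named here are standard CGM summaries; as commonly formulated in the literature (the abstract does not state the exact definitions used), the coefficient of variation is \mathrm{CV} = \mathrm{SD}/\bar{g} for mean glucose \bar{g}; the GRADE score of Hill et al. averages 425\,[\log_{10}(\log_{10} g)+0.16]^2 over readings g in mmol/L; and MAGE is the mean amplitude of glycaemic excursions, i.e. the mean of excursions exceeding one SD of the glucose trace.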

Results: 28 patients (mean age 45±15yrs) were enrolled, 64% male and 75% Caucasian. Of 24 patients with usable CGM data at month 0, 3 had prior diabetes and 6 (25%) developed post-transplant diabetes (PTDM). Hyperglycaemia (>11.1mM) was evident in 79% immediately after transplant. Compared to patients without diabetes, those with prior diabetes had higher mean glucose [7.8mM (95%CI 7.4-8.2) vs 9.9mM (95%CI 8.9-10.8), p<0.001], GRADE score [4.5 (95%CI 3.7-5.4) vs 7.8 (95%CI 5.6-10.4), p=0.003] and percent time with hyperglycaemia. Indices were also higher in those who subsequently developed PTDM [mean glucose: 7.8mM (95%CI 7.4-8.2) vs 8.8mM (95%CI 8.2-9.4), p=0.004; GRADE: 4.5 (95%CI 3.7-5.4) vs 6.2 (95%CI 5.2-7.7), p=0.04]. Glucose variability was increased in patients with prior diabetes [SD: 1.99 (95%CI 1.72-2.27) vs 2.97 (95%CI 2.27-3.67), p=0.006] but not in those with PTDM. All measures of glucose control and variability improved significantly over time from transplantation (Table 1).

Table

Conclusions: Patients who develop PTDM demonstrate early poor glycaemic control without alterations in glycaemic variability. Glycaemic changes after transplant improve with time.


ENDOTHELIAL GLYCOCALYX BREAKDOWN PRODUCTS ARE BIOMARKERS FOR PRIMARY GRAFT DYSFUNCTION AFTER LUNG TRANSPLANTATION

SLADDEN Timothy1,2, YERKOVICH Stephanie2,1, HOPKINS Peter2, TAN Maxine2, and CHAMBERS Daniel2,1

1School of Medicine, University of Queensland, Brisbane, 2Queensland Centre for Pulmonary Transplantation and Vascular Disease, Prince Charles Hospital, Brisbane

Aims: Primary graft dysfunction (PGD) is a major cause of morbidity and mortality and is characterized by vascular leak. The endothelial glycocalyx (EG) is a meshwork of glycosaminoglycans, anchored to the endothelium, which provides the major barrier to trans-vascular fluid shifts. The EG is very sensitive to damage by ischaemia-reperfusion injury and inflammation. We hypothesised that during lung retrieval and transplant, damage occurs to the EG, resulting in PGD and poorer outcomes in transplant recipients. Our aim was to measure key EG breakdown products in the peripheral blood of transplant patients and relate them to early outcomes.

Methods: Hyaluronan, heparan sulphate and syndecan-1 were measured by ELISA pre-transplant and daily post-transplant for 4 days. PGD scores were determined as per ISHLT guidelines.

Results: 53 patients undergoing bilateral lung transplant at The Prince Charles Hospital from 2013 to 2015 were studied (female 54.7%; age 48.6 years, IQR 38.4-61.2; PGD grade ≥2 at 24hrs n=18, 34%, and at 72hrs n=17, 32%). Increased EG breakdown products, measured on days 1 to 4, were positively correlated with ventilation times and length of stay in ICU. Syndecan-1 was also significantly higher in patients with PGD grade ≥2 compared to those with PGD grade 0-1. Hyaluronan was not associated with PGD.

Conclusions: EG dysfunction may underpin pulmonary vascular dysfunction in the immediate post-transplant period. Furthermore, EG breakdown products may be useful biomarkers for the diagnosis of PGD. Understanding the pathology behind EG shedding may lead to the development of new treatments for PGD.


DONOR SYNDECAN-1 LEVELS ARE ASSOCIATED WITH ORGAN REJECTION FOLLOWING KIDNEY TRANSPLANT

SLADDEN Timothy1,2, YERKOVICH Stephanie3,1, JAFFREY Lauren4, ISBEL Nicole4,1, and CHAMBERS Daniel2,1

1School of Medicine, University of Queensland, Brisbane, 2Queensland Centre for Pulmonary Transplantation and Vascular Disease, Prince Charles Hospital, Brisbane, 3Queensland Centre for Pulmonary Transplantation and Vascular Disease, University of Queensland, Brisbane, 4Department of Nephrology, Princess Alexandra Hospital, Brisbane

Aims: Delayed graft function (DGF) is a common cause of morbidity post-kidney transplant. Kidney reperfusion studies have demonstrated that shedding of the endothelial glycocalyx (EG), a glycosaminoglycan structure on the luminal surface of blood vessels, can be associated with reduced graft function. We hypothesised that EG damage in organ donors may contribute to DGF in transplant recipients. Our aim was to measure EG breakdown products in the peripheral blood of organ donors and relate this to recipient outcomes.

Methods: Hyaluronan, heparan sulphate, syndecan-1 and CD44 were measured by ELISA from stored peripheral blood of Queensland kidney donors consented to research (2009-2015). Recipient outcomes were from the Princess Alexandra Hospital, Woolloongabba.

Results: 189 kidney donors had samples collected (female 45.5%; mean age 44.1 (SD 17.2) yrs; DCD 30%; HTN 5.3%; DM 19.8%). 331 corresponding kidney recipients were studied (female 40%; age 48.8 (SD 16) yrs; polycystic kidney disease 15.7%; DGF grade 4 23.6%; rejection at 1 month 11.8%; rejection ever 15.4%). There was no association between any of the measured EG breakdown products and DGF grade. However, higher syndecan-1 levels were associated with an increased prevalence of rejection within the first month and of rejection ever. This association was not evident for the other EG breakdown products measured.

Conclusion: While we found no association between EG breakdown in the donor and DGF in the recipient, increased syndecan-1 levels were associated with an increased risk of kidney rejection. We will be further analysing this relationship for longer term outcomes and kidney donor risk index associations.

Table


POST-RENAL TRANSPLANT UROLITHIASIS IN AUSTRALIAN CHILDREN: 20-YEAR INCIDENCE AND ANALYSIS OF RISK FACTORS

MA Sophia1, TAHER Amir1, ZHU Benjamin1, and DURKAN Anne1,2

1Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 2Department of Pediatrics, University of Sydney

Introduction: Urolithiasis is increasingly common in both adults and children. In renal transplant (RTx) recipients, stones may cause obstruction, with potential loss of graft function. We sought to evaluate characteristics of urolithiasis post-RTx in a paediatric population.

Methods: All patients transplanted at our hospital between Nov 1995 and Oct 2016 were included if they had at least 12 months of follow-up data. A retrospective review of medical records and radiology images was performed. Data are expressed as mean (SD) or median (IQR).

Results: There were 163 RTx performed in 154 patients, of which 148 were eligible for inclusion. Stone(s) were identified in 8 (5.4%) children, and 4 had more than one stone. No child had calculi pre-transplant. All stone-formers were male, with a median age at transplant of 4.5 (6.2) years and weight 15.4 (13.3) kg. Seven received a living related kidney. Haematuria (n=5), pain (n=4) and UTI (n=2) were presenting features. Time to presentation was bimodal: 3 stones were identified in the first 3 months and the remainder at 31-53 months post-RTx. Two stones were associated with persisting suture material. The mean stone size was 8.36 (4.4) mm. Five stones were analysed: all contained calcium oxalate; three were mixed, including 1 with uric acid. Five children had elevated urinary calcium. The median urinary pH was 6.0 (1.05) prior to diagnosis. Treatment included cystolithotripsy (n=5), extracorporeal shockwave lithotripsy (n=1) and combined citrate therapy (n=4). No grafts were lost due to calculi.

Conclusion: In children, the incidence of stones post-transplant is increasing. Haematuria and lower abdominal pain should raise the index of clinical suspicion.


THE DIFFICULTIES OF RENAL TRANSPLANTATION IN PATIENTS WITH PRIOR MESH HERNIA REPAIR

GOH David1,2, and MICHELL Ian1

1Renal Transplant Unit, Austin Hospital, Melbourne, 2Department of Vascular Surgery, Austin Hospital, Melbourne

Aims: This study highlights the difficulties of performing open renal transplant surgery in patients who have had previous extra-peritoneal mesh hernia repair. Multiple papers have discussed hernia following renal transplantation; however, this is the first paper to discuss the difficulties of renal transplantation after mesh hernia repair.

Methods: Retrospective case series analysis of two patients who underwent renal transplantation at The Austin Hospital in Melbourne and had previously had extra-peritoneal mesh hernia repair. In one case the hernia repair was performed with synchronous Tenckhoff catheter placement for peritoneal dialysis.

Results: The first patient had a laparoscopic mesh inguinal hernia repair synchronously with Tenckhoff catheter insertion for peritoneal dialysis in early 2016. The renal transplant procedure was complicated by a difficult dissection that prolonged the operation; total operation time was 2h 42m. The second patient had bilateral laparoscopic mesh inguinal hernia repair in 2010. The renal transplant procedure was complicated by a difficult dissection and difficulty identifying the bladder; total operation time was 3h 1m. In both cases the hernia repair had been omitted from the medical history.

Conclusions: Extraperitoneal mesh hernia repair is a common operation. Our series demonstrates increased difficulty performing renal transplantation in patients with prior mesh hernia repairs. We could not demonstrate an increased risk of infection or other deleterious outcomes from prior mesh hernia repair. In our experience, it is important to enquire about previous hernia repairs. We recommend careful consideration before performing synchronous hernia repair and Tenckhoff catheter placement in patients who may be suitable for future transplantation.


LONG-TERM GRAFT AND PATIENT OUTCOMES OF KIDNEY TRANSPLANT RECIPIENTS WITH INCIDENT CANCER

LIM Wai1, CHAPMAN Jeremy2, and WONG Germaine2

1Sir Charles Gairdner Hospital, Perth, 2Westmead Hospital, Sydney

Aim: To determine the long-term graft and patient outcomes in kidney transplant recipients with incident cancers.

Methods: Using data from the Australia and New Zealand Dialysis and Transplant (ANZDATA) registry, we determined the cancer stage at diagnosis and estimated the risk of overall graft loss, death with a functioning graft and all-cause mortality in kidney transplant recipients with and without incident cancer using adjusted Cox regression analyses.
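
The adjusted Cox regression referred to here has the standard proportional hazards form (a general formulation, not the registry analysis code):

h(t \mid x) = h_0(t)\exp(\beta_1 x_1 + \dots + \beta_k x_k)

where h_0(t) is the unspecified baseline hazard and the adjusted hazard ratio for incident cancer is \mathrm{HR} = e^{\beta_{\mathrm{cancer}}}.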

Results: Of 12,859 transplant recipients, 1184 (9.2%) developed cancers after transplantation. Digestive and kidney/urinary tract cancers were the most common cancer types, although digestive and respiratory tract cancers were more aggressive, with 40% reported as advanced cancers at time of cancer diagnosis. Compared with recipients with no prior cancer, the adjusted hazard ratios (HR) of recipients with incident cancers were 1.39 (95%CI 1.25, 1.44) for overall graft loss, and 2.92 (95%CI 2.56, 3.33) for death with a functioning graft. For all-cause mortality, the adjusted HR of recipients with incident cancers was 1.95 (95%CI 1.73, 2.19), with over 80% of deaths attributed to cancer.

Conclusion: Incident cancer after kidney transplantation is a significant risk factor for death with a functioning graft and all-cause mortality, with the majority of deaths attributed to cancer. A greater understanding of the barriers to screening and of treatment approaches following cancer diagnosis may lead to improved survival in kidney transplant recipients with cancer.


EVOLUTION OF THERAPY FOR PTLD AFTER LUNG TRANSPLANTATION: SUCCESS AT LAST?

BENZIMRA Mark, MALOUF Monique, PEARSON Rebecca, RIGBY Amy, CALLIGARO Greg, ABELSON David, HAVRYK Adrian, PLIT Marshall, MOORE John, and GLANVILLE Allan

Lung Transplant Unit, St Vincent's Hospital, Sydney

Introduction: Lung transplant (LTX) recipients have a greater risk of post-transplant lymphoproliferative disease (PTLD) than other solid organ transplant recipients, and we have previously reported a high mortality rate in this population. We reviewed our 30-year experience and compared outcomes of early therapies with recently available monoclonal antibody therapies.

Methods: Single centre, retrospective audit of LTX patients 1986-2016.

Results: 35/994 (3.5%) LTX recipients (24 male; single:bilateral:heart-lung = 3:26:6; age 40 ± 16 years, mean ± SD) developed PTLD at 1165±1465 (range 18–5134) days post LTX. Indications were cystic fibrosis (n=14), emphysema (n=7), interstitial lung disease (n=6), congenital heart disease (n=5) and other (n=3). Monomorphic diffuse B cell lymphoma was the most common histopathological finding (72%); 15 patients had allograft involvement and 23 extra-allograft disease, including 2 with bone marrow and 3 with brain involvement. Of the 27 patients who died, 21 died from PTLD at 74±131 (range 0–490) days, 3 of them within 6 weeks despite initiation of monoclonal therapy, while 2 had achieved remission (375, 467 days). The other 6 died from other causes at 1274±1139 (range 113–2851) days. 8/35 survive in complete remission at 1842±1139 (range 250–4995) days, 6 of whom had anti-CD20 monoclonal antibody therapy in addition to reduction in immune suppression and antiviral prophylaxis.

Conclusion: In contrast to early therapies for PTLD after LTX, the transition to anti-CD20 monoclonal antibody therapy has been associated with complete remission, particularly in recipients presenting with nodal disease. We conclude that early recognition of the protean manifestations of PTLD after LTX and prompt diagnosis may be associated with enduring remission with current therapies.


POST-TRANSPLANT LYMPHOPROLIFERATIVE DISEASE IN CHILDHOOD AND ADULTHOOD RECIPIENTS OF A KIDNEY TRANSPLANT

FRANCIS Anna1, JOHNSON David2,3, CRAIG Jonathan1, and WONG Germaine1

1Centre for Kidney Research, University of Sydney, 2Department of Nephrology, Princess Alexandra Hospital, Brisbane, 3School of Medicine, University of Queensland at the Princess Alexandra Hospital

Background: Post-transplant lymphoproliferative disease (PTLD) is a well-known complication of kidney transplantation, but the long-term incidence, by age at transplant, is not well described.

Methods: Using the Australian and New Zealand Dialysis and Transplant Registry (1963–2013), the incidence of PTLD in all kidney transplant recipients was compared with population-based data using standardized incidence ratios (SIR), stratified into child (less than 20 years) or adult at the time of first transplant.
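
For reference, the standardized incidence ratio compares observed cases with those expected from population rates (a standard definition, not specific to this analysis):

\mathrm{SIR} = \frac{O}{E}, \qquad E = \sum_j n_j \lambda_j

where O is the observed number of PTLD cases, n_j the person-years of follow-up in age/sex/period stratum j, and \lambda_j the corresponding general-population incidence rate; confidence intervals are conventionally derived by treating O as Poisson-distributed.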

Results: Among 22,294 patients (92.2% adult, 60.1% male), 453 (2.0%) developed PTLD with 44 cases (9.7%) occurring in childhood recipients. The time to PTLD was similar for adult and childhood recipients, with a bimodal distribution peaking at 0–1 and 6–7 years (figure 1). The 25-year cumulative incidence of PTLD, adjusted for competing risk of death, was 3.2% (95% confidence intervals [CI] 2.9-3.5) for adult recipients and 3.5% (95%CI 2.5-4.7%) for child recipients. Childhood transplant recipients had a 30-fold increased risk compared to the general population (SIR 29.5, 95%CI 21.5-40.0), higher than for adult transplant recipients (SIR 8.5 95%CI 7.7-9.3). Transplantation during childhood (adjusted hazard ratio [aHR] 1.6, 95%CI 1.1-2.5), functioning graft (aHR 7.6, 95%CI 3.1-18.5), Epstein-Barr virus (EBV) seronegative status (aHR 2.6, 95%CI 1.9-3.6) and male gender (aHR 1.5, 95%CI 1.0-2.0) were associated with an increased risk of PTLD.

Conclusions: Lymphoproliferative disease in transplant recipients occurs at higher rates than in the general population, particularly in paediatric recipients. Childhood transplant recipients, males and EBV-negative patients are at increased risk of PTLD and it is more likely to occur during a functioning transplant.

FIGURE 1


RISK FACTORS FOR AND OUTCOMES OF DELAYED GRAFT FUNCTION IN LIVING DONOR KIDNEY TRANSPLANTATION

MOGULLA Manohar and CLAYTON Philip

Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital

Aims: Delayed graft function (DGF) in deceased donor kidney transplantation is associated with worse outcomes. DGF has been less well studied in living donor transplantation. We aimed to examine the risk factors for DGF, and associations between DGF and short- and long-term outcomes, in living donor kidney transplant recipients.

Methods: Using data from the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry, we included living donor kidney transplants performed in Australia and New Zealand over 2004–2014 and excluded paediatric recipients (n=405), pathological donors (n=88), grafts that failed in the first week (as a proxy for primary non-function) (n=30), and grafts with missing DGF data (n=42). We used multivariable logistic regression to identify risk factors for DGF and the association between DGF and rejection at 6 months; Cox proportional hazards models to examine the relationships between DGF and patient and graft survival; and linear regression to examine the association between DGF and eGFR at 1 year.
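
A hedged sketch of the three model types described, with hypothetical variable and file names (none of which come from the abstract or registry):

# Illustrative only: logistic, Cox and linear models for the DGF analyses.
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

df = pd.read_csv("living_donor_cohort.csv")  # hypothetical extract

# 1. Multivariable logistic regression: risk factors for DGF.
logit = smf.logit("dgf ~ right_kidney + C(primary_disease) + dialysis_years"
                  " + ischaemic_hours", data=df).fit()
print(logit.summary())  # exponentiate coefficients to obtain odds ratios

# 2. Cox proportional hazards: association of DGF with graft survival.
cph = CoxPHFitter()
cph.fit(df[["graft_years", "graft_failed", "dgf"]],
        duration_col="graft_years", event_col="graft_failed")
cph.print_summary()

# 3. Linear regression: association of DGF with eGFR at 1 year.
print(smf.ols("egfr_12m ~ dgf", data=df).fit().summary())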

Results: DGF occurred in 70 (2.3%) of 3098 transplants. Risk factors for DGF included right-sided kidney (odds ratio [OR] 1.92 [95% CI 1.11-3.31]), primary disease (highest risk in those with diabetic nephropathy, OR 3.33 [1.67-6.64]), increasing time on dialysis, and total ischaemic time (OR 1.10 per hour [1.02-1.19]). DGF was associated with an increased risk of rejection, worse patient and graft survival, and worse renal function at 1 year (table).

Conclusion: DGF is uncommon after living donor kidney transplantation but is associated with significantly worse outcomes. The only modifiable risk factor identified was total ischaemic time.

Table


EMPAGLIFLOZIN USE IN THE MANAGEMENT OF DIABETES FOLLOWING CARDIAC TRANSPLANTATION

MUIR Christopher1, GREENFIELD Jerry1, HAYWARD Christopher2, and MACDONALD Peter2

1Department of Endocrine and Metabolism, St Vincent's Hospital, Sydney, 2Department of Cardiology, St Vincent's Hospital, Sydney

Aims: Empagliflozin, a sodium-glucose cotransporter-2 (SGLT2) inhibitor, is a novel oral therapy for the treatment of diabetes. Significant cardiovascular and renal benefits related to empagliflozin use have been shown in the non-transplant setting, but safety and efficacy in transplant recipients have not been reported.

Methods: Retrospective single-centre audit of attendees at a heart transplant clinic between 01/01/2016 and 31/08/2016.

Results: 316 patients attended the clinic over the study period, of whom 106 (34%) had a diagnosis of diabetes mellitus. Of the diabetic patients, 19 were commenced on empagliflozin. Two patients commenced treatment prior to transplantation, with the remainder starting between 1 month and 22 years post cardiac transplantation. Treatment duration ranged from 2–21 months. Pre-post data with at least 3 months of follow-up were available in 16 (84%) patients. Empagliflozin use resulted in a significant reduction in weight, BMI, and systolic and diastolic blood pressure (p<0.05; Table 1). Improvement in weight and blood pressure occurred despite a significant mean reduction in frusemide dose (p=0.05). Renal function was unaffected by treatment with empagliflozin and there was a mean reduction in HbA1c of 0.6% (p>0.05).

Empagliflozin was well tolerated. One patient reported dizziness and another reported polyuria, but no serious adverse events or genitourinary infections were documented over the study period.

Conclusions: In patients with diabetes and a history of cardiac transplantation, empagliflozin was associated with significant reductions in weight and blood pressure. In our experience it was well tolerated and safe in the transplant setting, with over 147 cumulative months of treatment to date.

TABLE 1


Organ Donation and Ethics

DONOR REFERRALS AT INCREASED RISK FOR BLOOD BORNE VIRUSES IN NEW SOUTH WALES, 2010–2015

WALLER Karen1, ROSALES Brenda1, THOMSON Imogen2, WYBURN Kate3,2, O'LEARY Michael4,5, RAMACHANDRAN Vidiya6, RAWLINSON William6, and WEBSTER Angela C1,7

1School of Public Health, University of Sydney, 2School of Medicine, University of Sydney, 3Renal Unit, Royal Prince Alfred Hospital, Sydney, 4NSW Organ and Tissue Donation Service, 5Intensive Care Unit, Royal Prince Alfred Hospital, Sydney, 6Virology, SEALS, Prince of Wales Hospital, Sydney, 7Renal Transplant Unit, Westmead Hospital, Sydney

Aim: To investigate characteristics and outcomes of donor referrals at increased risk for hepatitis B (HBV), hepatitis C (HCV), and human immunodeficiency virus (HIV) in NSW.

Methods: We conducted a retrospective cohort study of the NSW Organ and Tissue Donation Service logs of all organ donor referrals from 2010–2015. We reviewed referrals with a blood borne virus (BBV) history or high-risk behaviours, comparing characteristics of potential donors (did not proceed), intended donors (abandoned before organ retrieval) and actual donors. Reasons for variability in donation were evaluated. Referrals were linked with SEALS Pathology BBV serology and nucleic acid testing (NAT) results.

Results: Of 2,998 donation referrals (2,331 potential, 667 intended/actual), 309 (10%) had a documented BBV history (82), high-risk behaviours for BBV (119), or both (108). The mean age of increased-risk referrals was lower (46.1 vs. 57.4 years, p<0.01, Student's t-test). The most common infection was HCV (160), and the most common high-risk behaviour was injecting drug use (191). Of the 309 increased-risk referrals, 249 remained potential donors and 60 became intended/actual donors. 103 referrals did not proceed primarily due to concern about BBV transmission risk. Despite these concerns, 84% of these 103 had no BBV testing requested. Table 1 shows results for the 16 of these 103 referrals that were tested.

Conclusions: Many referrals at increased risk for BBV did not proceed to transplantation due to concern about BBV transmission risk, without BBV testing being performed. Opportunities to increase donation rates among these referrals should be considered, particularly as new risk paradigms evolve with effective treatments for HCV.

TABLE 1


THE EFFECT OF RELIGION AND ETHNICITY ON DONOR ORGAN REFERRAL OUTCOMES IN NSW, 2010–2015

WALKER John1, WYBURN Kate2,3, O'LEARY Michael4,2,5, HEDLEY James6, ROSALES Brenda1, KELLY Patrick1, and WEBSTER Angela1,7

1School of Public Health, University of Sydney, 2Sydney Medical School, University of Sydney, 3Renal Unit, Royal Prince Alfred Hospital, Sydney, 4NSW Organ and Tissue Donation Service, 5Intensive Care Service, Royal Prince Alfred Hospital, Sydney, 6School of Public Health, Victorian Transplantation and Immunogenetics Service, Melbourne, 7Centre for Kidney Research, Westmead Hospital, Sydney

Aims: Australia's culturally and linguistically diverse population raises issues surrounding death, grieving and organ donation for multicultural communities. Qualitative literature indicates that these issues may impact the donation pathway. We aimed to quantify the association between religion/ethnicity and organ donation outcomes in the context of NSW's unique cultural demographics.

Methods: We reviewed NSW Organ and Tissue Donation Service referral logs for 2010–2015. We performed random effects logistic regressions (clustered by referral hospital) to determine whether religion and/or ethnicity were associated with donation outcomes, and at what stage of the donation pathway this impact was apparent.
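
The random effects logistic regression described here can be written as a random-intercept model (a standard formulation; the exact specification used is an assumption):

\operatorname{logit}\Pr(Y_{ij}=1) = x_{ij}^{\top}\beta + u_j, \qquad u_j \sim N(0, \sigma_u^2)

where i indexes referrals, j indexes referral hospitals, and the hospital-level intercept u_j absorbs within-hospital clustering.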

Results: Among all 2,975 referrals, religion and ethnicity were independently associated with progression to organ retrieval (“actual donor”) (p<0.001) and with family refusal (p<0.001). Family refusal of donation was more likely in referrals who identified as Chinese Asian, Indigenous Australian, Southern European or Maori compared to North-West European (see Table 1). Similarly, family refusal was more likely in referrals who identified as Buddhist compared to Christian. Of 2,306 referrals without family refusal (78%), both religion and ethnicity were associated with progression to “intended donor” (medical suitability) (p<0.001). However, of 668 intended donors (22%), neither religion (p=0.23) nor ethnicity (p=0.57) was associated with progression to actual donor.

Conclusions: This study provides evidence of a divergence in donation outcomes for referred donors from ethnically and religiously diverse backgrounds in NSW. These results support the practice of tailoring a culturally sensitive and inclusive approach to donation in both policy and clinical practice.

TABLE 1


OBESITY AND ACCESS TO THE DECEASED DONOR KIDNEY WAIT LIST AND TRANSPLANTATION

LADHANI Maleeka1,2,3, CRAIG Jonathan C2,1,4, and WONG Germaine2,1,5

1School of Public Health, University of Sydney, 2Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 3Renal Unit, Other, 4Department of Renal Medicine, The Children's Hospital at Westmead, Sydney, 5Centre for Transplant and Renal Research, Westmead Hospital, Sydney

Aims: Obesity at the time of transplantation is not a risk factor for adverse graft outcomes, but obese patients may be less likely to be listed on the deceased donor waiting list and subsequently transplanted. The aim was to determine the association between obesity and wait-listing and access to transplantation in potential transplant candidates on dialysis.

Methods: Using data from the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry, we assessed the association between obesity (by BMI category) and time to wait-listing and time to deceased-donor transplantation in adult dialysis patients (18–70 years) in Australia using adjusted Cox proportional hazard models (2007–2014).

Results: Of 11,688 patients included, 4,432 (37.9%) were obese. Over a follow-up period of 26,255 patient-years (for wait-listing) and 31,966 patient-years (for transplantation), 3,529 were listed (28.5% obese) and 1,695 transplanted (28.9% obese). Being male, being on peritoneal dialysis or having polycystic kidney disease was associated with an increased likelihood of listing and transplantation. Co-morbidities were associated with a decreased likelihood of both listing and transplantation. The likelihood of being wait-listed was 23% lower in those with obesity compared to those with normal weight [adjusted HR 0.77 (95%CI 0.71-0.84), p<0.001]. The likelihood of being transplanted was 20% lower in those with obesity compared to those with normal weight [adjusted HR 0.80 (95%CI 0.70-0.91), p<0.001].

Conclusions: Dialysis patients with obesity are 23% less likely to be wait-listed and, once listed, 20% less likely to be transplanted than those with normal weight. Further examination of listing practices may be required.

Figure


THE LEGALITY OF KIDNEY PAIRED DONATION IN AUSTRALIA, CANADA, AND THE UNITED STATES: AN ARRAY OF LEGAL INTERPRETATIONS AND APPROACHES TO THE “VALUABLE CONSIDERATION” PROBLEM

TOEWS Maeghan1,2, GIANCASPRO Mark1,3, RICHARDS Bernadette1,4, and FERRARI Paolo5,6

1University of Adelaide Law School, University of Adelaide, 2Canadian National Transplant Research Program, 3Supreme Court of South Australia, 4Research Unit for the Study of Society, Law and Religion, University of Adelaide, 5Department of Nephrology, Prince of Wales Hospital, Sydney, 6Clinical School, University of New South Wales, Sydney

Aims: The legality of kidney paired donation (KPD) has been questioned in jurisdictions that prohibit exchanging organs for “valuable consideration”. The legal uncertainty created by this term has been interpreted and addressed in distinct ways in Australia, Canada, and the United States. To better understand these differences and to inform future donation efforts that may face similar legal uncertainty, this work legally analyses the meaning of “valuable consideration” in this context and compares the legal approaches taken to KPD in these countries.

Methods: Legal scholarship methods, including statutory interpretation, comparative legal analysis, and a review of jurisprudence, were undertaken to understand the legal meaning of “valuable consideration” in each jurisdiction.

Results: Each jurisdiction has unique “valuable consideration” provisions (Table 1). The common law meaning of “consideration” has not historically been limited to things of monetary value and likely encompasses KPD. As a result, legislative amendments were passed in Australia and the U.S. to permit KPD programs to operate, while Canada’s KPD program operates in defiance of the law. The Australian amendment allows for individual exemptions from the “valuable consideration” prohibition. This approach has created an administrative burden for KPD, but provides greater legal certainty than in Canada and greater flexibility to accommodate future policy developments than in the U.S.

Conclusion: The law needs to keep pace with ethically sound developments in donation practice and policy. To this end, future legal reform may be necessary in the U.S. and Canada. The Australian approach serves as a useful example in this regard.

TABLE 1

Back to Top | Article Outline

EARLY IMPACT OF HCV DIRECT ANTIVIRAL AGENTS (DAAS) IN THE LIVER ORGAN DONATION SPACE.

MCCAUGHAN G.W1, ADAMS L2, CHEN F1, CRAWFORD M1, STRASSER S1, and JEFFREY G2

1Australian National Liver Transplantation Unit, Royal Prince Alfred Hospital, Sydney, 2WA Liver & Kidney Transplant Service, Sir Charles Gairdner Hospital, Perth

New direct-acting antiviral (DAA) therapies for HCV infection achieve cure rates of up to 99%. Uptake of these therapies in Australia since their introduction on March 1st 2016 has been remarkable, with over 25,000 patients treated by July 1st 2016. We have encountered three situations in which the availability of effective DAA therapy influenced our decision making at the time of an organ donation offer.

Case 1: A decision was made to transplant an HCV antibody-positive, PCR-positive liver from a brain-dead (BD) organ donor into a non-HCV-infected recipient, a permanent inpatient with alcoholic cirrhosis complicated by hepatorenal syndrome and hyponatraemia. Recipient HCV infection was detected on day 1 (log 7 viral load, genotype 1b). DAA therapy was commenced on day 39 post-transplant, and the patient is now HCV RNA negative at week 8 on treatment.

Case 2: A liver from an HCV antibody-positive, PCR-positive BD donor was transplanted into a recipient with HCV cirrhosis and HCC who had become HCV RNA negative at 7 weeks of DAA therapy. Because of early post-operative renal injury, the recipient will recommence treatment at 8 weeks post-transplant.

Case 3: A liver from an ABO-incompatible, HCV antibody-positive, PCR-negative donor was transplanted into an HCV-negative recipient with fulminant hepatic failure. The donor had previously been HCV RNA positive but had been cured with DAA therapy more than 3 months earlier. There is no evidence of HCV infection in the recipient at week 3 post-transplant without DAA therapy.

Conclusion: The introduction of the new DAAs for HCV infection could have widespread implications for liver (and also for non liver) organ donation. Based on these experiences a new TSANZ guideline for the use of donor organs from HCV infected patients is under development.

Back to Top | Article Outline

THE PSYCHOSOCIAL IMPACT OF DONATING HAEMATOPOIETIC STEM CELLS ON ADULT SIBLING DONORS

ZOMERDIJK Nienke1,2,3, TURNER Jane4,5, HILL Geoff3,1,2, and GOTTLIEB David6

1Department of Bone Marrow Transplantation, Royal Brisbane Hospital, 2School of Medicine, University of Queensland, Brisbane, 3Bone Marrow Transplantation Laboratory, Queensland Institute of Medical Research, Brisbane, 4Discipline of Psychiatry, University of Queensland, Brisbane, 5Department of Cancer Care, Royal Brisbane Hospital, 6Department of Bone Marrow Transplantation, Westmead Hospital, Sydney

Background: While World Marrow Donor Association Standards ensure consistency in the assessment and care of unrelated donors, no such criteria exist for sibling donors. Unlike unrelated donors who are often unaware of their transplant recipient’s outcome, sibling donors are actively involved in the transplant process and witness first-hand the changes in their sibling’s health. Their lived experience is thus likely to be very different from that of unrelated donors and it is important to address their needs.

Aim: To identify potential predictors of psychosocial distress and unmet needs of adult sibling donors, to inform guidelines and the development of a web-based intervention.

Methods: Participants are adults undergoing HSC donation for a sibling recipient at the RBWH and Westmead Hospital, Sydney. Donors complete 3 interviews and provide 3 samples of saliva (as a biomarker of stress): (1) 2 weeks pre-HSC collection; (2) day of HSC collection; (3) 30 days post-HSC collection. An interview with BMT Coordinators will explore aspects of psychosocial care and formalised strategies for adverse donor outcomes.

Results: As research into adult sibling HSC donors is limited, findings from sibling bone marrow donors were used to make predictions:

1. Perceived adequacy of preparation, perceived emotional support and closeness of the relationship with the recipient are associated with post-donation reactions such as concern for one's own health, physical pain/discomfort, guilt, responsibility and stress.

2. Donors who give a poor evaluation of the recipient’s health report negative reactions throughout the process.

3. Donor reactions such as stress/physical pain correlate with salivary α-amylase levels throughout the process.

Back to Top | Article Outline

INCREASED LIVER NON-USE: THE IMPLICATIONS OF THE SHIFT TOWARDS DONATION AFTER CIRCULATORY DEATH IN AUSTRALIA

FORREST Elizabeth1,2, REILING Janske3,4,5,6, BRIDLE KR3,7, BRITTON LJ3,8, SANTRAMPURWALA N3,9, CRAWFORD DHG3,8, DEJONG CHC10, and FAWCETT J11,1,12

1Queensland Liver Transplant Service, Princess Alexandra Hospital, Brisbane, 2Resident Medical Officer, Gold Coast Hospital and Health Service, 3School of Medicine, University of Queensland, Brisbane, 4Australian and New Zealand Organ Donation, ANZDATA, 5Gallipoli Medical Research Institute, Greenslopes Private Hospital, 6Department of Surgery, NUTRIM - School for Nutrition and Translational Research in Metabolism, Maastricht University, the Netherlands, 7Gallipoli Medical Research Institute, Other, 8Other, Other, 9Other, 10Department of Surgery, NUTRIM - School of Nutrition and Translational Research in Metabolism, Maastricht University, Maastricht, the Netherlands, 11University of Queensland, Brisbane, 12PA Research Foundation, Princess Alexandra Hospital, Brisbane

Aims: The number of donor livers available is the primary limiting factor to Australian transplantation rates. Annually, 9% of patients on the waitlist succumb to their disease. The aim of this study was to evaluate the impact of donation after circulatory death (DCD) and other factors associated with organ quality on liver utilisation rates in Australia.

Methods: Data were retrieved from the ANZOD Registry on Australian organ donors who donated at least one organ between 2005 and 2014. Temporal trends in donor characteristics were analysed, and logistic regression was used to assess associations with liver non-use.

Results: Between 2005 and 2014, the annual number of organ donors increased from 175 to 344, and the number of livers available for transplantation increased by 71%. Over the same period, the percentage of livers deemed unsuitable for transplantation rose from 24% in 2005 to 41% in 2014 (p<0.001, Figure 1). On multivariable analysis, liver non-use was most strongly associated with DCD donation (OR 25.88, 95% CI 18.84–35.56, p<0.001), followed by donor age, obesity and diabetes.
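
A minimal sketch of the multivariable logistic regression behind an odds ratio like the one above, run on simulated donor records; the field names, prevalences and effect sizes are invented for illustration and are not ANZOD variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
dcd = rng.integers(0, 2, n)                    # donation after circulatory death
age_decade = rng.integers(15, 80, n) / 10.0    # donor age in decades
obese = rng.integers(0, 2, n)

# Simulate non-use with a strong DCD effect, loosely echoing the reported OR.
logit_p = -3.0 + np.log(25.9) * dcd + 0.15 * age_decade + 0.5 * obese
non_use = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

donors = pd.DataFrame({"non_use": non_use, "dcd": dcd,
                       "age_decade": age_decade, "obese": obese})
fit = smf.logit("non_use ~ dcd + age_decade + obese", data=donors).fit(disp=0)
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% CIs on the OR scale
```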

Conclusions: This study demonstrated an increase in overall organ donation rates in Australia and a significant decrease in the percentage of livers used, with DCD donation as the most important independent risk factor for liver non-use. Machine perfusion to assess graft function prior to transplantation has been proposed. This method requires further evaluation to potentially increase the number of livers available for transplant in Australia.

FIGURE 1

Back to Top | Article Outline

COMORBIDITIES INFLUENCING OUTCOME OF ORGAN DONOR REFERRALS IN NEW SOUTH WALES (NSW): COHORT STUDY 2010–2015

THOMSON Imogen1, ROSALES Brenda2, KELLY Patrick2, WALLER Karen1, WYBURN Kate3,1, O'LEARY Michael1,4,5, and WEBSTER Angela1,2,6,7

1School of Medicine, University of Sydney, 2School of Public Health, University of Sydney, 3Renal Unit, Royal Prince Alfred Hospital, Sydney, 4Intensive Care Service, Royal Prince Alfred Hospital, Sydney, 5NSW Organ and Tissue Donation Service, 6Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 7Centre for Kidney Research, The Children's Hospital at Westmead, Sydney

Aim: To identify, early in the referral process, potential organ donors who are unlikely to donate, and to quantify the impact of comorbidities on referral outcome.

Methods: We reviewed NSW Organ and Tissue Donation Service referral logs 2010–2015, considering the presence or absence of cardiac disease, vascular disease, chronic liver disease (CLD), chronic kidney disease (CKD), respiratory disease, cerebrovascular disease (CVD), hypertension, hyperlipidaemia, diabetes, malignancy, age ≥65 and hospital location. KDRI and Charlson scores were calculated for 2014–2015 referrals. Outcomes were donating (donors) and not donating (non-donors). Non-donors whose families declined to consent for donation were excluded. Logistic regression (odds ratio (OR), 95% confidence intervals (95%CI)) was used to identify comorbidities significantly influencing referral outcome, and to predict probability of becoming a donor given comorbidities.

Results: Of 2977 referrals, family refusals excluded 669 (22%) non-donors, leaving 2310 (78%) referrals in our analysis, of whom 668 (29%) donated and 1642 (71%) did not. Comorbidities (Table 1), in addition to non-metropolitan hospital location and absence of hyperlipidaemia, were significantly (P<0.01) associated with a referral not donating. Of these, malignancy (OR 3.91, 95%CI 2.71-5.66) and CKD (OR 3.45, 95%CI 1.99-5.98) had the greatest impact. Our model (area under the receiver-operating-characteristic curve (AUC) 0.74) was superior to the Charlson (AUC 0.63) and KDRI (AUC 0.52) scores for predicting referral outcome.
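
The AUC comparison above can be reproduced in outline as follows. This is a hedged sketch on simulated referrals, not the study's code: the comorbidity fields, coefficients and the Charlson-style score are assumptions.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 800
malignancy = rng.integers(0, 2, n)
ckd = rng.integers(0, 2, n)
charlson = rng.integers(0, 10, n)   # hypothetical summary comorbidity score

# Simulate "did not donate", with malignancy and CKD as the strongest factors.
p = 1 / (1 + np.exp(-(-0.5 + 1.4 * malignancy + 1.2 * ckd + 0.05 * charlson)))
non_donor = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([malignancy, ckd, charlson]))
model = sm.Logit(non_donor, X).fit(disp=0)

auc_model = roc_auc_score(non_donor, model.predict(X))     # fitted multivariable model
auc_charlson = roc_auc_score(non_donor, charlson)          # single summary score
print(f"model AUC={auc_model:.2f}, Charlson-only AUC={auc_charlson:.2f}")
```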

Conclusions: Key elements of this model could be used to identify referrals unlikely to become donors, and improve the efficiency of the donor referral process in NSW.

TABLE 1

Back to Top | Article Outline

DOES SPLIT RENAL FUNCTION CORRELATE WITH KIDNEY SIZE AND PREDICTED POST DONATION FUNCTION IN LIVING KIDNEY DONATION? A SINGLE CENTRE PROSPECTIVE STUDY.

ABROL Nitin1, ASOKAN Gayatri1, RUSSELL Christine H2, OLAKKENGIL Santosh1, and BHATTACHARJYA Shantanu1

1Renal Transplant Unit, Royal Adelaide Hospital, 2Renal Unit, Royal Adelaide Hospital

Aim: The aims of this study were to assess whether the difference in split renal function correlated with the difference in kidney size reported on CT, and whether the predicted post-donation eGFR (calculated as the percentage split of eGFR (MDRD)) correlated positively with the actual post-donation eGFR.

Methods: All voluntary kidney donors from 2014 to November 2016 were included in this study. eGFR (MDRD) and GFR with split function (measured by Tc-DTPA nuclear medicine scan) were obtained at presentation. Donors who were medically fit were further assessed by CT renal angiography for formal anatomical assessment. Following donation, donors were reviewed at 1 week and 4 weeks with calculation of eGFR using the MDRD formula. For the purposes of this study, serial eGFR was used as the marker of functional outcome at post-operative day 1 and at 4 weeks.
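
For context, the 4-variable MDRD equation used here can be sketched as below (the IDMS-traceable 175 coefficient, with serum creatinine converted from µmol/L to mg/dL by dividing by 88.4). The example creatinine, age and 52% split are hypothetical values for illustration only.

```python
def egfr_mdrd(creatinine_umol_l: float, age_years: float,
              female: bool, black: bool = False) -> float:
    """Estimated GFR in mL/min/1.73 m^2 via the 4-variable MDRD equation."""
    scr_mg_dl = creatinine_umol_l / 88.4
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Predicted post-donation eGFR as a split-function share of the pre-donation
# value, e.g. donating the kidney that carries 52% of total function.
pre = egfr_mdrd(80.0, 45, female=True)
predicted_post = pre * (1 - 0.52)   # remaining kidney contributes 48%
print(round(pre, 1), round(predicted_post, 1))
```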

Results: Prospectively collected data from 51 consecutive donors between 2014 and 2016 were reviewed. The mean difference in size between the two kidneys was 0.23 cm (left > right) and the mean difference in split function was 1.6% (left > right). There was a very weak correlation between the size difference and the difference in split function (r = 0.2549; r2 = 0.065). There was a weak positive correlation between the actual early post-operative remaining renal function and the predicted theoretical remaining renal function (R = 0.7262; R2 = 0.5274) (Figure 1).

Conclusion: This study demonstrates that split function assessment has poor correlation to kidney size and is of limited value in predicting post donation function in the longer term. Further studies with larger sample size are required to assess the true utility of split function assessment in living kidney donor assessment.

FIGURE 1

Back to Top | Article Outline

UTILITY OF A NEW END STAGE KIDNEY DISEASE (ESKD) RISK CALCULATOR IN LIVING KIDNEY DONOR CANDIDATE (LKDC) ASSESSMENT

LEE Darren1,2, MANZOOR Momena1, WHITLAM John2, HARLEY Geoff2, SANDIFORD Megan2, GIBSON Charlotte1, CHOY Suet-wan1, COOK Natasha2, MCMAHON Lawrence1, and ROBERTS Matthew1

1Department of Renal Medicine, Eastern Health, Melbourne, 2Department of Nephrology, Austin Hospital, Melbourne

Aims: The Kidney Disease Improving Global Outcomes guideline draft recommends assessment of LKDCs based on baseline lifetime ESKD risk estimated by a new risk calculator (Grams ME et al NEJM 2016). We hypothesised that estimated ESKD risk was higher in declined compared with accepted LKDCs.

Methods: Using the risk calculator (transplantmodels.com/esrdrisk), estimated baseline ESKD risk without donation was retrospectively compared in 89 accepted and 59 declined LKDCs from 2 centres between 2007 and 2015.

Results: Demographic differences between declined and accepted LKDCs are detailed in the table. Compared with accepted LKDCs, baseline ESKD risk at 15 years was higher in declined LKDCs (median 0.14% (IQR 0.09-0.31%) versus 0.10% (0.07-0.14%); P<0.001) and more likely to be >0.5% (9% versus 0%; P<0.001). Baseline lifetime ESKD risk was not significantly higher (0.39% (0.23-0.80%) versus 0.35% (0.22-0.56%)); however, it was more likely to be >1% (15% versus 1%; P<0.01). Reasons to decline were fully accounted for by the risk calculator in only 39% of declined LKDCs, and lifetime ESKD risk in this subgroup was higher than in accepted LKDCs (0.56% (0.24-1.12%) versus 0.35% (0.22-0.56%); P<0.05). Common reasons to decline were metabolic risk (36%) and inadequate GFR (32%).
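
As an illustration of comparing skewed risk estimates such as the medians and IQRs above, the sketch below applies a Mann-Whitney U test to invented 15-year risk values. The abstract does not name the test it used, so this is an assumed, conventional choice for non-normal data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical baseline 15-year ESKD risks (%) for two candidate groups.
accepted = np.array([0.07, 0.08, 0.10, 0.10, 0.12, 0.14, 0.09])
declined = np.array([0.09, 0.12, 0.14, 0.20, 0.31, 0.45, 0.13])

stat, p = mannwhitneyu(declined, accepted, alternative="two-sided")
print(f"median declined={np.median(declined):.2f}%, "
      f"accepted={np.median(accepted):.2f}%, p={p:.3f}")
```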

Conclusions: Declined LKDCs had higher baseline 15-year, but not lifetime, ESKD risk. Decision-making was likely influenced by the more precise medium-term, rather than lifetime, ESKD risk estimates. However, virtually no LKDC with a baseline lifetime ESKD risk >1% was accepted. Despite its limitations, this risk calculator may complement, but cannot replace, the complexity of LKDC assessment.

Table

Back to Top | Article Outline

END STAGE KIDNEY DISEASE (ESKD) RISK PROFILE IN LIVING KIDNEY DONORS (LKDS) FOR ADULT VERSUS PAEDIATRIC KIDNEY TRANSPLANT RECIPIENTS (KTRS) IN AUSTRALIA AND NEW ZEALAND

LEE Darren1,2, COOK Natasha1, WHITLAM John1, WALKER Amanda3, ROBERTS Matthew2, IERINO Frank4, and KAUSMAN Joshua3

1Department of Nephrology, Austin Hospital, Melbourne, 2Department of Renal Medicine, Eastern Health, Melbourne, 3Department of Nephrology, Royal Children's Hospital, Melbourne, 4Department of Nephrology, St Vincent's Hospital, Melbourne

Aims: LKDs, mostly parents, have a heightened motivation to benefit their paediatric KTRs, given the recognised neurocognitive, growth, psychosocial, cardiovascular and survival benefits compared with dialysis. We hypothesised that younger LKDs with higher ESKD risk were more often accepted for paediatric KTRs.

Methods: Using a risk calculator (Grams ME et al NEJM 2016), the estimated baseline ESKD risk without donation of LKDs for paediatric versus adult KTRs (2005–2014) from ANZDATA was retrospectively compared. Only 50 of 298 and 425 of 3362 LKDs for paediatric and adult KTRs respectively (post-2009) were analysed due to unavailability of urine albumin/creatinine ratios.

Results: Compared with adult KTRs, LKDs for paediatric KTRs were significantly younger (median age 44 (IQR 36–50) versus 53 (44–60); P<0.001) and more likely to be parents (88% vs 23%; P<0.001). Baseline 15-year ESKD risk was lower (0.08% (0.05-0.10%) versus 0.11% (0.07-0.17%); P=0.001) while lifetime ESKD risk was higher (0.42% (0.33-0.64%) versus 0.37% (0.23-0.58%); P<0.05). The 90th, 95th and 98th percentiles for lifetime ESKD risk estimates in LKDs for paediatric versus adult KTRs were 1.38% versus 0.93%, 1.71% versus 1.22% and 2.12% versus 1.85%, respectively. The proportion of LKDs with lifetime ESKD risk >1% (12% versus 8%) and >2% (2% versus 2%) was, however, similar.

Conclusions: LKDs for paediatric KTRs have lower 15-year but higher lifetime baseline ESKD risk compared with adult KTRs, primarily driven by younger LKD parents. However, the absolute risk difference is minor. These data may improve informed consent from future LKD candidates for both adult and paediatric KTRs.

Table

Back to Top | Article Outline

WHAT SHOULD BE THE END STAGE KIDNEY DISEASE (ESKD) RISK THRESHOLD TO DECLINE A LIVING KIDNEY DONOR CANDIDATE (LKDC)?

LEE Darren1,2, ROBERTS Matthew1, MOUNT Peter2, COOK Natasha2, SOMERVILLE Christine3, HOLMES Christopher4, GOODMAN David5, and IERINO Frank5

1Department of Renal Medicine, Eastern Health, Melbourne, 2Department of Nephrology, Austin Hospital, Melbourne, 3Department of Renal Medicine, Barwon Health, VIC, 4Department of Renal Medicine, Bendigo Health, VIC, 5Department of Nephrology, St Vincent's Hospital, Melbourne

Aims: The decision to accept or decline LKDCs often lacks consistency and transparency. Kidney Disease Improving Global Outcomes (KDIGO) recently released a guideline draft, which recommends that each transplant centre set its own threshold based on the lifetime ESKD risk of LKDCs. A validated risk calculator (transplantmodels.com/esrdrisk) is now available to estimate the baseline lifetime ESKD risk without donation. We aimed to determine an agreed threshold for declining LKDCs.

Methods: Nephrologists, transplant surgeons and advanced trainees from 5 renal units were invited to self-complete an anonymous online survey regarding their thresholds to decline LKDCs.

Results: 31 of 61 invitees responded. The most common age group was 35–44 years (42%), followed by 45–54 (26%). The most common lifetime post-donation ESKD risk threshold was 1% (48%), followed by 2% (21%), although 14% would accept any risk with informed consent. There was a lack of consensus on the baseline lifetime ESKD risk threshold without donation, with responses of 1% (24%), 2% (17%), 0.5% (17%), 0.2% (14%) and any risk being acceptable (10%). 54% considered 30 years the minimum acceptable donor age, while 11% would consider adults of any age competent to give informed consent. Factors influencing respondents’ thresholds included the LKDC/recipient relationship, potential psychosocial benefit to the LKDC, degree of benefit to the recipient, and the LKDC’s commitment.

Conclusions: This study demonstrates a lack of consensus in the nephrology community on an ESKD risk threshold for declining LKDCs, as recommended by KDIGO. The highly altruistic and complex nature of kidney donation will continue to influence the threshold applied to individual LKDCs.

Back to Top | Article Outline

OUTCOMES FOLLOWING NEONATAL ORGAN DONATION: A SYSTEMATIC REVIEW

VAGG DJ1,2, LAURENCE J3,2, WYBURN K4,5, CAVAZZONI E6,5, HAWTHORNE W7,2, HAMEED A1,2, and PLEASS H1,8

1Department of Surgery, Westmead Hospital, Sydney, 2Discipline of Surgery, Sydney Medical School, University of Sydney, 3Australian National Liver Transplantation Unit, Royal Prince Alfred Hospital, Sydney, 4Department of Medicine, Royal Prince Alfred Hospital, Sydney, 5Discipline of Medicine, University of Sydney, 6Department of Medicine, The Children's Hospital at Westmead, Sydney, 7National Pancreas Transplant Unit, Westmead Hospital, Sydney, 8Discipline of Surgery, Sydney Medical School, Westmead Hospital, Sydney

Significant disparity exists between the number of patients requiring transplantation and the number of available organ donors. Current strategies to address the paucity of donors have focussed on expanding donor criteria in paediatric and elderly populations. Despite this, potential allografts from neonatal donors (i.e., aged between 37 weeks and 12 months) remain an under-utilised resource; at present, fewer than 1% of organ donors are younger than one year of age. As a result, robust outcome data following neonatal donation are lacking, which creates a significant barrier to acceptance. The aim of this study was to systematically review published outcome data following neonatal organ donation.

Data were obtained from the MEDLINE, EMBASE and PubMed databases. Twelve neonatal donors were identified; three donated after circulatory determination of death (DCD). The youngest reported neonatal liver donor was 10 days old and weighed 2.9 kilograms. There was immediate graft function with correction of coagulopathy, and liver function remained normal at 6-year follow-up. The youngest neonatal kidney donor was 6 days old (N=11; range 6 days to 11 months) and weighed 3.1 kilograms (range 3.1–7.5kg). All recipients demonstrated improvement in creatinine and eGFR at both one-month and one-year follow-up. Comparative data demonstrated no significant difference in allograft survival compared with kidneys obtained from paediatric donors (25–48 months). One perinephric haematoma required evacuation; there were no graft thromboses.

These data demonstrate the feasibility of transplantation following neonatal organ donation. We propose that organs from carefully selected neonatal donors be considered for transplantation.

Back to Top | Article Outline

THE PERSPECTIVES OF HAEMATOPOIETIC STEM CELL TRANSPLANT RECIPIENTS: A SYSTEMATIC REVIEW OF QUALITATIVE STUDIES

JAMIESON Nathan1,2, MANERA Karine1,2, CHAPMAN Jeremy1,3, CRAIG Jonathan1,2, GOTTLIEB David4,5,6,7, HANSON Camilla1,2, JU Angela1,2, SHAW Peter8,9, and TONG Allison1,2

1Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 2School of Public Health, University of Sydney, 3Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 4School of Medicine, University of Sydney, 5BMT Cell Therapies, The Westmead Institute for Medical Research, 6Westmead Institute of Cancer Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 7Blood and Marrow Transplant Service, Department of Haematology, Westmead Hospital, Sydney, 8Department of Oncology, The Children's Hospital at Westmead, Sydney, 9Discipline of Paediatrics and Child Health, University of Sydney

Context: Haematopoietic stem cell transplantation (HSCT) is increasingly used as a life-saving treatment for a number of malignant and non-malignant conditions. However, HSCT recipients may experience ongoing physical and psychosocial challenges that limit overall treatment benefit.

Aims: We aimed to describe the experiences and perspectives of HSCT recipients.

Methods: MEDLINE, Embase, PsycINFO, CINAHL were searched from database inception to October 2016. We used thematic synthesis to analyse the findings.

Results: Seventy-three studies involving 2296 participants aged from 18 to 76 years across 17 countries were included. We identified five themes: incapacitating frailty and fatigue (debilitating infirmity, disconcerting concentration deficits, lost sexuality and fertility, fragile immunity, declarative appearance, mourning the past self, burden of dependence); financial vulnerability and isolation (overwhelming financial strain, stigma and job insecurity, disconnected from society); ubiquitous uncertainty (relentless fear of relapse, graft as a threat, harbouring survivor guilt, struck by unexpected consequences); taking ownership of health (optimising health, bodily alertness, gaining reassurance in cell counts, keeping informed and involved, trusting in healthcare providers); and reshaping identity (accepting patienthood, reflective gratitude, developing mental fortitude, valuing the strengthened self, staying future focused, relinquishing normality, discovering peer support).

Conclusions: HSCT recipients report physical and mental fatigue in addition to lost social and financial opportunities. These experiences occur in the context of ongoing concerns about the potential for disease relapse and graft-associated complications. Patient education and psychosocial services that support vocational and financial well-being, foster realistic outcome expectations, and encourage effective self-management post-transplant may optimise outcomes in this population.

Back to Top | Article Outline

TRANSPLANT PROFESSIONALS’ ATTITUDES AND APPROACHES TO THE LIVING KIDNEY DONOR-RECIPIENT RELATIONSHIP: INTERVIEW STUDY

RALPH Angelique F1,2,3, BUTOW Phyllis2,4,5, CRAIG Jonathan C1,3, CHAPMAN Jeremy R6, GILL John7, KANELLIS John8,9,10, and TONG Allison1,3

1School of Public Health, University of Sydney, 2School of Psychology, University of Sydney, 3Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 4Psycho-oncology Co-operative Research Group, University of Sydney, 5Centre for Medical Psychology & Evidence-based Decision-making, University of Sydney, 6Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 7Division of Nephrology, University of British Columbia, 8Department of Nephrology, Monash University, Melbourne, 9Monash Health and Centre for Inflammatory Diseases, Monash University, Melbourne, 10Department of Medicine, Monash University, Melbourne

Background: Assessment of the donor-recipient relationship is recommended by international guidelines to prevent undue coercion and ensure realistic expectations. We aimed to describe attitudes and experiences of transplant professionals on the donor-recipient relationship in living kidney donation.

Methods: Semi-structured interviews were conducted with 53 transplant professionals (nephrologists, surgeons, coordinators, social workers, psychiatrists and psychologists). Transcripts were analysed thematically.

Results: Four themes were identified: protecting vulnerability (ensuring genuine motivation, uncovering precarious dynamics, shared accountability, necessity of psychosocial input, trusting emotional bonds, overriding emotional decision making); safeguarding against coercion (navigating power dynamics, wary of ethical boundaries, managing opacity, understanding interpersonal dynamics); fostering the bond (hoping for strengthened connection, giving equitable attention to donors and recipients); and mitigating against relationship strains (preempting conflict, acknowledging relationship change, ensuring realistic expectations).

Conclusion: Transplant professionals regard the donor-recipient relationship as the driving moral imperative of donation and thus believe that assessing it is ethically necessary to minimise the risk of undue coercion and to protect donors and recipients. However, some feel challenged in disentangling altruism and voluntariness from the potential pressures of familial and societal duty that they believe donors may not disclose, and question the level of justifiable medical paternalism.

Back to Top | Article Outline

Sensitisation, Antibodies, ABO Incompatible Transplantation

ASSOCIATION BETWEEN EPLET HLA-MISMATCHES, DE NOVO DSA PRODUCTION AND RISK OF REJECTION IN PAEDIATRIC KIDNEY TRANSPLANT RECIPIENTS

SHARMA Ankit1,2, TAVERNITI Anne1, LEWIS Joshua1,3,2, ALEXANDER Steve1, LIM Wai3,4, CRAIG Jonathan1,2, and WONG Germaine1,2,5

1Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 2School of Public Health, University of Sydney, 3School of Medicine & Pharmacology, University of Western Australia, Perth, 4Department of Renal Medicine, Sir Charles Gairdner Hospital, Perth, 5Centre for Transplant and Renal Research, Westmead Hospital, Sydney

Aim: To determine the association between number of eplet HLA-mismatches at transplantation with the development of de novo DSA (dnDSA) and antibody mediated rejection (AMR) in paediatric kidney transplant recipients.

Methods: Eplet HLA mismatches for each recipient-donor pair at HLA Class I and II loci were calculated using HLAMatchmaker. Adjusted logistic regression analyses were conducted to determine the association between the number of total and class-specific eplet HLA mismatches, dnDSA production and AMR.

Results: A total of 43 paediatric recipients (31 (72%) living and 12 (28%) deceased donors) who received their first allograft between November 2005 and March 2015 in NSW were included. The mean (±SD) age at transplantation was 9±5 years and mean follow-up after transplantation was 40±26 months. The mean Class I and Class II eplet mismatches were 13±7 and 23±15, respectively. Overall, 18 (42%) patients developed dnDSA [Class I (n=3, 17%), Class II (n=6, 33%), and both (n=9, 50%)]. An increased risk of dnDSA production was observed with increasing numbers of Class I (per 10-eplet increase, adjusted odds ratio (aOR) 4.8, 95%CI 1.1-21.6, P=0.041) and Class II eplet mismatches (aOR 1.8, 95%CI 1.0-3.2, P=0.05). Among those with biopsy data (n=27, 63%), the presence of Class I and/or Class II dnDSA was associated with an increased risk of AMR (11.1% vs. 77.8%, P=0.001 and 12.5% vs. 63.6%, P=0.011, respectively).
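
A minimal sketch of how an odds ratio "per 10-eplet increase" can be estimated by rescaling the predictor before fitting. The eplet counts in the study came from HLAMatchmaker; everything below, including the coefficients, is simulated for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 200
class1 = rng.integers(0, 30, n)   # hypothetical Class I eplet mismatch counts
class2 = rng.integers(0, 50, n)   # hypothetical Class II eplet mismatch counts

# Simulate dnDSA with both classes contributing to risk.
p = 1 / (1 + np.exp(-(-2.0 + 0.12 * class1 + 0.03 * class2)))
dndsa = rng.binomial(1, p)

df = pd.DataFrame({"dndsa": dndsa,
                   "class1_per10": class1 / 10,   # rescale so the OR is per 10 eplets
                   "class2_per10": class2 / 10})
fit = smf.logit("dndsa ~ class1_per10 + class2_per10", data=df).fit(disp=0)
print(np.exp(fit.params))   # odds ratios per 10-eplet increase
```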

Conclusion: An increasing number of eplet HLA mismatches is associated with at least a 4-fold increased risk of dnDSA production, and paediatric transplant recipients with any dnDSA experienced at least a 5-fold increased risk of AMR. Further work should examine whether inclusion of eplet HLA matching may better predict long-term graft survival.

Back to Top | Article Outline

OUTCOME OF ABO-INCOMPATIBLE LIVING DONOR KIDNEY TRANSPLANTATION – A SINGLE CENTRE REPORT

TAN Bee Qung, CLAYTON Philip, COATES Toby, RUSS Graeme, CARROLL Robert, and FAULL Randall

Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital

Background: ABO-incompatible (ABOi) kidney transplantation using various desensitization strategies has become an alternative to expand the donor pool and minimize the shortage of kidneys for transplantation. We hereby present the outcome of ABOi kidney transplantation at the Central Northern Adelaide Renal and Transplantation Service (CNARTS) from July 2008 to December 2016.

Methods: A total of 28 adult ABOi kidney transplantations were included in the study. Two recipients transferred their care to an interstate centre within 1 month. Pre-transplantation desensitization protocol consisted of rituximab, plasmapheresis or immunoabsorption, and intravenous immunoglobulin (IVIG), and maintenance immunosuppression consisted of tacrolimus, mycophenolate and prednisolone. Anti-ABO antibody titres were assayed using indirect antiglobulin test column technology. Baseline anti-ABO titre varied from 1:1 to 1:512. A post-desensitization anti-ABO titre of ≤ 1:8 was considered acceptable for transplantation.

Results: 68% of the recipients were blood group O, and 93% had immediate graft function post-transplantation. Mean serum creatinine at 12 months was 127 μmol/L. One graft was lost to antibody-mediated rejection (ABMR) in association with a rebound increase in anti-ABO titre (1:64), which was not salvaged by plasma exchange and eculizumab, leading to transplant nephrectomy on day 9. Graft and patient survival were 96% and 100% at 12 months, respectively. One patient developed BK nephropathy and one patient underwent native nephrectomy for renal cell carcinoma 6 years post-transplantation.

Conclusion: In our hands, ABOi kidney transplantation is a safe alternative treatment for end-stage kidney disease, with favourable short-term outcomes and minimal complications.

Back to Top | Article Outline

HLA-DQA AND B EPLET MISMATCHES AND DE NOVO DONOR-SPECIFIC ANTI-HLA-DQ ANTIBODY

FIDLER Samantha1, IRISH Ashley2, D'ORSOGNA Lloyd1, and LIM Wai3

1Department of Clinical Immunology, PathWest, Fiona Stanley Hospital, 2Department of Nephrology, Fiona Stanley Hospital, Perth, 3Department of Nephrology, Sir Charles Gairdner Hospital, Perth

Aim: To determine the association between human leukocyte antigen (HLA)-DQA and DQB eplet mismatches and development of de novo donor-specific anti-HLA-DQ antibody (DQ-DSA).

Methods: High-resolution four-digit molecular typing of HLA-DQ alleles in a cohort of 264 donor/recipient pairs in Western Australia between 2003 and 2007 was performed by Sanger sequencing. The number of eplet mismatches at the HLA-DQA and HLA-DQB alleles was calculated using HLAMatchmaker. Routine post-transplant Luminex single antigen bead assays were performed to detect de novo DSA. The association between eplet mismatches at the HLA-DQA and HLA-DQB alleles and development of de novo DQ-DSA was examined using logistic regression, and the accuracy of these mismatches in predicting DQ-DSA was determined by receiver operating characteristic area under the curve (AUC).

Results: Of the 264 recipients, 60 (23%) developed Class II DSA, with 44/60 (73%) being DQ-DSA. The median (25th–75th percentile) numbers of eplet mismatches at the HLA-DQA and DQB alleles were 5 (0–15) and 9 (1–13), respectively. Compared with 0–2 HLA-DQA eplet mismatches, 3–10, 11–20 and >20 eplet mismatches were associated with odds ratios of 2.81 (95%CI 0.96, 8.27; p=0.06), 3.49 (95%CI 1.30, 9.36; p=0.01) and 7.91 (95%CI 1.31, 47.93; p=0.02), respectively. There was no association between incremental HLA-DQB eplet mismatches and DQ-DSA (0–2: referent; 3–10: 1.87 [95%CI 0.65, 5.38; p=0.24]; 11–20: 1.80 [95%CI 0.61, 5.35; p=0.29]; >20: 2.37 [95%CI 0.41, 13.87; p=0.34]). The AUC for HLA-DQA eplet mismatches in predicting DQ-DSA was higher than for HLA-DQB eplet mismatches (0.67 vs. 0.63; p=0.35).
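
The banded odds ratios above can be computed directly from 2×2 counts. The sketch below shows the standard Woolf (log-OR) confidence interval on invented counts, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """2x2 table: a/b = DSA+/DSA- in the exposed band, c/d in the referent band."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * 1.96 * se) for s in (-1, 1))
    return or_, lo, hi

# Invented counts for, say, the 11-20 HLA-DQA mismatch band vs 0-2 referent;
# chosen so the OR is of the same order as the 3.49 reported above.
print(odds_ratio_ci(a=20, b=40, c=8, d=56))
```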

Conclusion: There is a stronger association between HLA-DQA eplet mismatches and development of DQ-DSA compared to HLA-DQB mismatches. This study suggests that donor/recipient HLA-DQA typing should be routinely performed prior to kidney transplantation.

Figure

Back to Top | Article Outline

THE TREATMENT OF ANTIBODY-MEDIATED REJECTION IN KIDNEY TRANSPLANT RECIPIENTS – AN UPDATED SYSTEMATIC REVIEW

WAN Susan1,2, YING Tracey1,2, WYBURN Kate1,2, and CHADBAN Steve1,2

1Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 2Sydney Medical School, University of Sydney

Aims: To determine the efficacy and recent trends in treatment of acute antibody-mediated rejection (AMR) in kidney transplant recipients.

Background: A range of therapies have been trialed for the management of AMR, an important cause of graft loss. A previous systematic review reported weak evidence of benefit for Rituximab and Bortezomib.

Methods: We conducted a systematic review of controlled trials in paediatric and adult kidney transplant recipients using Medline, EMBASE and CENTRAL from inception to November 2016. We assessed outcomes of graft failure and adverse events.

Results: Of 14,318 citations, we identified 19 controlled trials (7 new since 2011), of which 11 were non-randomised. One trial involved paediatric recipients with a mix of acute cellular rejection and AMR.

Trials evaluated plasma exchange (7), immunoadsorption columns (1), Bortezomib vs Rituximab (4), Rituximab (5) and C1-esterase inhibition (2) (Table 1). Plasma exchange and IVIG were frequently used as standard of care. 4 of 19 trials demonstrated a trend towards overall benefit with respect to graft survival; however, these studies were heterogeneous in intervention type, definition of AMR, quality and follow-up. Adverse events were reported in variable detail, with a trend towards under-reporting.

TABLE 1

Conclusions: Since 2011, further studies of Rituximab and Bortezomib therapies have not shown a clear benefit in the treatment of AMR. However, studies were generally under-powered and of low quality.

The optimal treatment of AMR remains a priority for future RCTs, which should focus on high quality data addressing clinically-relevant outcomes.

Back to Top | Article Outline

OUTCOMES IN ABO-INCOMPATIBLE LIVING KIDNEY TRANSPLANT PATIENTS – SINGLE CENTRE EXPERIENCE

LAI Sum Wing Christina1, WAN Susan1,2, UTSIWEGOTA Mike1, and WYBURN Kate1,2

1Renal & Transplantation Unit, Royal Prince Alfred Hospital, Sydney, 2University of Sydney

Background: ABO-incompatible kidney transplantation (ABOi-KTx) is an established practice that was introduced to expand the donor pool. Here we report medium term ABOi-KTx outcomes from a single centre.

Methods: We performed a retrospective study including all ABOi-KTx recipients between July 2007–July 2016, compared with living and deceased donor ABO-compatible kidney transplants. Cox proportional hazards survival analysis and logistic regression were used to assess outcomes, including: patient and graft survival, graft function, rejection, and development of de-novo donor-specific antibodies (DSA).

Results: 59 ABOi-KTx and 131 ABO-compatible kidney transplants were performed during the study period; 121 (64%) of the total cohort were male, with a mean age of 48 (SD 14.2) years. The ABOi group had fewer male donors (p=0.021) and fewer patients with pre-transplant DSA (p=0.027). 31 (53%) ABOi-KTx were blood group A to O. The median pre-desensitisation ABO antibody titre was 1:16 (IQR 1:4-1:32) and the median post-desensitisation titre was 1:1 (IQR 1:1-1:2). Desensitisation regimens in the ABOi group comprised IVIg/immunoadsorption (n=33), Rituximab/IVIg/immunoadsorption (n=18), IVIg/immunoadsorption/plasmapheresis (n=6), immunoadsorption/plasmapheresis (n=1) and IVIg alone (n=1). Graft and patient survival of ABOi-KTx were 92.9% and 83.8% at 5 years post-transplant. There was no significant difference in graft or patient survival, serum creatinine, antibody- or cell-mediated rejection, or development of de-novo DSA when ABOi-KTx was compared with ABO-compatible transplants.

Conclusions: These results support the use of ABOi-KTx as a safe alternative, given similar outcomes compared to ABO-compatible transplantation.

Back to Top | Article Outline

THE ASSOCIATION BETWEEN DE NOVO DONOR SPECIFIC ANTIBODIES AND ADHERENCE IN ADOLESCENT KIDNEY TRANSPLANT RECIPIENTS

AMIR Noa1,2,3, MACKIE Fiona1,3, SCAMMEL Rebecca4, WATSON Narelle4, HAHN Deirdre2,5, and KENNEDY Sean3,1

1Department of Nephrology, Sydney Children's Hospital, 2Department of Nephrology, The Children's Hospital at Westmead, Sydney, 3School of Women's & Children's Health, University of New South Wales, Sydney, 4Tissue Typing Laboratory, Australian Red Cross Blood Service, 5Centre for Kidney Research, The Children's Hospital at Westmead, Sydney

Aims: To describe the prevalence of de novo donor specific antibodies (DSA) in adolescents after kidney transplants and to determine the factors which may contribute to their development.

Methods: A cross-sectional study of recipients aged 10 to 18 years managed at two Australian paediatric centres. HLA IgG antibodies were detected using Luminex, and a mean fluorescence intensity (MFI) >500 was considered positive. The coefficient of variation (CoV) for immunosuppressant levels (tacrolimus, cyclosporine or sirolimus) was calculated from the 12 readings prior to antibody testing. Other factors analysed were recipient age and gender, donor type, HLA matching, time since transplant and preceding acute rejection.
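
The CoV adherence proxy is straightforward to compute: the sample standard deviation of the trough levels divided by their mean. Below is a minimal sketch on invented trough levels (12 readings each), not patient data.

```python
import numpy as np

def coefficient_of_variation(levels):
    """CoV (%) of drug trough levels, using the sample SD (ddof=1)."""
    levels = np.asarray(levels, dtype=float)
    return levels.std(ddof=1) / levels.mean() * 100

# Hypothetical 12-reading series: one stable, one erratic.
steady  = [6.1, 5.8, 6.4, 6.0, 5.9, 6.2, 6.1, 5.7, 6.3, 6.0, 5.9, 6.2]
erratic = [3.1, 8.4, 2.5, 9.0, 4.2, 7.7, 2.9, 8.8, 3.6, 7.1, 4.0, 9.3]
print(f"stable series CoV:  {coefficient_of_variation(steady):.0f}%")
print(f"erratic series CoV: {coefficient_of_variation(erratic):.0f}%")
```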

Results: The study included 25 recipients, mean age 15 ± 2.5 years, mean time since transplant 70.2 ± 43.1 months. De novo Class II DSAs were present at MFI >500 in 8 (32%); 6 of these also had Class I DSAs. The MFI for individual Class II DSAs ranged from 815 to 22,619; the maximum MFI for Class I was 1,282. The only differences between recipients with no DSA and those with de novo Class II DSA were the CoV for immunosuppressant levels (20% vs. 42%, respectively; p=0.009) and the number of HLA mismatches (2.3 ± 1.2 vs. 4.3 ± 1.5, respectively; p=0.002). Both factors were significantly associated with de novo DSA in multiple logistic regression analysis.

Conclusions: De novo Class II DSA are relatively common in adolescents. Independent risk factors are HLA mismatches and poorer adherence, as assessed by variability of immunosuppressant levels.

Back to Top | Article Outline

Outcome Measures

A SYSTEMATIC REVIEW AND META-ANALYSIS OF COLD IN SITU PERFUSION AND PRESERVATION PRIOR TO PANCREAS AND LIVER PROCUREMENT: TIME FOR A UNIFIED APPROACH

HAMEED AM1,2,3, PLEASS HC1,3,4, and HAWTHORNE WJ1,2,3

1Department of Surgery, Westmead Hospital, Sydney, 2Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 3School of Medicine, University of Sydney, 4Department of Surgery, Royal Prince Alfred Hospital, Sydney

Aims: There is currently no uniform consensus regarding in situ cold perfusion and preservation for pancreas and liver procurement. We therefore aimed to identify the ideal in situ perfusion route(s), volume and preservation solution(s) for pancreas and liver procurement.

Methods: The Embase, Medline and Cochrane databases were utilized. Short and longer-term transplantation outcomes were separately analyzed for the pancreas and liver using semi-quantitative methods and/or meta-analyses. Comparator groups included: dual versus aortic-only perfusion, University of Wisconsin (UW) compared to histidine-tryptophan-ketoglutarate (HTK) and/or Celsior perfusion/preservation solutions, and venous versus arterio-venous hepatic back-table flushing.
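
Where meta-analysis was possible, pooled effects of the kind reported in the results below typically come from inverse-variance weighting of study-level log risk ratios. This sketch shows that machinery on invented study inputs and is not the authors' analysis; a random-effects model would add a between-study variance term.

```python
import math

def pool_fixed_effect(log_rrs, ses):
    """Fixed-effect (inverse-variance) pooling of log risk ratios."""
    weights = [1 / se**2 for se in ses]
    pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    rr = math.exp(pooled)
    ci = (math.exp(pooled - 1.96 * se_pooled),
          math.exp(pooled + 1.96 * se_pooled))
    return rr, ci

# Three hypothetical studies of graft pancreatitis, HTK vs UW; values chosen
# so the pooled RR is of the same order as the 2.16 reported below.
log_rrs = [math.log(2.4), math.log(1.9), math.log(2.2)]
ses = [0.35, 0.50, 0.28]
print(pool_fixed_effect(log_rrs, ses))
```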

Results: Forty-one articles were included (959 pancreas and 3,358 liver transplants). HTK compared to UW aortic-only pancreas perfusion resulted in a higher peak lipase (standardized mean difference 0.47, 95% CI 0.23-0.71) and graft pancreatitis rate (risk ratio 2.16, 95% CI 1.29-3.60). Median one-year pancreas graft survivals were 90%, 81% and 95.9% in the UW, HTK and Celsior groups, respectively. Initial graft function, thrombotic graft loss rates, biliary complications and one-year survival in hepatic allografts were no different for livers perfused/preserved with UW, HTK or Celsior. Hepatic aortic-only compared to dual perfusion provides advantages in terms of time and cost savings with no detriment to graft outcomes.

Conclusions: Short-term whole-organ pancreas graft outcomes are better when UW is utilized compared to HTK, whilst HTK may also impact graft survival. However, there is no significant difference between UW, HTK or Celsior for liver perfusion and preservation. Aortic-only perfusion with UW provides a more universally applicable technique for multi-organ procurement.

Back to Top | Article Outline

IMPACT OF EARLY CONVERSION FROM CYCLOSPORIN TO EVEROLIMUS ON LEFT VENTRICULAR MASS INDEX (LVMI): A RANDOMISED CONTROLLED TRIAL

LIM Wai1, KRISHNAN Anoushka1, TEXEIRA-PINTO Armando2, CHAN Doris1, CHAKERA Aron1, DOGRA Gursharan1, BOUDVILLE Neil3, MORGAN Kelly1, PHILLIPS Jessica1, IRISH Ashley4, and WONG Germaine5

1Sir Charles Gairdner Hospital, Perth, 2University of Sydney, 3University of Western Australia, Perth, 4Fiona Stanley Hospital, 5Westmead Hospital, Sydney

Aim: To determine the effect of cyclosporin and everolimus-based immunosuppressive regimens on LVMI.

Methods: This was an 18-month prospective randomised controlled trial (RCT) comparing early conversion (at 3–4 months post-transplant) from cyclosporin to everolimus with mycophenolic acid (E-MPA) against continued cyclosporin with mycophenolic acid (CsA-MPA), with LVMI at 3 and 18 months post-transplant as the primary outcome. Secondary outcomes included estimated glomerular filtration rate (eGFR), viral infection, adverse events and drug discontinuation.

Results: Of 32 patients recruited between 2010 and 2012 in Western Australia, 24 eligible patients were randomised 1:1 to either treatment group at 3–4 months post-transplant. There were no significant differences in mean (SD) LVMI at 3 months (51.6±18.5 vs. 53.7±15.7 g/m2.7) or 18 months (52.7±16.3 vs. 51.7±16.8 g/m2.7) between the CsA-MPA and E-MPA groups. Compared with CsA-MPA, randomisation to E-MPA was associated with a change in LVMI of -0.97 g/m2.7 (95%CI -16.13 to 14.18, p=0.895). Mean eGFR increased by +11 ml/min/1.73m2 between 3 and 18 months post-transplant in the E-MPA group, compared to +3 ml/min/1.73m2 in the CsA-MPA group (p=xxx). The incidence of viral infections was lower in the E-MPA than the CsA-MPA group (8% vs. 50%, p=0.02), but the incidences of acute rejection, adverse events and drug discontinuation were similar between groups.

Conclusion: An immunosuppressive regimen comprising early conversion from cyclosporine to everolimus was not associated with regression of LVMI, although a lower risk of viral infections was observed in the conversion arm. There is insufficient evidence to suggest that everolimus should be the preferred immunosuppressive agent in kidney transplant recipients with an unfavourable cardiovascular risk profile.

Back to Top | Article Outline

BARRIERS TO IMMUNOSUPPRESSANT MEDICATION ADHERENCE: AN OBSERVATIONAL STUDY IN ADULT RENAL TRANSPLANT PATIENTS

COSSART Amelia1, COTTRELL Neil1, CAMPBELL Scott2, ISBEL Nicole2, and STAATZ Christine3

1School of Pharmacy, University of Queensland, Brisbane, 2Department of Nephrology, Princess Alexandra Hospital, Brisbane, 3Department of Pharmacy, University of Queensland, Brisbane

Background: Immunosuppressant medication non-adherence can result in kidney graft rejection. The aim of this observational study was to determine the prevalence of non-adherence to immunosuppressant medications in a local adult renal transplant cohort, and investigate barriers to adherence.

Methods: Kidney transplant patients completed a self-report survey consisting of five validated questionnaires (Basel Assessment of Adherence Immunosuppression Scale (BAASIS), Beliefs about Medicines Questionnaire, Immunosuppressant Therapy Barrier Scale, Brief-Illness Perception Questionnaire (Brief-IPQ), and Multidimensional Health Locus of Control Scale), and sociodemographic information. Adherence was categorised according to BAASIS responses, and patient beliefs and sociodemographic characteristics were compared between the groups.

Results: A total of 161 patients completed the survey. Eighty-six participants (55%) were categorised as non-adherent, with 44% delaying doses and 26% skipping doses. Non-adherent patients reported more barriers to adherence (p=0.02), were less able to determine whether their medication(s) were helping them (p=0.02), and were more likely to forget doses (p=0.005), or to skip doses when their daily routine changed (p<0.001) or when short of money (p=0.03). Additionally, non-adherent patients had less self-reported understanding of their graft than adherent patients (p=0.008, Brief-IPQ). Adherence was not associated with patients' medicine beliefs or locus of control (Table 1).

TABLE 1

Conclusions: Over half of the adult kidney transplant patients self-reported non-adherence to their immunosuppressant medication. The main barriers to adherence were forgetfulness and missing doses when the daily routine changed. Personalised interventions focused on habit formation may support medication adherence in this population.

Back to Top | Article Outline

OUTCOME OF RENAL TRANSPLANT RECIPIENTS WHEN RETURNING TO DIALYSIS

YEAP Chii Yeat, CLAYTON Phillip, and COATES Toby

Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital

Aims: The outcomes of patients returning to dialysis after graft failure remain unclear. Some studies have shown higher mortality compared with incident dialysis patients without a previous transplant, especially in the first 12 months after dialysis initiation. The leading causes of death in these patients are cardiovascular events and infections, which are associated with modifiable non-immunological factors arising from poorly controlled chronic kidney disease (CKD) complications. This study aims to investigate patient survival on dialysis after graft failure and clinical parameters at dialysis initiation.

Methods: Data from Central and Northern Adelaide Renal and Transplantation Service (CNARTS) were used to determine the haemoglobin, ferritin, transferrin saturation, serum albumin, parathyroid hormone (PTH), estimated glomerular filtration rate (eGFR) and type of vascular access among 48 patients with failed kidney transplants who initiated dialysis between January 2012 and June 2016.

Results: At dialysis initiation, the mean haemoglobin, ferritin, transferrin saturation, serum albumin, PTH and eGFR were 93 g/L, 388 µg/L, 29%, 28 g/L, 61.2 pmol/L, and 7.9 mL/min/1.73 m2 respectively. More than 50% of the 8 deaths during the study period occurred in the first 6 months after dialysis initiation, and 60% of deaths were from cardiovascular events.

Conclusions: Despite being known to specialty physicians, patients with failed kidney transplants initiate dialysis with levels of haemoglobin, serum albumin, PTH and eGFR that may be suboptimal, and with high rates of initiation via dialysis catheters. Improved CKD care may improve the outcomes of this unique subgroup of patients.

Back to Top | Article Outline

THE WEEKEND EFFECT: ANALYSING TEMPORAL TRENDS IN SOLID ORGAN DONATION AND TRANSPLANT

CHANG Nicholas1, KELLY Patrick1, WYBURN Kate2,3, O'LEARY Michael2,4,5, HEDLEY James1, ROSALES Brenda1, and WEBSTER Angela1,6

1School of Public Health, University of Sydney, 2Sydney Medical School, University of Sydney, 3Renal Unit, Royal Prince Alfred Hospital, Sydney, 4NSW Organ and Tissue Donation Service, 5Intensive Care Service, Royal Prince Alfred Hospital, Sydney, 6Centre for Kidney Research, Westmead Hospital, Sydney

Aims: Some research suggests that hospital patients admitted or treated on the weekend experience poorer outcomes and higher mortality. Recent studies in the US indicate that although weekend referral does not influence transplant recipient outcomes, it does adversely affect organ retrieval rates. We sought to characterize the effect of day-of-week and weekday/weekend status on progression from referral to donation in NSW.

Methods: We retrospectively reviewed all NSW Organ and Tissue Donation Service logs for 2010–2015. We compared donation outcomes (potential, intended and actual donation) of referrals by day-of-week and by weekday/weekend status (Saturday/Sunday referral), excluding family refusals. Temporal trends in donation were evaluated using logistic regression models incorporating random effects.

Results: Of 2,975 total referrals, 2,306 (1,640 potential, 94 intended, 572 actual donors) were not refused by the family of the deceased; 1,804 of these were weekday referrals and 502 were weekend referrals. Weekend referrals were not significantly more likely than weekday referrals to progress to actual donation (adjusted odds ratio 1.18; 95% CI 0.83-1.69), and there was no significant variation in actual donation rates by day-of-week. Across all 2,975 referrals, family refusal rates were similar between weekend and weekday referrals (adjusted odds ratio 1.08; 95% CI 0.85-1.36) and across days of the week.
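
A simplified sketch of the weekend-versus-weekday comparison on simulated referrals. The study used logistic regressions incorporating random effects; this plain logistic sketch omits the random-effects structure (e.g. a random intercept per hospital), and all numbers are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2306
weekend = rng.binomial(1, 0.22, n)                 # ~22% weekend referrals
donated = rng.binomial(1, 0.25 + 0.01 * weekend)   # near-null effect, as found

df = pd.DataFrame({"donated": donated, "weekend": weekend})
fit = smf.logit("donated ~ weekend", data=df).fit(disp=0)
print(np.exp(fit.params["weekend"]),                    # OR (cf. the 1.18 reported)
      np.exp(fit.conf_int().loc["weekend"]).values)     # 95% CI
```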

Conclusions: In NSW, neither day-of-week nor weekday/weekend status of donor referral had a significant effect on the rate of progression from referral to actual donation or the rate of family refusal. These findings stand in contrast to results from the US.

TABLE 1

Back to Top | Article Outline

TOWARD ESTABLISHING CORE OUTCOME DOMAINS FOR TRIALS IN KIDNEY TRANSPLANTATION: STANDARDISED OUTCOMES IN NEPHROLOGY – KIDNEY TRANSPLANTATION (SONG-TX) CONSENSUS WORKSHOPS

TONG Allison1,2, GILL John3, BUDDE Klemens4, MARSON Lorna5, REESE Peter6, ROSENBLOOM David7, ROSTAING Lionel8, WONG Germaine9, JOSEPHSON Michelle10, PRUETT Timothy11, WARRENS Anthony12, CRAIG Jonathan13, SAUTENET Benedicte13, EVANGELIDIS Nicole13, RALPH Angelique2, HANSON Camilla13, SHEN Jenny14, HOWARD Kirsten13, MEYER Klemens15, PERRONE Ronald15, WEINER Daniel15, FUNG Samuel16, MA Maggie17, ROSE Caren3, RYAN Jessica18, HOWELL Martin13, LARKINS Nick13, KIM Siah13, JU Angela13, and CHAPMAN Jeremy9

1Sydney School of Public Health, The University of Sydney, 2Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 3Division of Nephrology, University of British Columbia, 4Department of Nephrology, Charité - Universitätsmedizin Berlin, 5Transplant Unit, University of Edinburgh, 6Renal Division, University of Pennsylvania Perelman School of Medicine, 7ESRD Network 18, ESRD Network 18, 8Clinique Universitaire de Nephrologie, CHU Michallon, 9Centre for Transplant and Renal Research, Westmead Hospital, 10Department of Medicine, The University of Chicago, 11Department of Surgery, University of Minnesota, 12School of Medicine and Dentistry, Queen Mary University of London, 13School of Public Health, University of Sydney, 14Department of Nephrology, Los Angeles Biomedical Research Insitute at Harbor-UCLA Medical Center, 15William B. Schwartz Division of Nephrology, Tufts Medical Center, 16Jockey Club Nephrology & Urology Centre, Princess Margaret Hospital, 17The University of Hong Kong, Queen Mary Hospital, 18Department of Nephrology, Monash Medical Centre

Aims: Shared decision-making in kidney transplantation requires patients and clinicians to balance the risks of mortality, graft survival, medical comorbidities, symptoms, and quality of life. However, the heterogeneity and lack of patient-relevant outcomes across trials make these trade-offs uncertain, hence the need for a core outcome set that reflects stakeholder priorities.

Methods: We convened two international SONG-Tx stakeholder consensus workshops in Boston (17 patients/caregivers; 52 health professionals) and Hong Kong (10 patients/caregivers; 45 health professionals). In facilitated breakout groups, participants discussed the development and implementation of core outcome domains for trials in kidney transplantation.

Results: We identified seven themes. Clarifying the paramount importance of graft outcomes encompassed the prevailing fear of dialysis, distilling the meaning of graft function, and acknowledging the terrifying and ambiguous terminology of rejection. Reflecting critical trade-offs between graft health and medical comorbidities was fundamental. Contextualising mortality explained discrepancies in the prioritisation of death among stakeholders – inevitability of death (patients), preventing premature death (clinicians), and ensuring safety (regulators). Imperative of capturing patient-reported outcomes was driven by making explicit patient priorities, fulfilling regulatory requirements, and addressing life participation. Specificity to transplant; feasibility and pragmatism (long-term impacts and responsiveness to interventions); and recognising the gradients of severity within outcome domains were raised as considerations.

Conclusions: Stakeholders support the inclusion of graft, mortality, medical, and patient-reported outcomes for a relevant and comprehensive core outcomes set for decision making in kidney transplantation. Addressing obscure terminology, transplant-specificity, and feasibility may be needed in establishing core outcomes in trials in kidney transplantation.

Back to Top | Article Outline

ESTABLISHMENT OF A TOOL TO PREDICT 5Y GRAFT AND PATIENT SURVIVAL FOLLOWING RENAL TRANSPLANTATION IN PATIENTS WITH TYPE 2 DIABETES MELLITUS (T2DM)

GOODMAN David1, ULLAH Shahid2,3, and MCDONALD Stephen2,3

1Department of Nephrology, St Vincent's Hospital, Melbourne, 2Central Northern Adelaide Renal and Transplantation Service, ANZDATA, 3Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital

Aims: Current TSANZ guidelines recommend 5-year patient survival of ≥80% before listing for deceased donor kidney transplantation. We aim to generate a simple tool to estimate graft and patient survival based on known risk factors prior to transplantation.

Methods: Data for all Australian and New Zealand transplant recipients aged 18 years or older who received a first transplant between 2005 and 2014 were extracted from the ANZDATA database. Of the 7,187 transplants performed, 1,036 were in T2DM recipients. Detailed demographics and medical comorbidities at the time of transplantation were utilized in the analysis, including age at transplant, gender, BMI, smoking status, history of coronary artery disease (CAD), cerebrovascular disease (CVD) and peripheral vascular disease (PVD), indigenous status, duration on dialysis, live-v-deceased donor kidney and HLA mismatches. A multivariate Cox proportional hazards model was used to predict graft and patient survival. The relative risk for each factor was used to build an algorithm calculating the risk of graft loss and patient death within 5 years of transplantation.
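
As an illustration only, a risk calculator of this kind can be sketched in a few lines of Python with the lifelines library; the toy records and column names below are hypothetical stand-ins for the ANZDATA covariates listed above, not the actual tool:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy stand-in for the registry extract; real covariates per the abstract
# also include CAD/CVD history, BMI, donor type and HLA mismatches.
df = pd.DataFrame({
    "years":    [4.2, 5.0, 1.1, 5.0, 3.7, 2.5, 5.0, 0.8],
    "died":     [1,   0,   1,   0,   1,   0,   0,   1],
    "age":      [50,  70,  62,  45,  58,  39,  66,  71],
    "smoker":   [1,   0,   1,   0,   0,   0,   1,   1],
    "pvd":      [1,   0,   0,   0,   1,   0,   0,   1],
    "dial_yrs": [4,   2,   6,   1,   3,   1,   5,   7],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")

# Risk of death within 5 years for a new pre-transplant profile
profile = pd.DataFrame([{"age": 50, "smoker": 1, "pvd": 1, "dial_yrs": 4}])
surv_5y = cph.predict_survival_function(profile, times=[5.0]).iloc[0, 0]
print(f"Predicted 5-year risk of death: {1 - surv_5y:.1%}")
```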

Results: The 5-year risks of death for two example patients with T2DM were 14.4% and 14.0% respectively. Patient 1 was a 50-year-old male smoker with PVD, CAD and CVD and 4 years on dialysis; patient 2 was a 70-year-old female non-smoker with no CAD, CVD or PVD. Both patients had predicted survival within the TSANZ guideline threshold (Table 1).

TABLE 1

Conclusions: This simple tool can assist clinicians in estimating patient and graft survival following kidney transplantation based on Australian and New Zealand data.

Back to Top | Article Outline

IMPROVED OUTCOMES FOLLOWING KIDNEY TRANSPLANTATION FOR PATIENTS WITH TYPE 2 DIABETES MELLITUS (T2DM)

GOODMAN David1, ULLAH Shahid2,3, and MCDONALD Stephen2,3

1Department of Nephrology, St Vincent's Hospital, Melbourne, 2Central Northern Adelaide Renal and Transplantation Service, ANZDATA, 3Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital

Aims: The prevalence of T2DM as a cause of end-stage renal failure has increased over the past 20 years. This study compares graft and patient survival between kidney transplant recipients with and without T2DM.

Methods: Data for all Australian and New Zealand transplant recipients aged 18 years or older who received a first transplant between 1995 and 2014 were extracted from the ANZDATA database. The 20 years of data were divided into four 5-year cohorts. A multivariate Cox proportional hazards model was used to predict graft and patient survival.
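
The abstract's era comparisons come from Cox models; as a simpler, unadjusted sketch of the same idea, the cohorting and a log-rank comparison of the last two eras might look like this in Python (lifelines), with illustrative toy data:

```python
import pandas as pd
from lifelines.statistics import logrank_test

# Illustrative records: transplant year, graft survival (years), failure flag
df = pd.DataFrame({
    "tx_year":     [1996, 2001, 2006, 2007, 2011, 2012, 2013, 2014],
    "graft_years": [3.1, 8.0, 6.2, 4.5, 4.0, 2.2, 3.5, 1.0],
    "graft_lost":  [1, 0, 1, 0, 0, 1, 0, 0],
})

# Divide the 20-year window into four 5-year cohorts
df["era"] = pd.cut(df["tx_year"], bins=[1994, 1999, 2004, 2009, 2014],
                   labels=["1995-99", "2000-04", "2005-09", "2010-14"])

# Compare the last two eras, as in the abstract
a = df[df["era"] == "2005-09"]
b = df[df["era"] == "2010-14"]
result = logrank_test(a["graft_years"], b["graft_years"],
                      event_observed_A=a["graft_lost"],
                      event_observed_B=b["graft_lost"])
print(f"log-rank p = {result.p_value:.2f}")
```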

Results: Of the 12,229 patients transplanted, almost 12% (1,404) had T2DM. The proportion of patients with T2DM who received transplants has climbed from 1.4% (1995) to 12.4% (2014). Almost 50% of all T2DM transplants occurred in the last 5 years and over 74% in the last 10 years. Over the past 20 years there has been a progressive improvement in both patient and graft survival for T2DM and non-T2DM recipients. The improvement in graft (p=0.33) and patient survival (p=0.91) in non-T2DM recipients appears to have plateaued when 2005-2009 is compared to 2010-2014, whereas T2DM recipients showed a significant improvement between the two cohorts for both graft survival (p=0.04) and patient survival (p=0.03).

Conclusions: The proportion of patients with T2DM has increased nine-fold over the past 20 years. The increased number of transplants to T2DM recipients has been accompanied by a progressive improvement in both graft and patient survival. Unlike non-T2DM recipients, in whom the improvement has plateaued, T2DM recipients show continued improvement.

Back to Top | Article Outline

RISK FACTORS THAT INFLUENCE GRAFT AND PATIENT SURVIVAL FOLLOWING RENAL TRANSPLANTATION IN PATIENTS WITH TYPE 2 DIABETES MELLITUS (T2DM)

GOODMAN David1, ULLAH Shahid2,3, and MCDONALD Stephen2,3

1Department of Nephrology, St Vincent's Hospital, Melbourne, 2Central Northern Adelaide Renal and Transplantation Service, ANZDATA, 3Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital

Aims: The number of patients with end-stage renal failure and T2DM continues to climb. This study compares the known risk factors at the time of transplantation between patients with and without T2DM and their influence on transplant outcomes.

Methods: Data for all Australian and New Zealand transplant recipients aged 18 years or older who received a first transplant between 2005 and 2014 were extracted from the ANZDATA database. Covariates included age at transplant, gender, BMI, smoking status, history of coronary artery disease (CAD), cerebrovascular disease (CVD) and peripheral vascular disease (PVD), indigenous status, duration on dialysis, live-v-deceased donor kidney and HLA mismatches. A multivariate Cox proportional hazards model was used to predict graft and patient survival.
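
One way to compare risk factors between the two groups, shown here purely as a sketch with hypothetical data, is to fit a separate Cox model per diabetes status and inspect the hazard ratios (Python, lifelines):

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented extract; one row per recipient
df = pd.DataFrame({
    "t2dm":    [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1],
    "years":   [5.0, 3.2, 8.0, 1.5, 6.0, 2.8, 4.1, 2.0, 6.3, 0.9, 3.5, 1.7],
    "failed":  [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1],
    "rec_age": [45, 60, 63, 48, 52, 58, 55, 70, 49, 63, 68, 52],
    "pvd":     [0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 1],
})

# Fit one model per diabetes status so hazard ratios can be compared
for grp, sub in df.groupby("t2dm"):
    cph = CoxPHFitter().fit(sub[["years", "failed", "rec_age", "pvd"]],
                            duration_col="years", event_col="failed")
    label = "T2DM" if grp else "non-T2DM"
    print(label, cph.hazard_ratios_.round(2).to_dict())
```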

Results: T2DM recipients were on average 8.5 years older, had greater obesity (BMI>30: 41.9-v-20.6%) and more CAD (42-v-13%), CVD (9.8-v-4.4%) and PVD (26.8-v-4.7%). Indigenous Australians were more common in the T2DM group (10.4-v-2%). T2DM recipients had longer dialysis times (2.9-v-2.1 years) and were more likely to receive a deceased donor kidney (74.7-v-59.2%) with poorer tissue matching than non-T2DM recipients.

Statistically significant factors influencing graft survival included recipient age (in non-T2DM but not T2DM recipients) and donor age (in both groups). Of the medical co-morbidities, only PVD was a significant risk factor in T2DM recipients, whereas smoking, CAD and PVD were risk factors for non-T2DM recipients. Risk of graft failure was increased almost two-fold in Indigenous Australian recipients. Longer dialysis duration was a risk factor for both groups. A similar pattern was seen with factors influencing patient survival. Again, PVD was the only significant co-morbidity influencing patient survival in the T2DM group. In contrast, smoking, CAD and PVD were all significant risk factors in non-T2DM recipients.

Conclusions: The risk factors that influence graft survival vary between T2DM and non-T2DM recipients and these differences should be taken into consideration when evaluating patients for kidney transplantation.

Back to Top | Article Outline

ALL-CAUSE MORTALITY AFTER KIDNEY ALLOGRAFT LOSS

LIM Wai1, WONG Germaine2, CHADBAN Steve3, PILMORE Helen4, CLAYTON Phil5, and MCDONALD Stephen5

1Sir Charles Gairdner Hospital, Perth, 2Westmead Hospital, Sydney, 3Royal Prince Alfred Hospital, Sydney, 4Auckland City Hospital, 5Royal Adelaide Hospital

Aim: To determine the association between retransplantation and mortality after kidney allograft loss.

Methods: Using the Australia and New Zealand Dialysis and Transplant (ANZDATA) registry, the association between study groups (patients who were retransplanted vs. not retransplanted [i.e. remained on dialysis] after allograft loss) and all-cause mortality was examined using adjusted Cox regression analysis. Incidences of all-cause and cause-specific mortality in each study group were calculated.

Results: Of 5770 patients with failed primary kidney allografts between 1980–2014, 2330 (40.4%) were retransplanted. Retransplanted patients were younger (mean [SD] 35.4 [13.3] vs. 46.5 [14.7] years, p<0.01) at graft loss, but had shorter mean first allograft duration (6.0 [6.4] vs. 7.0 [7.0] years, p<0.01) compared to those who were not retransplanted. Chronic allograft nephropathy was the commonest cause of graft loss in patients who were and were not retransplanted (53% vs. 56%), but graft loss attributed to acute rejection (21% vs. 14%) or vascular complications (9% vs. 5%) was more common in retransplanted patients (p<0.01). Of patients who were not retransplanted, 76% died within 5 years of graft loss, compared to 17% of those who were retransplanted (p<0.01). Cardiovascular disease was the most frequent cause of mortality after graft loss for patients who were and were not retransplanted (32% vs. 38%), whereas cancer mortality was almost 4-times as common in patients who were retransplanted (15% vs. 4%; p<0.01). Compared to patients who were not retransplanted, retransplanted patients were less likely to die following graft loss, with a hazard ratio (HR) of 0.22 (95%CI 0.20, 0.24), adjusted for age at graft loss, donor type, duration of first graft and comorbidities. Following exclusion of patients who had died within the first year post-graft loss, the adjusted HR for all-cause mortality was 0.27 (95%CI 0.24, 0.30; Kaplan-Meier survival curve shown below).
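
The sensitivity analysis excluding first-year deaths is a landmark analysis; a minimal sketch in Python (lifelines), with invented toy records rather than ANZDATA data, might look like:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented rows: follow-up from graft loss (years), death flag,
# retransplantation status and age at graft loss
df = pd.DataFrame({
    "years":       [0.4, 2.1, 6.5, 1.2, 9.0, 3.3, 5.5, 7.7, 4.4, 2.9],
    "died":        [1,   1,   0,   1,   0,   1,   0,   1,   0,   1],
    "retx":        [0,   0,   1,   0,   1,   0,   0,   1,   1,   0],
    "age_at_loss": [55,  62,  34,  36,  30,  49,  66,  57,  41,  53],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
print(cph.hazard_ratios_["retx"])  # HR for retransplantation vs dialysis

# Sensitivity analysis excluding first-year deaths: keep only subjects
# still at risk at 1 year and restart the clock at that landmark
lm = df[df["years"] > 1.0].copy()
lm["years"] -= 1.0
cph_lm = CoxPHFitter().fit(lm, duration_col="years", event_col="died")
print(cph_lm.hazard_ratios_["retx"])
```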

Conclusion: Following failed kidney allografts, patients who were not retransplanted were at a greater risk of mortality compared to those who were retransplanted, with over 75% of deaths occurring within 5 years of allograft loss.

Figure

Back to Top | Article Outline

QUADRICEPS STRENGTH AND INTENSIVE CARE DURATION ARE INDEPENDENT PREDICTORS OF POST-HEART TRANSPLANT SIX MINUTE WALK DISTANCE

GOUGH Lauren1,2, MCKENZIE Scott3, YERKOVICH Stephanie4, KELLY Rebecca1,5, WONG Yee Weng6, HING Wayne7, JAVORSKY George5, and WALSH James1,5,8

1Physiotherapy, Prince Charles Hospital, Brisbane, 2Physiotherapy, Bond University, Gold Coast, Queensland, 3Discipline of Medicine, Advanced Heart Failure and Cardiac Transplant Unit, The Prince Charles Hospital, Brisbane, Queensland, 4Statistics, Queensland Lung Transplant Service, The Prince Charles Hospital, Brisbane, Queensland, 5Physiotherapy, Advanced Heart Failure and Cardiac Transplant Unit, The Prince Charles Hospital, Brisbane, Queensland, 6Department of Medicine, Advanced Heart Failure and Cardiac Transplant Unit, The Prince Charles Hospital, Brisbane, Queensland, 7Physiotherapy, Bond University, Gold Coast, Queensland, Australia, 8Physiotherapy, Queensland Lung Transplant Service, The Prince Charles Hospital, Brisbane

Aims: To determine predictors of early recovery in post-heart transplant exercise capacity as measured by the six minute walk distance (6MWD).

Methods: All isolated heart transplant recipients at a single institution between 2012 and 2016 were considered for inclusion. Demographics; pre-transplant maximal oxygen uptake, cardiac index and transpulmonary gradient; and post-transplant mechanical ventilation time and intensive care duration were recorded. Left ventricular ejection fraction, 6MWD and quadriceps strength corrected for body weight (QS%) were recorded pre- and post-transplant, with the initial post-transplant 6MWD (3 weeks post-transplant) used to assess early changes in exercise capacity.

Results: Forty-eight participants of mean (±SD) age 44.5 ± 16.0 years were studied. There was no significant difference between 6MWD pre-transplant (382.4 ± 111.9m) and initial 6MWD post-transplant (367.9 ± 114.3m; p=0.617). On multivariate analysis, independent predictors of 6MWD post-transplant were QS% (β=2.658; 95% CI 1.799 to 3.518, p<0.001), and days spent in intensive care (β=−2.755; 95% CI −5.257 to −0.253, p=0.032). The only identified independent predictor of the change in 6MWD (post-transplant compared to pre-transplant) was QS% (r2=0.56, p<0.001).
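
A multivariable linear model of this form can be sketched with statsmodels; the data below are invented and the variable names are illustrative:

```python
import pandas as pd
import statsmodels.api as sm

# Invented data: quadriceps strength (% body weight), ICU days, 6MWD (m)
df = pd.DataFrame({
    "qs_pct":   [45, 60, 38, 52, 70, 41, 66, 55],
    "icu_days": [3, 2, 10, 5, 1, 12, 2, 4],
    "six_mwd":  [350, 420, 260, 380, 470, 240, 450, 390],
})

X = sm.add_constant(df[["qs_pct", "icu_days"]])
model = sm.OLS(df["six_mwd"], X).fit()

# Coefficients correspond to the reported beta values, with 95% CIs
print(model.params)
print(model.conf_int(alpha=0.05))
```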

Conclusions: Limitations in exercise capacity post-heart transplantation were largely explained by reduced quadriceps strength and prolonged intensive care duration. Our findings suggest that further controlled trials are needed to better understand the influence of quadriceps strength on recovery of exercise capacity post-transplantation.

Back to Top | Article Outline

LONGER ANASTOMOTIC TIME LEADS TO DELAYED GRAFT FUNCTION

HEER Munish1,2, TREVILLIAN Paul3, MAHAJAN Nikhil4, and HIBBERD Adrian1

1Department of Surgery, John Hunter Hospital, Newcastle, 2Envoi Pathology, Brisbane, 3Renal & Transplantation Unit, John Hunter Hospital, Newcastle, 4Renal Transplant Unit, John Hunter Hospital, Newcastle

Introduction: Delayed graft function (DGF) occurs at variable rates in renal transplant recipients and has multiple contributing factors. The impact of anastomotic time (AT) on DGF remains unclear.

Aim: To analyse the incidence of DGF at our centre and the impact of AT on its development.

Method: All renal transplant recipients at our centre from 2006 were included. Data were gathered from electronic health records. Patients were divided into two groups based on the presence or absence of DGF, defined as the requirement for dialysis in the first week post-transplant. Non-parametric tests were used to compare the AT of the two groups. The relationship was determined by binary logistic regression adjusting for other confounders: recipient age and sex, donor age, cold ischaemia time and multiple arteries.

Results: 186 deceased donor renal transplant recipients were included; 62.4% were male. Median recipient age was 54.29 years (IQR 43.64, 63.42) and median donor age was 54.72 years (IQR 43.06, 64.81). 58/186 (31.2%) of deceased donor recipients experienced DGF. Median AT was 43 min (IQR 36, 50). Median cold ischaemia time was 840 min (IQR 69, 990). There was a significant difference between the AT of the two groups (p=0.002). No difference was found in the cold ischaemia times of the two groups. On binary logistic regression, AT was independently associated with DGF (OR 1.04 per minute, 95% CI 1.012-1.072, p=0.006).
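
A hedged sketch of the logistic regression reported here (Python, statsmodels, with invented data and, for brevity, AT as the only covariate rather than the full confounder set):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Invented recipients: anastomotic time (min) and DGF flag
df = pd.DataFrame({
    "at_min": [30, 35, 38, 42, 45, 48, 52, 55, 60, 65, 40, 50],
    "dgf":    [0,  0,  0,  0,  1,  0,  1,  1,  1,  1,  0,  0],
})

X = sm.add_constant(df[["at_min"]])
fit = sm.Logit(df["dgf"], X).fit(disp=0)

# Odds ratio per additional minute of anastomotic time, with 95% CI
or_per_min = np.exp(fit.params["at_min"])
ci = np.exp(fit.conf_int().loc["at_min"])
print(f"OR {or_per_min:.2f} per minute (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```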

Conclusion: Anastomotic time had a significant impact on the development of DGF and should therefore be minimised.

Back to Top | Article Outline

HIGH PROTEIN SUPPLEMENTATION EARLY POST LUNG TRANSPLANT IS FEASIBLE AND MAY BE ASSOCIATED WITH MORE PRESERVED QUADRICEPS STRENGTH

HICKLING Donna F1, FIENE Andreas1, CHAMBERS Daniel C1,2, HOPKINS Peter MA1,2, YERKOVICH Stephanie T1, and WALSH James R1,3

1Lung Transplant Service, Prince Charles Hospital, Brisbane, 2School of Medicine, University of Queensland, Brisbane, 3School of Allied Health Sciences, Griffith University

Aims: Improved nutritional status post-lung transplant (LT) has been associated with better outcomes. This study aimed to determine the feasibility and tolerability of high protein (HP) supplements and the impact on exercise capacity recovery post-LT.

Methods: Prospective randomised controlled trial comparing routine post-operative care with an intervention of twice-daily HP supplements delivered <60 minutes post-exercise (providing 18g protein, 1250kJ per supplement) for six weeks post-LT. Patients were randomised at LT and stratified for Cystic Fibrosis (CF). Feasibility was assessed after 12 months of recruitment by reviewing change in weight and quadriceps strength (QS) at six weeks. Patient-reported palatability was assessed using a scale of 1–10 (very poor to exceptional), along with adherence (percentage of prescribed intake). Results are reported as median (IQR).

Results: 23 eligible patients were recruited (10 CF, 13 non-CF); 4 patients were withdrawn due to medical complications, leaving 10 randomised to intervention and 9 to standard care. The median participant-rated palatability score was 7 (range 3–8), with 90% consuming >75% of prescribed supplements and no adverse outcomes attributable to the intervention. At six weeks, QS losses were smaller but not significantly different (P=0.289) in the intervention arm (−12.1%; IQR −21.15 to −1.15%) than in the non-intervention arm (−22.4%; IQR −27.5 to −18.0%), and weight change did not differ significantly (P=0.902) between intervention (−1.85 kg; IQR −2.35 to −0.98) and non-intervention (−1.52 kg; IQR −4 to 1.80).

Conclusion: Preliminary data showed satisfactory recruitment and a well-tolerated intervention with good adherence, and suggested that HP supplementation may be associated with better-preserved QS at six weeks. A suitably powered randomised controlled trial to assess the efficacy of HP supplementation in aiding recovery post-LT is feasible.

Back to Top | Article Outline

LIVING KIDNEY DONOR OUTCOMES REPORTED IN RANDOMISED TRIALS AND OBSERVATIONAL STUDIES

HANSON Camilla1,2, SAUTENET Benedicte2,3, CRAIG Jonathan1,2, CHAPMAN Jeremy4, KNOLL Greg5,6, REESE Peter7,8, and TONG Allison1,2

1School of Public Health, University of Sydney, 2Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 3Department of Nephrology and Clinical Immunology, Tours Hospital, 4Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 5Clinical Epidemiology, Ottawa Hospital Research Institute, 6Renal Transplantation, The Ottawa Hospital, 7Renal-Electrolyte and Hypertension Division, Perelman School of Medicine, University of Pennsylvania, 8Center for Clinical Epidemiology and Biostatistics, Perelman School of Medicine, University of Pennsylvania

Aims: Evidence on the outcomes for living kidney donors is needed to guide screening and assessment, decision-making, and self-management, but the extent to which patient-relevant outcomes are consistently reported is uncertain. We aimed to determine the characteristics and heterogeneity of outcomes reported in studies in adult living kidney donors.

Methods: Databases were searched from January 2011 to December 2015. Randomised trials and observational studies assessing outcomes of living kidney donation were included. All outcome domains and measurements were extracted, and the frequency and characteristics of the outcome domains and measures were evaluated.

Results: From 199 studies, 183 (93%) were observational and 99 (50%) followed donors for a maximum of 12 months. Overall, 102 outcome domains were reported, with a median of 9 per study (interquartile range [IQR] 6 to 14); 43 (42%) were clinical, 32 (31%) surrogate and 27 (26%) patient-reported. The five most commonly reported domains were: kidney function (109 [55%]), time to discharge (79 [40%]), blood loss/transfusion (62 [31%]), operative time (62 [31%]) and pain (56 [28%]). Quality of life (15%), mortality (15%), diabetes (8%), cardiovascular disease (7%), and end-stage kidney disease (6%) were reported infrequently. Kidney function and pain had 107 and 80 different outcome measures, respectively.

Conclusions: Outcomes were very heterogeneous and frequently focused on kidney function and surgical complications. Longer-term and donor-reported outcomes, including mortality, diabetes, cardiovascular disease, quality of life, psychological impact, and time to recovery, were infrequently reported. Consistent reporting of outcomes that are important, particularly to donors and clinicians, will increase evidence-informed decision-making.

Back to Top | Article Outline

COMPARATIVE SURVIVAL BENEFITS OF TRANSPLANTATION AND DIALYSIS IN PATIENTS WITH AND WITHOUT DIABETES MELLITUS

SHINGDE Rashmi1, CALISA Vaishnavi1, CRAIG Jonathan2, CHAPMAN Jeremy3, and WONG Germaine2,3

1Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 2School of Public Health, University of Sydney, 3Centre for Transplant and Renal Research, Westmead Hospital, Sydney

Aim: To compare survival outcomes among patients with pre-existing diabetes mellitus (type 1 and type 2) who remained listed on the transplant waiting list with those who received a kidney transplant.

Methods: We compared short- and long-term overall survival among listed dialysis patients and transplant recipients, stratified by diabetic status, and used adjusted Cox proportional hazards regression modelling to determine the risk factors for death after transplantation.

Results: Over a median follow-up period of 8 years, 3437 (64.6%) and 187 (3.5%) recipients received a kidney-alone and kidney-pancreas transplant respectively, and 1647 (30.9%) remained listed on dialysis. Of those with type 1 diabetes, 181 (73.9%) patients received a simultaneous kidney-pancreas transplant. The one-, five- and eight-year survival among transplant recipients without diabetes was 98.13%, 92.98% and 88.66% respectively. The respective survival rates for recipients with pre-existing type 1 and type 2 diabetes mellitus were 96.52%, 87.57%, 82.15%; and 95.07%, 78.68%, 69.08%. Amongst listed dialysis patients, the respective one-, five- and eight-year survival for non-diabetic patients was 96.14%, 77.47%, 61.92%, followed by 92.06%, 40.66%, 27.41% for patients with type 1 diabetes and 95.63%, 57.55% and 40.05% for those with type 2 diabetes (Figure 1). The strongest predictor of death among type 1 diabetic transplant recipients was co-existing cardiovascular disease [adjusted HR: 6.2 (95%CI 2.0-19.4, p = 0.002)], and among type 2 diabetic recipients, chronic lung disease [adjusted HR: 2.4 (95%CI: 1.0-5.6, p = 0.045)].

Conclusion: Incremental survival gains post-transplantation are substantial for patients with end-stage kidney disease, with the greatest survival benefits in patients with type 1 diabetes mellitus.

FIGURE 1

Back to Top | Article Outline

OUTCOMES OF RENAL TRANSPLANTATION IN QUEENSLAND CHILDREN 1970–2015

LALJI R1,2, FRANCIS A2, and BURKE J2

1Department of Renal Medicine, Great Ormond Street Hospital for Children, 2Department of Renal Medicine, Lady Cilento Children's Hospital

Aims: To report the outcomes of renal transplantation in Queensland children over time.

Methods: Data were collected on all transplant recipients in a single paediatric nephrology centre (1970–2015). Graft and patient survival were calculated using Kaplan-Meier analysis.
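
For illustration, Kaplan-Meier estimates at the reported horizons can be obtained as follows (Python, lifelines; the durations below are invented):

```python
from lifelines import KaplanMeierFitter

# Invented graft survival times (years) and failure indicators
durations = [2.0, 5.5, 12.0, 21.0, 7.3, 15.0, 3.8, 25.0]
graft_lost = [1, 0, 1, 0, 1, 1, 0, 1]

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=graft_lost)

# Point estimates at the horizons reported in the abstract
print(kmf.survival_function_at_times([5, 10, 20]))
```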

Results: In total, 179 children received 188 kidney transplants; 71 (37.7%) were living-related donations and 3 were combined kidney/liver transplants. Median age at transplantation was 12 years (interquartile range 7–15). 81 patients (45.2%) had CAKUT or reflux nephropathy as the primary cause of ESRF. There was a marked increase in the number of transplants over time, with 58.5% of all transplants, and 72.4% of transplants in infants and young children (0–5 years), performed from the year 2000 onwards. Nine children required a second transplant before transfer to an adult unit, 3 (33%) due to early surgical complications and 2 (22%) due to acute rejection early in the transplant program. Overall patient survival at 5, 10, and 20 years was 97% (95% CI +/− 2.6), 93.6% (95% CI +/− 4.1) and 88.5% (95% CI +/− 6.3). Graft survival at 5, 10 and 20 years was 89.1% (95% CI +/− 4.7), 74.1% (95% CI +/− 7.6) and 38.2% (95% CI +/− 10.9). Chronic rejection was the predominant reason for graft loss (47 patients, 66.2%), but notably 1 patient lost their graft and life to JC virus.

Conclusions: Transplant rates in children have increased over time, with overall patient and graft survival comparable to large international centres.

FIGURE 1

Back to Top | Article Outline

PRE- AND POST-DONATION KIDNEY FUNCTION IN LIVE DONORS IN THE AUSTRALIAN PAIRED KIDNEY EXCHANGE (AKX) PROGRAM

CHEN Jenny, LUXTON Grant, WOODROFFE Claudia, and FERRARI Paolo

Department of Nephrology, Prince of Wales Hospital, Sydney

Aims: Baseline pre-donation eGFR appears to predict the risk of post-donation chronic kidney disease. CARI guidelines recommend not accepting kidneys from donors with a GFR <80ml/min/1.73m2. In the AKX program, all donors with a raw nuclear GFR (nGFR) >80ml/min are deemed suitable for donation; however, the significance of this selection indicator is unknown.

Methods: We analysed pre- and post-donation data of 129 live donors in the AKX program with at least 1 year follow-up linking records in the AKX database and ANZDATA.

Results: There were 73 male and 56 female donors; mean (±SD) age was 53±11 years, BMI 26.6±3.4kg/m2 and blood pressure 123/75±11/7mmHg. Twenty-three donors had a BMI >30 kg/m2, 22 donors had controlled hypertension and 1 donor had impaired glucose tolerance.

Pre-donation serum creatinine was 73±13μmol/l, eGFR (by CKD-EPI) 90±15ml/min/1.73m2, raw eGFR (not corrected for BSA) 99±17ml/min and nGFR 108±17ml/min. Thirty-one donors had a baseline eGFR <80ml/min/1.73m2, but of these 16 had a raw eGFR ≥80ml/min. At 1 year post-donation, serum creatinine was 101±21μmol/l and eGFR 63±13ml/min/1.73m2; 16 donors had a creatinine >130μmol/l and 55 had an eGFR <60ml/min/1.73m2, including 29 donors with baseline eGFR <80ml/min/1.73m2 (post-donation range 42–59). The percentage change in GFR at 1 year was −30±9% and did not depend on baseline eGFR, nGFR, gender, or BMI, but was related to age (P<0.05) in multivariate analysis. Projected post-donation eGFR was 76.6% of the pre-donation value for a 20-year-old donor and 66.1% for a 70-year-old donor.

Conclusions: Approximately 25% of AKX donors would have been excluded from live kidney donation using CKD-EPI eGFR. Post-donation eGFR at 1 year is approximately 70% of the pre-donation value and appears to reduce by 2.1% for each decade above the age of 30 years, indicating reduced renal reserve capacity with increasing donor age. Long-term outcomes of AKX donors with low eGFR will need careful monitoring.
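
Assuming a simple linear interpolation between the reported anchors (76.6% of pre-donation eGFR at age 20 and 66.1% at age 70, i.e. roughly 2.1 percentage points per decade), the projection can be expressed as a one-line function; this is a reading of the reported figures, not the authors' fitted model:

```python
def projected_egfr_percent(age_years: float) -> float:
    """Expected 1-year post-donation eGFR as a percentage of the
    pre-donation value, interpolated linearly between the reported
    anchors: 76.6% at age 20 and 66.1% at age 70 (2.1 points/decade)."""
    return 76.6 - 2.1 * (age_years - 20) / 10.0

print(projected_egfr_percent(20))  # 76.6
print(projected_egfr_percent(70))  # 66.1
```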

Back to Top | Article Outline

IDENTIFYING CONSENSUS-BASED PRIORITY OUTCOME DOMAINS FOR TRIALS IN KIDNEY TRANSPLANTATION: A MULTINATIONAL DELPHI SURVEY WITH PATIENTS, CAREGIVERS, AND HEALTH PROFESSIONALS

TONG Allison1,2, SAUTENET Benedicte2,1, MANERA Karine2, CHAPMAN Jeremy3, WARRENS Anthony4, ROSENBLOOM David5, WONG Germaine3, GILL John6, BUDDE Klemens7, ROSTAING Lionel8, MARSON Lorna9, JOSEPHSON Michelle10, REESE Peter11, PRUETT Timothy12, HANSON Camilla2, O'DONOGHUE Donal13, TAM-THAM Helen14, HALIMI Jean-Michel15, SHEN Jenny16, KANELLIS John17, SCANDLING John18, HOWARD Kirsten2, HOWELL Martin2, CROSS Nick19, EVANGELIDIS Nicole2, MASSON Philip20, OBERBAUER Rainer21, FUNG Samuel22, JESUDASON Shilpa23, KNIGHT Simon24, MANDAYAM Sreedhar25, MCDONALD Stephen23, CHADBAN Steven26, RAJAN Tasleem27, and CRAIG Jonathan2

1Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 2School of Public Health, University of Sydney, 3Centre for Transplant and Renal Research, Westmead Hospital, 4School of Medicine and Dentistry, Queen Mary University of London, 5ESRD Network 18, 6Division of Nephrology, University of British Columbia, 7Department of Nephrology, Charité - Universitätsmedizin Berlin, 8Department of Nephrology, Dialysis and Organ Transplantation, Centre Hospitalier Universitaire Rangueil, 9Transplant Unit, University of Edinburgh, 10Department of Medicine, The University of Chicago, Chicago, 11Renal Division, University of Pennsylvania Perelman School of Medicine, 12Department of Surgery, University of Minnesota, 13Department of Renal Medicine, Salford Royal NHS Foundation Trust, 14Departments of Medicine and Community Health Sciences, Libin Cardiovascular Institute and O’Brien Institute of Public Health, University of Calgary, 15Department of Nephrology and Clinical Immunology, Tours Hospital, 16Department of Nephrology, Los Angeles Biomedical Research Institute at Harbor-UCLA Medical Center, 17Department of Nephrology, Monash Medical Centre, Melbourne, 18Department of Medicine, Stanford University School of Medicine, 19, 20Department of Renal Medicine, Royal Infirmary of Edinburgh, 21Department of Internal Medicine, Division of Nephrology, University of Vienna, 22Jockey Club Nephrology & Urology Centre, Princess Margaret Hospital, 23Central and Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 24Centre for Evidence in Transplantation, Nuffield Department of Surgical Sciences, University of Oxford, 25Selzman Institute for Kidney Health, Section of Nephrology, Baylor College of Medicine, 26Department of Nephrology, Royal Prince Alfred Hospital, Sydney, 27Department of Community Health Sciences, University of Calgary

Aims: To generate a consensus-based set of core outcome domains in kidney transplantation based on the shared priorities of patients/caregivers and health professionals.

Methods: In a 3-round Delphi survey, patients/caregivers and health professionals rated the importance of outcome domains for kidney transplantation trials on a 9-point Likert scale and provided free-text comments. During Rounds 2 and 3, participants re-rated the outcomes after reviewing their own score, the distribution of the respondents’ scores, and all comments. For each outcome, the median, mean, and proportion rating 7–9 (critically important) were calculated.

Results: 1018 participants (461 [45%] patients/caregivers and 557 [55%] health professionals) from 79 countries completed Round 1, and 779 (77%) completed Round 3. The top eight prioritised outcomes that met the consensus criteria in Round 3 (defined as mean ≥7.5, median ≥8 and proportion rating 7–9 >85%) in both stakeholder groups were graft loss, graft function, chronic graft rejection, acute graft rejection, mortality, infection, cancer (excluding skin) and cardiovascular disease (Figure 1). Compared with health professionals, patients/caregivers gave higher priority to six outcomes (mean difference of 0.5 or more): skin cancer, surgical complications, cognition, blood pressure, depression, and ability to work.
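
The Round 3 consensus rule stated above is mechanical enough to express directly; a small sketch in Python (the example rating vectors are invented):

```python
import numpy as np

def meets_consensus(ratings) -> bool:
    """Round-3 consensus rule from the abstract: mean >= 7.5,
    median >= 8, and >85% of ratings in the critically important
    band (7-9 on the 9-point scale)."""
    r = np.asarray(ratings, dtype=float)
    return bool(r.mean() >= 7.5
                and np.median(r) >= 8
                and (r >= 7).mean() > 0.85)

print(meets_consensus([9, 8, 8, 9, 7, 8, 9, 8]))  # True
print(meets_consensus([9, 8, 5, 9, 7, 6, 9, 8]))  # False (only 75% rate 7-9)
```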

Conclusions: Graft complications and severe comorbidities were consistently highly prioritised by both stakeholder groups. Psychosocial outcomes were rated as relatively less important, perhaps because they are implicit in graft and clinical complications. The consensus-based priority outcomes will inform the development of a core outcome set to improve the consistency and relevance of outcomes reported in trials in kidney transplantation.

FIGURE 1

Back to Top | Article Outline

THE COMORBIDITY BURDEN OF POTENTIAL, INTENDED AND ACTUAL DECEASED LIVER DONORS IN NSW

HORN Ryan1, WEBSTER Angela1,2, KELLY Patrick1, O'LEARY Michael3,4, HEDLEY James1, ROSALES Brenda1, SHACKEL Nicholas5,6, THOMSON Imogen7, and WYBURN Kate7,8

1School of Public Health, University of Sydney, 2Centre for Kidney Research, Westmead Hospital, Sydney, 3NSW Organ and Tissue Donation Service, 4Intensive Care Service, Royal Prince Alfred Hospital, Sydney, 5Department of Medicine, University of New South Wales, Sydney, 6Department of Gastroenterology, Ingham Institute and Liverpool Hospital, 7Sydney Medical School, University of Sydney, 8Renal Unit, Royal Prince Alfred Hospital, Sydney

Aims: The increasing age of organ donors in Australia likely corresponds with an increased comorbidity burden. We aimed to describe the burden of comorbidities in the potential, intended and actual deceased liver donor populations in NSW.

Methods: The NSW Organ and Tissue Donation Service referral logs from January 2014 to December 2015 were reviewed. We examined all referrals and identified comorbidities including cancer, infection, cardiovascular disease, respiratory disease, diabetes, hypertension, hyperlipidaemia, dementia, peptic ulcer disease, connective tissue disease, chronic kidney disease, chronic liver disease, congenital heart disease and high-risk behaviours. Comorbidity burden was compared between donor groups by number of comorbidities and by Charlson index.
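
For readers unfamiliar with the Charlson index, it is a weighted sum over recorded conditions; a minimal sketch with an illustrative subset of the standard weights (not necessarily the study's exact coding) is:

```python
# Illustrative subset of Charlson weights; the full index covers
# roughly 17-19 conditions with weights of 1, 2, 3 or 6
CHARLSON_WEIGHTS = {
    "myocardial_infarction": 1,
    "congestive_heart_failure": 1,
    "peripheral_vascular_disease": 1,
    "dementia": 1,
    "chronic_pulmonary_disease": 1,
    "peptic_ulcer_disease": 1,
    "diabetes": 1,
    "moderate_severe_renal_disease": 2,
    "any_malignancy": 2,
    "moderate_severe_liver_disease": 3,
    "metastatic_solid_tumour": 6,
}

def charlson_index(conditions: set[str]) -> int:
    """Sum the weights of a referral's recorded comorbidities."""
    return sum(CHARLSON_WEIGHTS.get(c, 0) for c in conditions)

print(charlson_index({"diabetes", "peptic_ulcer_disease",
                      "moderate_severe_renal_disease"}))  # 4
```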

Results: Of the 1,257 referrals examined, 714 were potential donors, 428 were deemed intended donors and 115 became actual liver donors. Of the actual liver donor population, 61 (53%) had five or more comorbidities, compared to 94 (22%) of the intended and 131 (18%) of the potential liver donor populations. Among actual donors only 1 (1%) had a Charlson index of five or more, compared to 15 (4%) of the intended and 67 (9%) of the potential donor populations.

Conclusions: In NSW, liver donation did not appear to be negatively influenced by the number of donor comorbidities in the period reviewed. Interestingly, actual donors were less likely to have a higher Charlson index, suggesting that some comorbidities have a more significant impact than others. Further assessment of the impact of comorbidities on the progression to actual donation and transplant outcomes may inform future donor selection criteria.

TABLE 1

Back to Top | Article Outline

DECEASED DONOR RENAL TRANSPLANTATION IN A PATIENT WITH A NOVEL COMPLEMENT FACTOR H MUTATION AND BACKGROUND ATYPICAL HAEMOLYTIC URAEMIC SYNDROME (AHUS)

SINGER Julian1, ROXBURGH Sarah1,2, WARD Christoper3,2, COOPER Bruce1,2, and MCGINN Stella1,4

1Department of Renal Medicine, Royal North Shore Hospital, Sydney, 2Department of Medicine, University of Sydney, 3Department of Haematology, Royal North Shore Hospital, Sydney, 4Renal Transplant Unit, Australian Paired Kidney Exchange Programme

Background: In 2012, a 41-year-old woman developed acute kidney injury and microangiopathic haemolytic anaemia following sinusitis. aHUS was confirmed by thrombotic microangiopathy on renal biopsy, a normal serum ADAMTS13, and the absence of Shiga-toxin-producing E. coli. Despite treatment with plasma exchange, corticosteroids, and eculizumab she required ongoing dialysis. A novel mutation in the complement factor H gene (1106G>A) was later identified. Patients diagnosed with factor H mutation aHUS have an increased risk of renal allograft loss compared to other mutations. Hence prophylactic eculizumab has been advocated for transplantation, but access to this in Australia is difficult.

Case Report: In January 2017 the patient received a deceased donor renal transplant with a 1/6 HLA mismatch (DR1) and a single DSA (DQ7, MFI of 2190). ATG induction was commenced along with tacrolimus, mycophenolate and prednisone. Access to perioperative eculizumab was denied through the Pharmaceutical Benefits Scheme. Serial haemolytic screens remained negative. In the presence of delayed graft function and ongoing risk of aHUS, a renal biopsy (day 7) was performed and demonstrated acute tubular injury with no evidence of thrombotic microangiopathy or rejection. The patient’s urine output continues to improve and no eculizumab has been required.

Conclusion: This case adds to the evolving knowledge concerning renal transplant outcomes in patients with aHUS. Whilst eculizumab remains a treatment option, its role and the timing of initiation in transplantation are yet to be clearly defined.

Back to Top | Article Outline

Cells (Including Islets) and Xenotransplantation

IMPACT OF MESENCHYMAL STEM CELLS ON AN OVINE MODEL OF KIDNEY TRANSPLANTATION

LETT Bron1,2, SIVINATHAN Kisha3, JOHNSTON Julie3, AUCLAIR Dyan4, RUSSELL Christine H4, OLAKKENGIL Santosh4, PERUMAL Raj5, DROGEMULLER Chris4,6, and COATES Patrick T6,7,8

1Discipline of Medicine, University of Adelaide, 2Centre for Clinical and Experimental Transplantation, Royal Adelaide Hospital, 3Centre for Clinical and Experimental Transplantation, Royal Adelaide Hospital, 4Renal Department, Royal Adelaide Hospital, 5Large Animal Research and Imaging Facility, 6Centre for Clinical and Experimental Transplantation, Royal Adelaide Hospital, 7School of Medicine, University of Adelaide, 8Renal Department, Royal Adelaide Hospital

Aims: The aims of this study were to examine the migration pattern of mesenchymal stem cells (MSCs) in an ovine model of kidney autotransplantation. This information was then used to guide the application, and examine the impact, of MSCs in an ovine model of kidney allotransplantation.

Methods: Sheep underwent heterotopic autotransplantation of the left kidney into the neck, anastomosing to the jugular vein and carotid artery. Autologous MSCs were labelled with iron nanoparticles and given either directly into the graft or systemically via a central venous line. These cells were then tracked using MRI and histology.

Further sheep then underwent kidney allotransplantation, with some receiving MSCs directly into the graft. These grafts were then followed for two weeks, monitoring serum creatinine and urea.

Results: Comparing MRI images after local versus systemic injection of MSCs, it was apparent that significantly more cells were retained in the graft when given locally.

However, when given directly into the kidney graft at a dose of 1x10^6 cells/kg, MSCs did not have an impact on serum creatinine or urea levels (Figure).

Conclusions: When dosing with MSCs in solid organ transplantation, direct injection of the cells provides much greater recruitment to the graft site than relying on the migratory abilities of the MSCs. However, even when given directly into the graft at a dose of 1x10^6 cells/kg, the MSCs did not protect against rejection.

Figure

Back to Top | Article Outline

SUCCESSFUL AUTO ISLET TRANSPLANTATION FOLLOWING TOTAL PANCREATECTOMY FOR CHRONIC PANCREATITIS: THE WESTMEAD EXPERIENCE

HAWTHORNE Wayne J1,2,3, CHEW Yi Vee3, WILLIAMS Lindy3, BURNS Heather3, HOLMES-WALKER Jane4, ROGERS Natasha3, O'CONNELL Phil3,5, and PLEASS Henry1,2,5

1Western Clinical School, University of Sydney, 2Department of Surgery, Westmead Hospital, Sydney, 3Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 4Department of Endocrine and Metabolism, Westmead Hospital, Sydney, 5National Pancreas Transplant Unit, Westmead Hospital, Sydney

Aims: Chronic pancreatitis is a debilitating disease in which patients suffer severe abdominal pain that is impossible to relieve through conventional methods such as analgesia. Chronic inflammation results in fibrosis and loss of pancreatic function. Total pancreatectomy remains the only treatment capable of complete abrogation of pain, despite the technical difficulty and resultant exocrine and endocrine insufficiency (including diabetes). By combining total pancreatectomy with auto islet transplantation, it is possible to preserve glycaemic control in these patients.

Methods: Three patients underwent surgical removal of the pancreas, which was digested using collagenase and neutral protease (SERVA). Islet yield was quantified and microbial analysis undertaken before infusion into the patient via percutaneous portal venous infusion. Patients receiving the autologous transplant were monitored after surgery for pain levels, C-peptide production and exogenous insulin requirement to determine whether pain relief was successful and islet function had been preserved.

Results: Total pancreatectomy was performed on three patients between 2010 and 2016. Pancreata were severely fibrotic, but 74.9-85.3% digestion of the fibrotic pancreas was achieved. Final packed cell volumes were 3-4ml, with purity of 60-95% and viability of 90-100%. All three patients were transplanted, receiving on average 4,016 IEQ/kg body weight. All patients had significant C-peptide production (mean = 0.43 +/− 0.12 nmol/L) following islet cell engraftment and resolution of their significant pancreatitis pain.

Conclusions: In these three cases, total pancreatectomy followed by islet autotransplantation was capable of relieving the pain of chronic pancreatitis. However, severity of pancreatic fibrosis and previous surgical interventions were found to influence islet isolation yield and therefore the functional outcome of islet autotransplantation.

Back to Top | Article Outline

DECONTAMINATION REDUCES MICROBIOLOGICAL CONTAMINATION OF PANCREATA UTILISED FOR ISLET CELL TRANSPLANTATION

SHAHRESTANI Sara1,2, GOIRE Namraj1, WILLIAMS Lindy3, CHEW Yi Vee3, DAVIES Sussan4, ROBERTSON Paul5, O'CONNELL Phil2, PLEASS Henry6, and HAWTHORNE Wayne2,6

1School of Medicine, University of Sydney, 2Centre for Transplant and Renal Research, The Westmead Institute, 3Centre for Transplant and Renal Research, 4School of Medicine, The Westmead Institute, 5Renal Transplant Unit, Westmead Hospital, Sydney, 6Department of Surgery, Westmead Hospital, Sydney

Aim: Bacterial and fungal contamination of pancreata retrieved for islet cell transplantation poses a barrier to successful culture and subsequent transplantation. The aims of the present study were to identify the potential bacterial and fungal contaminants of pancreata retrieved for islet cell isolation, culture and transplantation, and to assess the efficacy of implementing decontamination.

Methods: We performed a retrospective review of all islet cell isolations conducted at Westmead Hospital from July 1997 to June 2016. We compared standard microbiological culture to BACTEC methods and identified organisms based on contamination sources (i.e. skin flora, enteric, respiratory/oral or environmental). We examined rates of positive culture after the decontamination procedure and looked at the factors that contributed.

Results: 248 cultures of donor organ perfusion media were performed, of which 172 utilised standard culture and 103 BACTEC. BACTEC culture methods demonstrated superior sensitivity (84%) compared to standard culture (14.5%) in detecting contamination. Among BACTEC-positive cultures, the majority of contaminants originated from skin flora (62%), followed by enteric sources (24%). We achieved a significant reduction in positive cultures following decontamination, with only 9% of cultures remaining positive, all from excessively fatty pancreata. Of those that were not decontaminated, 28% remained positive after islet processing.

Conclusions: BACTEC culture methods are superior for detecting contamination in organ retrieval for transplantation. Decontamination provides an effective means of reducing the burden of contamination of organs. Factors contributing to resistance to decontamination include excessively heavy fat coverage of the pancreas.

Back to Top | Article Outline

STANDARDIZING WHOLE-BLOOD IMMUNOPHENOTYPING PANELS ON FLOW CYTOMETRY FOR TRANSPLANT PATIENTS AND CLINICAL TRIALS AT WESTMEAD

JIMENEZ VERA Elvira1, CHEW Yi Vee1, BURNS Heather1, ANDERSON Patricia1, WILLIAMS Lindy1, DERVISH Suat2, WANG Xin Maggie2, YI Shounan1, HAWTHORNE Wayne1, ALEXANDER Stephen3, O'CONNELL Philip1, and HU Min1

1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Flow Cytometry Core Facility, Westmead Millennium Institute, Westmead Hospital, Sydney, 3Centre for Kidney Research, The Children's Hospital at Westmead, Sydney

Aim: 1) To establish whole-blood immunophenotyping analysis for transplant patients and clinical trials; 2) to standardise reagents, sample handling, instrument setup and data analysis.

Methods: An absolute cell count (TruCount) panel and seven leukocyte-profiling panels, each containing 8–10 marker antigens (37 unique), were used for monitoring immune profiles consisting of Tregs, NKT, B, NK, DCs, and monocyte subsets. Samples were acquired on a BD LSRFortessa and FlowJo was used for data analysis. BD™ Cytometer Setup and Tracking beads monitored cytometer performance. 100–300 μl of whole blood was used for each panel.

Results: We titrated antibodies and calculated the staining index (SI) (CD45BUV395 in Fig.1A) for panel optimisation. Application settings on the BD LSRFortessa, measurement of the spillover spreading matrix (SSM) for each panel [SSM in Panel 3 (Tab.1)], and antibody quantities for each panel cocktail were established. Auto-analysis templates and gating strategies (Panel 1 in Fig.1B) targeting subsets of immune cells were set up, and blood sample collection, antibody cocktails, and staining protocols were standardised. Staining was performed within 2 hours of blood sample collection. Two paediatric kidney transplant patients, one islet transplant patient (4 time-points), 7 T1D patients and 8 control samples have been evaluated. Consistent immune subsets were identified across all panels over 6 months (Fig.1C, TruCount panel), and this was used to longitudinally track subset proportions in transplant patients (Fig.1D, Panel 1).
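
For context, antibody titration of this kind typically picks the concentration maximising a staining index; one common definition (an assumption here, as the abstract does not give its formula) is sketched below:

```python
import numpy as np

def staining_index(stained, unstained) -> float:
    """One common staining-index definition used for antibody
    titration: (median stained - median unstained) divided by
    twice the spread (SD) of the unstained population."""
    return ((np.median(stained) - np.median(unstained))
            / (2 * np.std(unstained)))

rng = np.random.default_rng(0)
pos = rng.normal(5000, 600, 1000)   # synthetic stained-population intensities
neg = rng.normal(200, 80, 1000)     # synthetic unstained population
print(f"SI = {staining_index(pos, neg):.1f}")
```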

Conclusion: We standardised immune panels and procedures for measuring absolute cell numbers and multiple immune cell subsets for monitoring transplant patients, including paediatric patients, and for clinical trials.

Figure

TABLE 1

Back to Top | Article Outline

GENETICALLY MODIFIED PORCINE NEONATAL ISLET XENOGRAFTS FUNCTION LONG-TERM IN BABOONS

HAWTHORNE Wayne J1,2,3, HAWKES Joanne3, SALVARIS Evelyn4, BURNS Heather3, CHEW Yi Vee3, AYOUBI Ali3, BARLOW Helen4, BRADY Jamie5, LEW Andrew5, NOTTLE Mark6, O'CONNELL Phil3,1, and COWAN Peter J4

1Western Clinical School, University of Sydney, 2Department of Surgery, Westmead Hospital, Sydney, 3Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 4Immunology Research Centre, St Vincent's Hospital, Melbourne, 5Walter and Eliza Hall Institute of Medical Research, Melbourne, 6Centre for Reproductive Health, University of Adelaide

Aim: Few studies have achieved long-term function of xenotransplanted islet cells. We investigated the combined effects of transgenic expression of human complement regulatory proteins and deletion of αGal (GTKO) in protecting neonatal islet cell cluster (NICC) xenografts following intraportal transplantation in immunosuppressed baboons.

Method: 1–5-day-old GTKO piglets transgenic for human CD55, CD59 and H-transferase were used as donors. Recipient baboons received GTKO/CD55-CD59-HT NICC under costimulation blockade-based immunosuppression (anti-CD2, anti-CD154, belatacept, tacrolimus; n=4). Graft survival and function were followed by daily blood sugar levels, IVGTT, OGTT and graft immunohistochemical analysis to >600 days post-transplant.

Results: No GTKO/CD55-CD59-HT xenograft exhibited any sign of early thrombosis or infiltrate, nor any change from baseline in recipient platelet counts, fibrinogen or D-dimer levels. Liver biopsies from recipients under costimulation blockade-based immunosuppression showed no cellular infiltration, and NICC cells staining positive for insulin, glucagon and somatostatin were present in all GTKO/CD55-CD59-HT xenografts beyond 12 months. On the extended protocol, one animal has normal glucose handling out to >620 days post-transplant.

Conclusions: Deletion of αGal and expression of human CD55 and CD59 prevent early thrombotic destruction of porcine NICCs in the baboon model. Costimulation blockade-based immunosuppression is effective in prolonging the survival of genetically modified porcine NICC xenografts.

Back to Top | Article Outline

ENDOTHELIAL CELL PROTECTION USING CORLINE HEPARIN CONJUGATE (CHC) IN PIG-TO-HUMAN IN VITRO XENOTRANSPLANTATION MODELS

BONGONI Anjan K1, SALVARIS Evelyn1, KLYMIUK Nikolai2, WOLF Eckhard2, AYARES David3, MAGNUSSON Peetra U4, and COWAN Peter J1,5

1Immunology Research Centre, St Vincent's Hospital, Melbourne, 2Institute of Molecular Animal Breeding and Biotechnology, Ludwig-Maximilian University, Munich, Germany, 3Revivicor Inc., Blacksburg, VA, USA, 4Department of Immunology, Genetics and Pathology, Uppsala University, Sweden, 5Department of Medicine, University of Melbourne

Background: Corline Heparin Conjugate (CHC), a preassembled compound of multiple (>20) unfractionated heparin chains, coats cells with a glycocalyx-like layer and may therefore inhibit (xeno)transplant-associated activation of the plasma cascades and inflammation.

Aim: To investigate the use of CHC to protect pig aortic endothelial cells (PAEC) in pig-to-human in vitro xenotransplantation settings.

Method: PAEC from wild-type (WT) and genetically modified (GTKO.hCD46.hTBM) pigs were coated with CHC (100 μg/ml) and incubated with 10% normal human plasma for 4 hrs to test the effect of CHC on complement-induced endothelial damage. To investigate the effect of CHC on haemocompatibility, PAEC were grown on microcarrier beads, coated with CHC under starvation conditions, and incubated with non-anticoagulated whole human blood.

Results: Treatment of cells with plasma induced complement C3b/c and C5b-9 deposition and loss of heparan sulfate in WT and GTKO.hCD46.hTBM PAEC, although this was statistically significant only in the former. CHC coating of PAEC provided improved protection against plasma-induced complement deposition and loss of heparan sulfate. Genetically modified PAEC significantly prolonged clotting time of human blood (115.0±16.1 min, p<0.001) compared to WT PAEC (34.0±8.2min). Surface CHC significantly improved the human blood compatibility of both types of PAEC, as shown by increased clotting time (WT: 84.3±11.3 min, p<0.001; GTKO.hCD46.hTBM: 146.2±20.4 min, p<0.05) and reduced markers of platelet adhesion, complement activation (C5a and sC5b-9), coagulation activation (TAT complex) and inhibition of fibrinolysis (tPA/PAI-1).

Conclusion: Surface immobilization of CHC on PAEC substitutes for damaged glycocalyx and protects the cells against xenotransplantation-induced coagulation and inflammation in vitro.

Back to Top | Article Outline

DONOR P2RX7 GENOTYPE DOES NOT AFFECT THE DEVELOPMENT OF GRAFT-VERSUS-HOST DISEASE IN HUMANISED MICE

ADHIKARY Sam1,2,3, GERAGHTY Nicholas1,2,3, SLUYTER Ronald1,2,3, and WATSON Debbie1,2,3

1School of Biological Sciences, University of Wollongong, 2Centre for Medical and Molecular Bioscience, 3Illawarra Health and Medical Research Institute

The P2X7 receptor is important in the inflammatory immune response and is implicated in the development of graft-versus-host disease (GVHD). Single nucleotide polymorphisms (SNPs) in the P2RX7 gene can increase (gain-of-function; GOF) or decrease (loss-of-function; LOF) P2X7 activity on human leukocytes; however, their role in the development of GVHD is unknown.

Aim: To investigate the effect of P2RX7 genotype on human immune responses, and on the development of GVHD in a humanised mouse model.

Method: P2X7 activity was measured on human peripheral blood mononuclear cells (hPBMCs) using a flow cytometric ATP-induced cation uptake assay. Donor P2RX7 genotype was determined by genomic DNA sequencing. NOD-SCID-IL2Rγnull (NSG) mice were injected (i.p.) with 10x10^6 hPBMCs from either GOF or LOF P2RX7 genotype donors. Humanised mice were monitored for clinical signs of GVHD for 10 weeks, with hPBMC engraftment examined at 3 weeks post-injection (blood) and at end-point (spleen) by flow cytometry.

Results: In 23 human donors, seven known P2RX7 SNPs (including two GOF, three LOF and one neutral) were identified. Donor P2RX7 genotype correlated with P2X7 activity in CD3+, CD4+ and CD8+ T cells, and did not affect the engraftment of hPBMCs (predominantly T cells) in NSG mice at 3 weeks or end-point. Donor P2RX7 genotype did not affect weight loss, survival or GVHD clinical score in mice (P>0.05; n=3 studies; 2 donors per study).

Conclusion: These preliminary data suggest that donor P2RX7 genotype does not affect the development of GVHD in a humanised mouse model; however, more donors will be examined.

Back to Top | Article Outline

Immunobiology, Tolerance and Treg

CIRCULATING MUCOSAL-ASSOCIATED INVARIANT T CELLS HAVE AN ENHANCED PRO-INFLAMMATORY CYTOKINE RESPONSE AFTER LUNG TRANSPLANTATION

SINCLAIR Kenneth Andrew1, YERKOVICH Stephanie Terase1,2, and CHAMBERS Daniel Charles1,2

1Lung Transplant Service, Prince Charles Hospital, Brisbane, 2School of Medicine, University of Queensland, Brisbane

Aims: Mucosal-associated invariant T (MAIT) cells are a T cell subset primarily responsible for controlling bacterial pathogens. The effective clearance of bacterial pathogens is critical for conserving an effective pulmonary epithelial barrier and maintaining the integrity of the lung allograft. Despite this, MAIT cell prevalence and function are uncharacterised following lung transplantation. Therefore, the aim of this study was to enumerate and assess the functional capacity of circulating MAIT cells from lung transplant recipients.

Methods: Peripheral blood mononuclear cells (PBMC) were obtained from lung transplant recipients and healthy control volunteers. PBMCs were stimulated for 4 hours with phorbol ester, ionomycin and brefeldin A. Flow cytometry was used to identify MAIT cells (defined as CD3+CD8+CD161++TCRVα7.2+) and expression levels of TNFα, IFNγ and granzyme B. Data are expressed as median (interquartile range).
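
Conceptually, this gating reduces to boolean thresholds on compensated marker intensities; a toy sketch follows (synthetic events, a single hypothetical cutoff for all markers, and a simple positivity gate rather than the bright CD161++ gate used in practice):

```python
import numpy as np
import pandas as pd

# Synthetic events table: one row per cell, compensated intensities
rng = np.random.default_rng(1)
events = pd.DataFrame(rng.lognormal(6, 1.5, size=(10_000, 4)),
                      columns=["CD3", "CD8", "CD161", "TCRVa7.2"])

thr = 2_000  # hypothetical positivity cutoff applied to every marker

# Gate CD3+CD8+ T cells first, then MAIT cells within them
cd8_t = events[(events["CD3"] > thr) & (events["CD8"] > thr)]
mait = cd8_t[(cd8_t["CD161"] > thr) & (cd8_t["TCRVa7.2"] > thr)]

print(f"MAIT prevalence: {len(mait) / len(cd8_t):.1%} of CD8+ T cells")
```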

Results: 17 lung transplant recipients (age 47.7 (39.9–59.7) years, 7 (50%) female, 33.4 (12.3–47.5) months post-transplant) and 10 healthy controls (6 (60%) female) were studied. There was no difference in the prevalence of circulating MAIT cells between lung transplant recipients (2.2% (1.9-3.8)) and controls (3.1% (1.6-3.9)). However, MAIT cells from lung transplant recipients expressed increased levels of IFNγ (p<0.01) and TNFα (p = 0.05) compared to controls (Figure 1A-B). There was no difference in granzyme B expression (Figure 1C).

Conclusion: While the proportion of circulating MAIT cells is not altered post-lung transplant, these MAIT cells do express a heightened pro-inflammatory cytokine profile. The impact of MAIT cell overactivation on the lung allograft remains unknown and requires further investigation.

Figure

Back to Top | Article Outline

A NOVEL HELMINTH-DERIVED PEPTIDE MODULATES THE IMMUNOGENIC PROFILE FOLLOWING ISLET CELL TRANSPLANTATION

BURNS Heather1, WONG Mary1, LIUWANTARA David1, CHEW YiVee1, O'BRIEN Bronwyn2, DALTON John3, DONNELLY Sheila4, and HAWTHORNE Wayne J5,6,1

1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2School of Life Sciences, University of Technology, Sydney, 3School of Biological Sciences, Queen's University, Belfast, 4University of Technology, Sydney, 5Western Clinical School, University of Sydney, 6Department of Surgery, Westmead Hospital, Sydney

Aims: The holy grail of transplantation is to avoid toxic immunosuppression. We have identified naturally derived F. hepatica excretory/secretory products (FhES) and a helminth defence molecule (FhHDM) that target and modulate the activity of antigen-presenting cells, providing potent immunomodulatory activity. We aimed to test their efficacy in a murine islet transplant model.

Methods: Freshly isolated islets from BALB/c mice were transplanted under the kidney capsule of fully mismatched diabetic C57BL/6 mice. Recipient mice were administered a daily dose of 50 μg FhHDM, 50 μg FhES or saline for 16 days. Blood glucose levels were monitored daily; normoglycaemia was defined as a BSL <10 mmol/L, and islet grafts were considered rejected when BSL >20 mmol/L, confirmed by histological analysis of the graft and by FACS analysis of the peripheral blood.

Results: FACS analysis of the peripheral blood showed a significant drop in the lymphocyte population from baseline to cull in the control group (p < 0.005); however, this was not seen in mice treated with FhHDM or FhES. T cell populations dropped significantly between baseline and cull in all three treatment groups (p < 0.005). However, the proportion of CD4+FoxP3+ T cells was significantly higher in the FhHDM and FhES cohorts (p < 0.005). Despite this, there was no extension of graft survival, and histological analysis showed cellular infiltration in all three cohorts.

Conclusions: While FhHDM and FhES alone did not prevent rejection of islet grafts in this model, they did influence the immunogenic profile of treated mice, evidence that these treatments could be used to establish an anti-inflammatory/regulatory environment to prevent islet rejection.

Figure

Back to Top | Article Outline

ASSESSMENT OF REGULATORY T CELL VIABILITY IN 3D BIOPRINTED STRUCTURES: A POTENTIAL APPLICATION FOR ISLET TRANSPLANTATION

KIM Juewan1, KANG Kyungwon2, SIVANATHAN Kisha N2, ROJAS-CANALES Darling3,2, HOPE Christopher4, YUE Zhilian5, LIU Xiao5, DROGEMULLER Christopher3,2, WALLACE Gordon G5, CARROLL Robert3,2, BARRY Simon C6, and COATES Patrick T3,2

1School of Biological Sciences, Faculty of Sciences, University of Adelaide, 2School of Medicine, Faculty of Health and Medical Sciences, University of Adelaide, 3Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 4University Department of Paediatrics, University of Adelaide, Women's and Children's Hospital, 5ARC Centre of Excellence for Electromaterials Science, Intelligent Polymer Research Institute, Innovation Campus, University of Wollongong, 6Molecular Immunology, Robinson Research Institute, University of Adelaide

Background: 3D bioprinting is an innovative technology that allows for the rapid and precise fabrication of complex 3D architectures with spatial orientation of cells within the biomaterial. 3D bioprinting of regulatory T-cells (Tregs) with islets has the potential to provide solutions for current obstacles in islet transplantation associated with immunosuppression.

Aim: To assess the effect of bioprinting on the viability of T cells.

Method: CD4+ T cells were purified from human peripheral blood mononuclear cells (PBMC) by negative selection. These cells were ‘bioprinted’ in a bulk structure using an alginate-gelatin hydrogel cross-linked with CaCl2 (2% w/v). Fluorescein diacetate (FDA) and propidium iodide (PI) staining was used to assess cell viability 24 hours later via microscopic visualisation and flow cytometry, compared to non-printed controls.

Results: FDA/PI double staining of bioprinted PBMC and CD4+ T cells showed minimal differences in PI-TRITC-positive (dead) cells compared with controls on microscopic analysis, whilst showing prominent viability with FDA-FITC staining. Flow cytometry, on the other hand, showed a slight (~10%) reduction in viability of bioprinted cells, to 71% (PBMC; average of triplicates, n=1) and 83% (CD4+ T cells; average of triplicates, n=1), compared to controls (81% for PBMC and 93% for CD4+ T cells; n=1).

Conclusion: These preliminary data suggest that alginate-gelatin hydrogel bioprinting has a minimal effect on the viability of PBMC and CD4+ T cells, and serve as proof of principle that immune cells have the capacity to survive in hydrogel structures.

INCREASED RECRUITMENT OF HUMAN LYMPHOCYTE SUBSETS IN RENAL ALLOGRAFT REJECTION

KILDEY K1,2, KASSIANOS AJ1,2,3,4, LAW B1,2,3, WANG X1,2, SEE E5, JOHN GT6, WILKINSON R1,2,3,4, FRANCIS RS5, and HEALY H1,2

1Conjoint Kidney Research Laboratory, Pathology Queensland, 2Kidney Health Service, Royal Brisbane and Women’s Hospital, 3Institute of Health and Biomedical Innovation, Queensland University of Technology, 4School of Medicine, University of Queensland, 5Department of Nephrology, Princess Alexandra Hospital, Brisbane, 6Kidney Health Service, Royal Brisbane and Women's Hospital

Aim: To identify, enumerate and phenotype the lymphocyte subsets in renal allograft rejection.

Background: Lymphocytes are pivotal effectors in kidney transplant rejection. However, the respective contributions of different lymphocyte subsets to episodes of allograft rejection remain uncertain, with previous methods limited to immunohistochemical techniques that cannot unequivocally define lymphocyte subsets. This study therefore used a multi-colour flow cytometric approach to evaluate lymphocyte subsets in human allograft rejection.

Methods: We extracted renal lymphocytes from healthy kidney tissue and transplant biopsies stratified by histopathological diagnosis: (a) non-rejection, (b) T cell-mediated rejection (TCMR), (c) antibody-mediated rejection (ABMR) and (d) mixed (TCMR+ABMR) rejection. Lymphocyte subsets were characterised by fourteen-colour flow cytometry.

Results: We detected elevated numbers of leukocytes (CD45+) and specifically T cells (CD45+CD3+) in transplant biopsies compared with healthy kidney tissue. Within this T cell compartment, numbers of cytotoxic T cells (CD3+CD8+), γ/δ T cells (CD3+γ/δ+) and natural killer (NK)-T cells (CD3+CD16+) were significantly elevated in TCMR biopsies compared to healthy kidney tissue. Notably, NK-T cells were also significantly elevated compared to “non-rejection” transplant biopsies. Of CD3− lymphoid cells, B cells (CD3−CD19+) and NK cells (CD3−CD56+), in particular CD56bright NK cells, were elevated in transplant biopsies compared with healthy kidney tissue.

Conclusions: Collectively, our data demonstrate the feasibility of flow cytometry to characterise the immunophenotype in renal transplant tissue, and indicate that lymphocyte subsets are differentially recruited during rejection. Further dissection of these subsets may provide useful diagnostic and prognostic information for patients experiencing allograft rejection.

SELECTION OF XENOANTIGEN SPECIFIC REGULATORY T CELLS (TREG) BY CELL SURFACE MARKERS

LU Cao, HUANG Dandan, BURNS Heather, HU Min, HAWTHORNE Wayne, YI Shounan, and O'CONNELL Philip

Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney

Background: Strategies that modulate the xenograft rejection response while minimising long-term immunosuppression need to be developed. Polyclonal Treg are suppressive in xenogeneic responses, but it is increasingly clear that antigen-specific Treg are more specific and potent. Our previous study showed that xenoantigen stimulation enhanced human Treg suppressive capacity. However, whether xenoantigen-expanded Treg express specific cell surface markers that could be used to enrich xenoantigen-specific Treg remains to be investigated. This study aimed to identify candidate cell surface markers specifically upregulated on xenoantigen-expanded human Treg and to investigate their potential for selection of xenoantigen-specific Treg.

Methods: After 7 days of polyclonal expansion with anti-CD3/CD28 beads, human Treg were further expanded with two or three subsequent cycles of stimulation using either polyclonal beads or irradiated porcine peripheral blood mononuclear cells (PBMC). A multi-parameter flow cytometry gating strategy was used to evaluate expression of Treg activation/memory-related markers and to determine candidate cell surface markers for isolation of xenoantigen-specific Treg subset(s).

Results: After two and three cycles of polyclonal (PlTreg) or xenoantigen (XnTreg) stimulation, no difference in expression of the Treg activation markers CD39 and CD15s was detected between PlTreg and XnTreg. However, XnTreg substantially upregulated the Treg activation/memory markers HLA-DR and CD27, with a larger proportion of the CD45RO+HLA-DR+CD27+ subset than PlTreg, suggesting a xenoantigen-specific Treg subset.

Conclusion: This study identified CD45RO, HLA-DR and CD27 as candidate cell surface markers for selection of xenoantigen-specific Treg subset(s), whose efficacy in suppressing xenogeneic responses in xenotransplantation can now be evaluated.

THE PURINERGIC CD73/A2A RECEPTOR SIGNALLING AXIS PLAYS AN IMPORTANT ROLE IN A HUMANISED MOUSE MODEL OF GRAFT-VERSUS-HOST DISEASE

GERAGHTY Nicholas1,2,3, ADHIKARY Sam1,2,3, SLUYTER Ronald1,2,3, and WATSON Debbie1,2,3

1School of Biological Sciences, University of Wollongong, 2Centre for Medical and Molecular Biosciences, University of Wollongong, 3Illawarra Health and Medical Research Institute, University of Wollongong

The importance of the CD73/A2A receptor purinergic signalling axis has been demonstrated in allogeneic mouse models of graft-versus-host disease (GVHD). CD73 blockade worsened GVHD, whereas A2A activation delayed GVHD in these models.

Aim: To investigate the role of CD73/A2A signalling in GVHD development in a humanised mouse model.

Method: Immunodeficient NOD-SCID-IL2Rγnull (NSG) mice were injected intra-peritoneally (i.p.) with 10 × 10⁶ human peripheral blood mononuclear cells (hPBMCs) on day 0 (2 donors). Humanised mice were subsequently injected i.p. for 7 days with the CD73 antagonist APCP or for 14 days with the A2A agonist CGS21680 (n = 10 per group). Humanised mice were monitored for weight loss and phenotypic signs of GVHD. Engraftment of hPBMCs and immune subsets were examined by flow cytometry.

Results: Injection of the CD73 antagonist or the A2A agonist did not alter hPBMC engraftment or immune subsets in NSG mice, either in blood at three weeks post-hPBMC injection or in spleen at the time of euthanasia. CD73 blockade significantly worsened weight loss (P = 0.0350) but did not impact clinical score or survival in humanised mice. A2A activation significantly delayed mortality, extending median survival time from 34 days to 49 days (P = 0.0335), but did not impact weight loss (P = 0.3025) or clinical score (P = 0.1815).
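
A minimal sketch of how the reported survival comparison could be reproduced, assuming per-mouse survival times and the lifelines package (the abstract does not state the software used; all numbers below are invented placeholders):

    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    control_days = [30, 32, 33, 34, 34, 34, 35, 36, 37, 38]  # hypothetical
    cgs_days     = [44, 45, 47, 49, 49, 49, 50, 51, 52, 55]  # hypothetical
    events = [1] * 10                                        # all deaths observed

    kmf = KaplanMeierFitter()
    kmf.fit(control_days, events, label="control")
    print("control median survival:", kmf.median_survival_time_)
    kmf.fit(cgs_days, events, label="CGS21680")
    print("CGS21680 median survival:", kmf.median_survival_time_)

    result = logrank_test(control_days, cgs_days, events, events)
    print("log-rank p =", result.p_value)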

Conclusions: The CD73/A2A signalling axis plays an important role in the development of GVHD in this humanised mouse model. Therefore, further investigation of therapeutic strategies that combine A2A activation with other treatments is warranted.

Tissues/IRI/Microbiota, Innate Immunity/Metabolism

THE ROLE OF NORMOTHERMIC MACHINE PERFUSION IN THE DEVELOPMENT OF BILIARY INJURY IN DONATION AFTER CIRCULATORY DEATH DONORS (DCD)

REILING Janske1,2,3,4, LOCKWOOD DSR1,5, FORREST Elizabeth5,6, SIMPSON AH7, CAMPBELL CM8, BRIDLE KR1,2, SANTRAMPURWALA N1,2, BRITTON LJ1,2, CRAWFORD DHG1,2, DEJONG CHC9, and FAWCETT J1,2,3,5

1School of Medicine, University of Queensland, Brisbane, 2Gallipoli Medical Research Foundation, Qld, 3Princess Alexandra Hospital, Brisbane, 4Department of Surgery, University of Queensland, 5Queensland Liver Transplant Service, Princess Alexandra Hospital, Brisbane, 6Gold Coast Hospital and Health Service, 7Department of Cardiac Anaesthetics, Princess Alexandra Hospital, Brisbane, 8Envoi Specialist Pathologists, Brisbane, Australia, 9NUTRIM School of Nutrition and Translational Research in Metabolism, Maastricht University, Maastricht, the Netherlands

Aims: The use of marginal donor livers, initially deemed unsuitable for transplantation, is a potential means of increasing the donor pool. Machine perfusion is a developing technique that could allow for the use of marginal donor livers. The aims of this study were to examine the use of normothermic machine perfusion (NMP) and to outline its role in biliary injury.

Methods: A custom-built NMP circuit was used to perfuse human livers deemed unsuitable for transplantation. Serial perfusate samples were collected for biochemical and haematological analysis. Assessment of hepatocellular and biliary injury was conducted using bile, liver and bile-duct tissue biopsies, collected at retrieval, upon commencement and at the end of perfusion.

Results: Seven of the ten perfused livers cleared lactate to below 2 mmol/L during the initial two hours of perfusion; this was associated with significantly lower levels of potassium, AST and γ-glutamyl transferase compared to non-lactate-clearers. All donor livers demonstrated pronounced changes in the common bile duct (CBD) at the conclusion of NMP. The severity of histological duct injury in the CBD, though, did not correlate with changes in the left/right main ducts or the segmental ducts.
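
The lactate-clearance criterion lends itself to a simple programmatic check; a sketch under our own assumptions about the data layout (perfusate samples as (hours, lactate) pairs), not the authors' code:

    def is_lactate_clearer(samples, threshold=2.0, window_h=2.0):
        """samples: list of (hours_since_perfusion_start, lactate_mmol_per_L).
        Returns True if perfusate lactate fell below 2 mmol/L within the first
        two hours of NMP, the criterion used above to separate clearers from
        non-clearers."""
        return any(t <= window_h and lactate < threshold for t, lactate in samples)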

Conclusions: When using NMP, lactate clearance appeared to be a satisfactory marker for the assessment of graft viability. NMP failed to prevent histologically severe biliary epithelial injury to the CBD. Given that injury to the larger hepatic ducts was not comparable to that of the CBD, a more reliable biomarker of bile-duct injury, and hence of DCD donor-related ischemic-type cholangiopathy, is required.

FIGURE 1

TRANSPLANTING THE HUMAN RESPIRATORY VIROME

MITCHELL Alicia1,2,3, MOURAD Bassel3,4, MALOUF Monique2, BENZIMRA Mark2, MORGAN Lucy5,6, OLIVER Brian1,3, and GLANVILLE Allan2

1Molecular Biosciences, University of Technology, Sydney, 2Lung Transplant Unit, St Vincent's Hospital, Sydney, 3Cellular and Molecular Biology, Woolcock Institute of Medical Research, 4University of Technology, Sydney, 5Department of Thoracic Medicine, Concord Hospital, 6School of Medicine, University of Sydney

Introduction: The pulmonary component of the human respiratory virome (a subset of the human microbiome) is transplanted into the recipient at lung transplantation (LTX). We explored the role of intercurrent community acquired respiratory viruses (CARV) within the pulmonary virome.

Methods: Single centre, prospective, longitudinal study of viruses in recipient nasopharyngeal swabs prior to LTX, swabs of explanted lungs, donor lungs prior to implantation and bronchoalveolar lavage (BAL) on post-operative days (POD) 1, 7, 21, 42, 63, 84. Samples were processed to isolate nucleic acids, followed by RT-qPCR for CARV [human rhinovirus (HRV), respiratory syncytial virus, influenza A (Flu A) and B (Flu B), parainfluenza virus (PIV) 1, 2, 3, and human metapneumovirus].

Results: Twenty-seven consecutive LTX recipients (bilateral:heart-lung = 26:1; 15 male; age 48 ± 13 years, mean ± SD, range 22–63) were recruited. Indications were cystic fibrosis (n=4), bronchiectasis (n=1), endocarditis (n=1), chronic lung allograft dysfunction (n=5), pulmonary fibrosis (n=5) and emphysema (n=11). Follow-up was 106 ± 57 days (range 16–199). Two donors had influenza (A=1, B=1); their recipients received oseltamivir. Despite vaccination and negative recipient nasopharyngeal swabs, 8 explanted lungs demonstrated influenza (A=6, B=2). Flu A was detected in POD1 BAL in 13/23 and persisted for 3–6 weeks. Twenty-four patients had CARV post-LTX (17 on multiple BAL), including Flu A (n=21), HRV (n=14), Flu B (n=4) and PIV (n=2). HRV (n=10) and PIV (n=1) were co-detected with Flu A.

Conclusion: Donor transmission and early acquisition of CARV (particularly Flu A) occur frequently, indicating the importance of respiratory virome surveillance.

MAXIMIZING KIDNEYS FOR TRANSPLANTATION: THE PAST, PRESENT AND FUTURE OF MACHINE PERFUSION – A SYSTEMATIC REVIEW AND META-ANALYSIS

HAMEED AM1,2,3, PLEASS HC1,3,4, WONG G5,2,6,7, and HAWTHORNE WJ1,2,3

1Department of Surgery, Westmead Hospital, Sydney, 2Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 3School of Medicine, University of Sydney, 4Department of Surgery, Royal Prince Alfred Hospital, Sydney, 5Department of Renal Medicine, Westmead Hospital, Sydney, 6Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 7School of Public Health, University of Sydney

Aims: To elucidate the benefits of machine perfusion (MP) preservation with and without oxygenation, and/or under normothermic conditions, when compared to static cold storage (CS) prior to deceased donor kidney transplantation.

Methods: Articles were identified using the EMBASE, Medline and Cochrane databases. Meta-analyses were conducted for the comparisons between hypothermic MP (HMP) and CS (human studies) and between normothermic MP and CS or HMP (animal studies). The primary outcome was immediate allograft function. Secondary outcomes included graft and patient survival, acute rejection, and parameters of tubular, glomerular and endothelial function. Subgroup analyses were conducted in expanded criteria donor (ECD) and donation after circulatory death (DCD) donors.
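
For readers unfamiliar with the pooling step, the sketch below shows the standard inverse-variance (fixed-effect) combination of log relative risks that underlies summary estimates such as those reported below. It is an illustration with invented study counts, not the authors' analysis, which may have used a random-effects model:

    import math

    # (events_treated, n_treated, events_control, n_control) per study; invented
    studies = [(20, 100, 30, 100), (15, 80, 25, 85), (40, 200, 55, 210)]

    log_rrs, weights = [], []
    for a, n1, c, n2 in studies:
        rr = (a / n1) / (c / n2)
        se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)  # SE of log RR
        log_rrs.append(math.log(rr))
        weights.append(1 / se**2)                # inverse-variance weight

    pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    print(f"RR {math.exp(pooled):.2f} (95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f})")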

Results: One hundred and one studies (63 human and 38 animal) were included. There was a lower rate of delayed graft function in recipients of HMP-preserved grafts compared to CS (RR 0.77; 95% CI 0.69-0.87). Primary non-function was reduced in ECD kidneys preserved by HMP (RR 0.28; 95% CI 0.09-0.89). Renal function in animal studies was significantly better in normothermic MP kidneys compared to both HMP (standardized mean difference [SMD] of peak creatinine −1.66; 95% CI −3.19 to −0.14) and CS (SMD of peak creatinine −1.72; 95% CI −3.09 to −0.34).
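
The standardized mean difference used for the animal-study comparisons is conventionally computed in the Cohen's d form shown below (the abstract does not state which SMD variant, e.g. Hedges' g, was applied):

    \[
    \mathrm{SMD} = \frac{\bar{x}_1 - \bar{x}_2}{s_{\text{pooled}}},
    \qquad
    s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}
    \]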

Conclusions: MP enhances short- and possibly longer-term outcomes after renal transplantation. However, there is still considerable room to modify the process and further enhance renal preservation through oxygenation, perfusion fluid manipulation and alteration of perfusion temperature. In particular, correlative experimental (animal) data provide strong support for more clinical trials investigating normothermic MP.

DEVELOPING A NORMOTHERMIC RENAL PERFUSION SYSTEM

HAMEED AM1,2,3, ROGERS N4,2,3, WARWICK N5, MIRAZIZ R1, WONG G4,2,6,7, EL-AYOUBI A2, BURNS H2, CHEW Y2, PLEASS HC1,3,8, and HAWTHORNE WJ1,2,3

1Department of Surgery, Westmead Hospital, Sydney, 2Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 3School of Medicine, University of Sydney, 4Department of Renal Medicine, Westmead Hospital, Sydney, 5Department of Anaesthesia, Westmead Hospital, Sydney, 6Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 7School of Public Health, University of Sydney, 8Department of Surgery, Royal Prince Alfred Hospital, Sydney

Aims: Machine perfusion (MP) preservation of the kidney has seen a resurgence over the past decade, especially with the increasing utilization of donation after circulatory death (DCD) and expanded criteria donor (ECD) kidneys. There is increasingly compelling evidence from experimental work that normothermic machine perfusion (NMP) provides superior graft outcomes compared to both hypothermic MP and traditional static cold storage (CS), especially in the ECD and DCD cohorts. We aimed to develop a feasible and safe preliminary NMP device for porcine kidney perfusion, ultimately working towards translation to the clinic.

Methods: A preliminary NMP device was custom-designed using componentry including cardiopulmonary bypass equipment. Components included a high-flow adjustable-rate roller pump allowing flow rates of 1 L/min, a reservoir chamber with an in-built membrane oxygenator and heat exchanger, and Carmeda heparin-coated polyvinyl chloride tubing. The perfusion fluid comprised packed red blood cells suspended in crystalloid solution and supplemented with dextrose, nutrient media and heparin. Porcine kidneys were retrieved in a standard fashion (n = 4) and either immediately placed on NMP for 1 hour or underwent NMP after 12–24 hours of CS.

Results: All kidneys successfully underwent NMP. Perfusion parameters, including intra-renal resistance, urine production and biochemistry, were recorded during the period of perfusion.

Conclusions: It is feasible to develop a safe and reliable renal NMP system. This will allow further research into the efficacy of this strategy in comparison to other kidney preservation modalities.

DONATION AFTER CARDIAC DEATH (DCD) PANCREAS TRANSPLANTATION: A WESTMEAD EXPERIENCE

SHAHRESTANI Sara1, ROBERTSON Paul2, PLEASS Henry3, JAMESON Carolyn4, YUEN Lawrence3, LAM Vincent3, RYAN Brendan3, ALLEN Richard3, and HAWTHORNE Wayne5,3

1School of Medicine, University of Sydney, 2Renal Transplant Unit, Westmead Hospital, Sydney, 3Department of Surgery, Westmead Hospital, Sydney, 4The Kolling Institute, University of Sydney, 5Centre for Transplant and Renal Research, The Westmead Institute

Aims: DCD pancreas transplants are rarely performed because of the risk that warm ischemic damage will cause irreversible graft injury; however, with careful selection, successful transplantation of DCD pancreas grafts is possible. Our aims were to review the outcomes of DCD pancreas transplants performed at Westmead Hospital and to identify factors contributing to successful transplantation.

Methods: We reviewed all historical DCD pancreas transplants and collected all information that may have contributed to their outcomes. We compared donor age, body mass index (BMI), total ischemic time and warm ischemic time, and reviewed graft outcomes.

Results: We identified seven patients who received DCD pancreas grafts between 2007 and 2015 at Westmead Hospital, and some unique characteristics that potentially contributed to their successful outcomes. These included young donor age (mean 18.1 years, range 12–32), low donor BMI (mean 23.9, range 21.0–28.1) and minimal warm ischemic time (mean 17.4 min, range 11–27 min). Cold ischemic time (CIT) was also minimised, with a mean of 11 hours 58 min (range 7 h 25 min to 19 h 10 min); in six of the seven donors, CIT was <13 hours. All grafts are functioning successfully at up to 9 years post-transplantation, with all patients insulin independent.

Conclusion: Successful DCD pancreas transplantation is possible with the careful selection of donors. Graft function and insulin independence are achievable in DCD pancreas transplantation at up to 9 years.

THE IDENTIFICATION OF KEY DRIVER GENES IN ACUTE RENAL ALLOGRAFT REJECTION FROM HUMAN TRANSPLANT BIOPSIES

KEUNG KL1,2, YI Z3, LI L4, LU B1, HU M1, WEI C5, LIUWANTARA D1, MENON MC6, ALEXANDER S7, WONG G8,7, MURPHY B6, ZHANG WJ9, and O'CONNELL PJ1

1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Centre for Immunology, 3Department of Genetics and Genomics, Icahn School of Medicine at Mount Sinai, New York, USA, 4Department of Genetics and Genomics, Icahn School of Medicine at Mount Sinai, New York, USA, 5Renal Division, Icahn School of Medicine at Mount Sinai Hospital, New York, USA, 6Renal Division, Icahn School of Medicine at Mount Sinai Hospital, New York, USA, 7Centre for Kidney Research, University of Sydney, 8School of Public Health, Westmead Hospital, Sydney, 9Department of Genetics and Genomics, Icahn School of Medicine at Mount Sinai Hospital, New York, USA

Gene expression profiling of biopsy tissue with acute rejection (AR) can uncover novel molecular targets for drug therapy.

Aims: 1. To identify hub genes (“key driver” genes) of AR from microarray datasets in kidney transplant recipients. 2. To apply a drug repurposing strategy to identify new therapies for AR. 3. To validate these findings in an animal model.

Methods: We performed a meta-analysis of 7 publicly available microarray datasets (735 samples) of human renal allograft biopsies with and without AR. A key driver gene-set of AR was derived and entered into Connectivity Map (CMAP) to identify existing drugs that may be repurposed to target the imputed gene list.
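
As background to the module-detection step reported below, here is a toy numpy sketch of the Markov clustering loop (alternating expansion and inflation on a column-stochastic similarity matrix); this is our own illustration, not the authors' pipeline:

    import numpy as np

    def mcl(adjacency, inflation=2.0, iters=50):
        """Toy Markov clustering: expansion spreads flow along edges,
        inflation sharpens it, and clusters emerge as attractor rows."""
        m = adjacency + np.eye(len(adjacency))   # add self-loops
        m = m / m.sum(axis=0)                    # column-normalise
        for _ in range(iters):
            m = m @ m                            # expansion (matrix square)
            m = m ** inflation                   # inflation (element-wise power)
            m = m / m.sum(axis=0)                # re-normalise columns
        # rows retaining mass after convergence define the clusters
        return [np.nonzero(row > 1e-6)[0].tolist()
                for row in m if row.max() > 1e-6]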

Results: Using the Markov clustering algorithm and Bayesian network construction, we identified 10 modules and 14 key driver genes. The genes in the top three modules, which included 12 key drivers, are involved in the immune response, T and B cell activation, chemotaxis, and antigen presentation. CMAP identified a number of potential drug candidates, including minocycline. Minocycline was administered to murine heterotopic heart transplant recipients. RT-qPCR of mRNA isolated from rejecting hearts at day 4 post-transplant demonstrated significantly lower expression (p < 0.05) of 10 key driver genes in the minocycline treatment group compared to control; pro-inflammatory cytokine and chemokine gene expression was similarly suppressed (Figure 1). Immunohistochemistry staining of heart grafts showed reduced T cell and macrophage infiltration in the treatment group.

Conclusion: Novel molecular targets of AR can be identified from transcriptomic data, and drug repurposing tools can be used to discover potential new therapies.

Figure 1

ALTERATIONS OF THE ENDOTHELIAL GLYCOCALYX DURING RENAL ISCHEMIA-REPERFUSION INJURY IN MICE: ROLE OF COMPLEMENT INHIBITION BY ANTI-C5 AND HUMAN C1-INHIBITOR

BONGONI Anjan K1, LU Bo1, MCRAE Jennifer L1, SALVARIS Evelyn1, VIKSTROM Ingela2, BAZMORELLI Adriana2, PEARSE Martin J3, and COWAN Peter J1,4

1Immunology Research Centre, St Vincent's Hospital, Melbourne, 2Research and Development, CSL Limited, Parkville, Australia, 3Research and Development, CSL Limited, Parkville, Australia, 4Department of Medicine, University of Melbourne

Background: Within the endothelial glycocalyx, heparan sulfate (HS) plays a crucial role in regulation of cell-cell interactions, coagulation, and inflammation. Numerous factors, including complement activation and ischaemia-reperfusion (IR) injury (IRI), can modulate the integrity of the glycocalyx.

Aims: (1) To investigate whether renal dysfunction is associated with complement activation and HS shedding in a mouse model of renal IRI. (2) To examine the effect of inhibition of complement, using anti-mouse C5 antibody (BB5.1) or human C1-inhibitor (hC1-INH), on HS loss and renal function.

Methods: Male C57BL/6 mice were subjected to right nephrectomy and 22 min left renal ischemia at 37°C. Mice (n=8/group) were treated with BB5.1 (80mg/kg) or hC1-INH (800IU/kg) or isotype/vehicle control, before ischemia. Mice were sacrificed 24 hrs after reperfusion; blood samples were assessed for kidney function (serum creatinine) and kidneys for complement deposition, HS shedding and neutrophil infiltration.

Results: Severe renal injury was induced following IR (creatinine: 190.0±25.0μM versus sham 31.2±1.7μM, p=0.02), accompanied by significant loss of HS, tubular complement C3b/c and C5 deposition, and neutrophil infiltration. BB5.1 treatment protected against IR-induced renal dysfunction (creatinine: 121.0±10.0μM, p<0.02) and significantly (p<0.05) reduced complement deposition, HS shedding and neutrophil infiltration. In contrast, hC1-INH was not protective in this study, which may be attributed to its decreased in vitro potency against mouse complement factors.

Conclusion: Reduced renal function following IR is associated with loss of the endothelial glycocalyx. Effective complement inhibition, by blocking C5 activation, prevented destruction of the glycocalyx, thereby preserving kidney function in this model.

HYPOXIA PRECONDITIONING AND PROINFLAMMATORY CYTOKINE MODIFICATION OF CD45−TER119− MESENCHYMAL STEM CELLS (MSC) AS A NOVEL PROTOCOL TO ISOLATE IMMUNOSUPPRESSIVE MSC FROM COMPACT BONES

SIVANATHAN Kisha N1,2, GRONTHOS Stan3,4, GREY Shane T5, and COATES Patrick T1,2,6

1Adelaide Medical School, Faculty of Health and Medical Sciences, University of Adelaide, 2Centre for Clinical and Experimental Transplantation, Royal Adelaide Hospital, 3Mesenchymal Stem Cell Laboratory, Adelaide Medical School, Faculty of Health and Medical Sciences, University of Adelaide, 4South Australian Health and Medical Research Institute, 5Transplant Immunology Group, Garvan Institute of Medical Research, Sydney, 6Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital

Aim: Compact bones (CB) represent an alternative to bone-marrow (BM) as a source to isolate MSC. Here, we established a protocol to isolate MSC from CB and tested their immunosuppressive potential.

Methods: Collagenase type II digestion of BM-flushed CB (from C57BL/6 mice) was performed to liberate MSC precursors from bone surfaces to establish nondepleted-MSC. CB cells were depleted of hematopoietic cells with anti-CD45 and anti-TER119 mAbs, and these CD45−TER119− CB cells were used to generate depleted-MSC. CB MSC progenitors were cultured under hypoxia to establish primary MSC cultures.

Results: Depleted-MSC showed greater cell numbers at sub-culturing than nondepleted-MSC and had increased functional ability to differentiate into adipocytes and osteoblasts. CB depleted-MSC expressed key MSC markers (>85% Sca-1+, CD29+, CD90+) with no mature hematopoietic contaminating cells (<5% CD45+, CD11b+) when sub-cultured to passage 5. Nondepleted-MSC cultures, however, were heterogeneous, with <72% Sca-1+, CD29+ and CD90+ cells at early passages, as well as high percentages of contaminating CD11b+ (35.6%) and CD45+ (39.2%) cells that persisted in long-term culture. Depleted- and nondepleted-MSC nevertheless exhibited similar potency in suppressing total, CD4+ and CD8+ T cell proliferation in a dendritic cell allostimulatory mixed lymphocyte reaction. Depleted-MSC pre-treated with the proinflammatory cytokines IFN-γ, TNF-α and IL-17A showed superior suppression of CD8+ T cell, but not CD4+ T cell, proliferation relative to untreated MSC.

Conclusion: CB depleted-MSC established under hypoxia and treated with selective cytokines represents a novel source of potent immunosuppressive MSC. As these cells have enhanced immunomodulatory function, they may represent a superior product for use in clinical allotransplantation.

DANTROLENE IMPROVES FUNCTIONAL RECOVERY OF DONOR HEARTS AFTER PROLONGED COLD STORAGE

VILLANUEVA Jeanette1, GAO Ling1, CHEW Hong1, HICKS Mark1,2, DOYLE Aoife1, QUI Min Ru3, MACDONALD Peter1,4, and JABBOUR Andrew1,4

1Transplantation Laboratory, Victor Chang Cardiac Research Institute, Sydney, 2St Vincent's Hospital, Sydney, 3Department of Anatomical Pathology, St Vincent's Hospital, Sydney, 4Heart and Lung Transplant Unit, St Vincent's Hospital, Sydney

Aims: The ryanodine-receptor antagonist dantrolene (DANT) reduces cardiac ischaemia-reperfusion injury (IRI) in global warm-ischaemia models. We aimed to test whether DANT was cardioprotective following prolonged cold ischaemia of donor hearts.

Methods: Wistar rat (320–410 g; n=4–7) hearts were subjected to ex vivo perfusion and baseline haemodynamic measurements acquired. Hearts were arrested, stored in Celsior ± 0.4–40 μM DANT (6 h, 4°C), then reperfused (37°C; Langendorff mode for 15 min, followed by working mode for 30 min). Post-reperfusion cardiac output (CO) was expressed as a percentage of baseline measurements (mean ± SEM). Lactate dehydrogenase (LDH) from coronary effluent, histology (H&E, cleaved caspase-3) and western blots (cardioprotective signalling) on post-reperfusion left ventricular tissue were analysed.

Results: Compared to controls, hearts stored with 1 μM DANT showed significantly improved CO recovery (34±8% vs 61±10%; p=0.026); however, no difference in LDH was observed. No increase in STAT3, ERK or AKT activation was observed, although there was a trend towards increased AMPKα phosphorylation, and Beclin-1 levels were reduced (p=0.03). Histologically, no differences were observed in contraction band necrosis; however, 1 μM DANT hearts exhibited significantly more cleaved caspase-3-positive nuclei than controls (p=0.0006). Hearts stored in 40 μM DANT yielded poor CO recovery (4±2%) and increased post-reperfusion LDH (p=0.04), suggesting toxicity.

Conclusion: 1 μM DANT supplementation during prolonged donor heart cold storage significantly improved CO recovery. DANT-mediated cardioprotection was independent of ERK, AKT and STAT3 signalling. Reduced Beclin-1 in hearts stored with 1 μM DANT is consistent with increased autophagic flux, and the increase in cleaved caspase-3 suggests a potential cardioprotective role for activated caspase-3.

MITOCHONDRIAL SUPEROXIDE ACCUMULATION IS ASSOCIATED WITH EPITHELIAL PERMEABILITY AND APOPTOSIS IN THE LUNG ALLOGRAFT

SINCLAIR Kenneth Andrew1, YERKOVICH Stephanie Terase1,2, and CHAMBERS Daniel Charles1,2

1Lung Transplant Service, Prince Charles Hospital, Brisbane, 2School of Medicine, University of Queensland, Brisbane

Aims: The lung allograft is compromised in the long term by the degeneration of the pulmonary epithelium and subsequent fibrosis within the airways, through mechanisms that remain poorly understood. Recent studies have suggested that the accumulation of dysfunctional mitochondria is associated with tissue fibrosis. We hypothesised that this mechanism may also occur in the lung allograft. The aim of this study was to assess mitochondrial homeostasis in pulmonary epithelial cells from lung transplant recipients.

Methods: Bronchial epithelial cells were obtained from lung allografts by bronchial brushing during routine bronchoscopies. Flow cytometry was used to identify epithelial cells (CD326+/EpCAM+) and to measure mitochondrial superoxide (MitoSOX Red), apoptosis (Annexin V) and cellular permeability (7AAD). Data are presented as median (interquartile range).

Results: Bronchial brushings were collected from 38 lung transplant recipients (age 52.4 years (41.4–60.6); 16 female (42%); 6.3 months post-transplant (2.4–51.1)). MitoSOX Red mean fluorescence intensity (MFI) strongly correlated with the percentage of 7AAD+ epithelial cells (r=0.85, p<0.001, Figure 1A). In contrast, there was a strong negative correlation between MitoSOX expression and the percentage of Annexin V−7AAD− epithelial cells (r= −0.74, p=0.005, Figure 1B). MitoSOX Red was highest in lung allografts within the first 12 months post-transplant (58.6 (45.2–81.5)) compared to those greater than 12 months post-transplant (42.5 (35.8–60.0), p=0.03, Figure 1C).
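
A minimal sketch of the kind of correlation analysis reported above, assuming per-recipient values for MitoSOX MFI and the percentage of 7AAD+ epithelial cells (the abstract does not state whether Pearson or Spearman correlation was used; the data below are invented):

    from scipy.stats import pearsonr

    mitosox_mfi = [58.6, 45.2, 81.5, 60.0, 42.5, 70.1, 35.8]  # hypothetical MFIs
    pct_7aad    = [12.0,  8.5, 20.3, 13.1,  7.9, 16.4,  6.2]  # hypothetical %7AAD+

    r, p = pearsonr(mitosox_mfi, pct_7aad)
    print(f"r = {r:.2f}, p = {p:.3g}")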

Conclusions: Epithelial mitochondrial superoxide is elevated in the lung allograft early post-transplant, and is associated with epithelial apoptosis and permeability. These findings warrant further investigation into how the bioenergetic profile of the lung is altered after transplantation.

Figure 1

© 2017 The Authors. Published by Wolters Kluwer Health, Inc.