
The Transplantation Society of Australia and New Zealand Annual Scientific Meeting, Melbourne Convention Centre, 29th April-1st May, 2018

doi: 10.1097/TXD.0000000000000806
Abstracts
IRI and New Techniques

ISOLATING ISLETS FOR TRANSPLANTATION USING AN ISOLATOR SYSTEM, A VIABLE ALTERNATIVE TO CONVENTIONAL CLEAN ROOM FACILITIES

KOS C1, MARIANA L1, MCCORMICK K2, BLEASDALE N2, IRVIN A1, WAIBEL M1, THOMAS H1, and KAY T1

1Immunology & Diabetes, St Vincent's Institute, Melbourne, 2Australian Red Cross Blood Service

Aims: Islet transplantation is now used as therapy for type 1 diabetic patients with severe hypoglycaemic unawareness. In Victoria, we previously isolated islets in a conventional cleanroom facility. An equivalent facility was established on the St Vincent’s campus, in which processing occurs within an enclosed custom-built BioSpherix Xvivo biological system (‘isolator’) instead of open biosafety cabinets. By transferring this process to an isolator, we aimed to replicate our processes and procedures for equivalent and possibly improved islet yield outcomes.

Methods: The isolator contains three processing chambers, cell culture incubators, centrifuge and microscope modules. In contrast to conventional clean room facilities, islets processed in the isolator are maintained at optimal conditions in chambers capable of operating between 4°C and 45°C to maximize yields. Islets were isolated using a modified Ricordi method at both facilities.

Results: Isolations from 43 pancreata have been infused into patients in Melbourne, Adelaide and Sydney, with 22 of these processed using the isolator facility. Isolator islet yields (280,103±173,108 islet equivalents, n=117) were significantly higher (p<0.0014*) than conventional cleanroom yields (207,870±139,962 islet equivalents, n=88), resulting in an 8% increase in the proportion transplanted.

Conclusions: We have replicated our cleanroom islet isolations using an isolator. Islet yields have increased overall, likely due to tight control of temperature and oxygen, as compared to the regulation of temperature with ice packs and water baths in the cleanroom process. The isolator facility provides a fully contained environment suitable for processing not only islets but also other human cells and tissue.

* Unpaired t-test
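
The footnoted comparison can be reproduced from the summary statistics reported above. A minimal sketch of a pooled-variance (Student) unpaired t statistic, assuming that is the variant used (the footnote says only "unpaired t-test"); the p-value lookup against the t distribution is omitted:

```python
import math

def unpaired_t(m1, s1, n1, m2, s2, n2):
    """Pooled-variance unpaired t statistic from summary data (means, SDs, group sizes)."""
    df = n1 + n2 - 2
    pooled_var = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / df
    se = math.sqrt(pooled_var * (1 / n1 + 1 / n2))
    return (m1 - m2) / se, df

# Reported islet yields (islet equivalents): isolator vs conventional cleanroom
t, df = unpaired_t(280_103, 173_108, 117, 207_870, 139_962, 88)
print(round(t, 2), df)  # t ≈ 3.2 on 203 degrees of freedom
```

A t statistic of this size on 203 degrees of freedom is consistent with the significant p-value reported.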

INTRA-RENAL DELIVERY OF DRUGS TARGETING ISCHEMIA-REPERFUSION INJURY OF THE KIDNEY IN A RODENT MODEL & PORCINE MODEL OF NORMOTHERMIC MACHINE PERFUSION

HAMEED Ahmer1,2, LU Bo1, MIRAZIZ Ray3, BURNS Heather1, ROGERS Natasha1,4, PLEASS Henry2,5, and HAWTHORNE Wayne1,2

1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Department of Surgery, Westmead Hospital, Sydney, 3Department of Anaesthetics, Westmead Hospital, Sydney, 4Department of Renal Medicine, Westmead Hospital, Sydney, 5School of Medicine, University of Sydney

Aims: To investigate the utility of direct renal delivery of drugs targeting ischaemia-reperfusion injury (IRI) using machine perfusion (MP).

Methods: (i) Three drugs targeting the IRI process (CD47 blocking antibody, soluble complement receptor 1 [sCR1], and recombinant thrombomodulin [rTM]) were compared in a rodent model of renal IRI. A single drug or combination was delivered via the inferior vena cava after a right nephrectomy and preceding induction of left kidney ischaemia. (ii) A normothermic MP (NMP) system was developed and optimized using a porcine donation after circulatory death (DCD) model. The impact of drug/s identified from (i) on NMP perfusion parameters and histology was subsequently investigated.

Results: (i) Preliminary evidence in the rodent renal IRI model indicated significant amelioration of IRI when CD47 and/or sCR1 were given prior to the ischaemic hit. Ongoing work is investigating whether the combined use of these agents will produce synergistic effects. (ii) A clinically translatable system of NMP was developed and optimized. Ideal perfusion conditions were produced when pressure-controlled perfusion with a leukocyte-depleted, colloid-containing perfusion solution was employed. Addition of CD47 to the perfusion circuit affected perfusion and biochemical parameters, including resistance, urine output and creatinine clearance. Ongoing work will test the utility of combined agents in this system.

Conclusions: Renal IRI can be ameliorated by the direct delivery of CD47 and/or sCR1. These drugs can be directly delivered to the kidney using NMP, thereby avoiding systemic treatment. The impact on transplantation outcomes remains to be investigated.

DONOR HEART PRESERVATION: APPLYING THE ACID TEST

SCHEUER Sarah1,2,3, GAO Ling2, HICKS Mark4, DOYLE Aoife1, VILLANUEVA Jeanette1, CHEW Hong1,2, JABBOUR Andrew1,2, KING Glenn5, MACDONALD Peter1,2,3, and DHITAL Kumud1,2,7

1Cardiac Physiology and Transplantation, Victor Chang Cardiac Research Institute, Sydney, 2Cardiopulmonary Transplant Unit, St Vincent's Hospital, Sydney, 3University of New South Wales, Sydney, 4Clinical Pharmacology, St Vincent's Hospital, Sydney, 5Institute for Molecular Bioscience, University of Queensland, Brisbane, 6Department of Medicine, University of New South Wales, Sydney, 7Department of Surgery, University of New South Wales, Sydney

Aims: This project sought to investigate the cardioprotective effects of Hi1a, an acid-sensing ion channel 1a (ASIC1a) inhibitor derived from funnel web spider venom, in the context of donor heart preservation.

Methods: Studies were conducted utilizing an isolated working rat heart model of donor heart preservation. Hearts were retrieved from male Wistar rats (350-450g) prior to obtaining baseline measurements of key haemodynamic parameters in both ex-vivo Langendorff and working modes. Hearts were then arrested with, and stored in, either Celsior alone, Celsior + GTN and zoniporide, or Celsior + Hi1a (n=6 per group). Following 8 hours of cold storage, hearts were reperfused ex-vivo, and post-storage cardiac function (aortic flow [AF], coronary flow [CF], heart rate [HR], cardiac output [CO]) was expressed as percentage recovery of pre-storage values.

Results: Following prolonged hypothermic ischaemia, hearts preserved and stored in Celsior solution supplemented with Hi1a [10 nM] demonstrated a superior recovery to those stored in un-supplemented Celsior; recovering, on average, an aortic flow of 49.73±23.71% and 7.84±12.94% of baseline measurements, respectively (p=0.01).

Conclusions: Supplementation of Celsior with the ASIC1a inhibitor Hi1a significantly improves the function of cardiac allografts following prolonged hypothermic ischaemia. Further studies will investigate the mechanisms responsible for its cardioprotective effects, as well as potential synergistic performance with the currently used clinical supplements GTN and EPO, in both DBD and DCD models of donor heart preservation. By reducing the incidence of ischaemia-reperfusion injury, Hi1a may afford an opportunity to extend the current tolerable warm ischaemic time during DCD cardiac allograft retrieval, leading to a significant increase in transplant volume.

FIGURE 1

CYCLOPHILIN BLOCKADE PROTECTS FROM RENAL ISCHAEMIA/REPERFUSION INJURY

LEONG Khai Gene1,2, OZOLS Elyce1,2, KANELLIS John1,2, LILES John3, NIKOLIC-PATERSON David1,4, and MA Frank1,2

1Department of Nephrology, Monash Medical Centre, Melbourne, 2Centre for Inflammatory Diseases, Monash University, Melbourne, 3Gilead Sciences, 4Monash University, Melbourne

Cyclophilins are proteins that regulate protein folding. During pathological conditions, cyclophilin A (CypA) is an important pro-inflammatory molecule, while CypD facilitates mitochondrial-dependent cell death.

Aims: (1) Investigate whether a novel pan-cyclophilin inhibitor (CYPi), which does not block calcineurin function, can prevent anticipated renal ischaemia/reperfusion injury (IRI); (2) Assess the contribution of CypA in renal IRI.

Methods: Groups of 10 mice underwent bilateral renal ischaemia and were killed 24 hours after reperfusion. Controls were sham-operated. Study 1: C57BL/6J mice were treated with CYPi (30 mg/kg BID) or vehicle by oral gavage. Study 2: CypA-/- versus wild-type (WT) mice on the 129 background. Effects of renal IRI were compared.

Results: Study 1: Renal IRI caused acute kidney injury (AKI) in C57BL/6J mice (179.0±19.1 vs 12.2±1.4 μmol/L serum creatinine (sCr) in sham; P<0.001). CYPi protected against AKI (sCr 36.4±6.7 μmol/L; P<0.001) and reduced the histologic tubular damage score (P<0.001). CYPi reduced apoptotic tubular cells, TNF-α mRNA levels and neutrophil and macrophage infiltration (all P<0.01 vs vehicle). Study 2: Renal IRI induced AKI in 129 mice (sCr 41±13.76 vs 6.25±1.66 μmol/L in sham; P<0.0001). CypA-/- mice were protected from renal dysfunction (sCr 20.7±3.53 μmol/L; P<0.001). CypA-/- mice had less histologic tubular damage (P<0.01) and lower KIM-1 mRNA levels (P<0.01). CypA-/- mice also had less tubular cell death (TUNEL+ cells), fewer inflammatory cytokines (TNF-α and IL-36α PCR; P<0.05), and reduced neutrophil infiltration (P<0.001).

Conclusions: Pharmaceutical pan-cyclophilin inhibition prevents anticipated IRI-induced AKI by suppressing tubular cell death and inflammation. Based on knockout studies, CypA specifically contributes to inflammation in renal IRI.

ACTIVATED CD47 PROMOTES ACUTE KIDNEY INJURY BY LIMITING AUTOPHAGY

EL RASHID Mary, SANGANERIA Barkha and ROGERS Natasha M

Westmead Institute for Medical Research

Background: Acute kidney injury (AKI) initiates a complex pathophysiological cascade leading to epithelial cell death. Recent studies identify autophagy, the mechanism of intracellular degradation of cytoplasmic constituents, as important in protection against injury. We have reported that the protein thrombospondin-1 (TSP1) and its receptor CD47 are induced in AKI; however, their role in regulating renal injury is unknown.

Methods: Age and gender-matched wild-type (WT) and CD47-/- mice were challenged with renal ischemia reperfusion injury. All animals underwent analysis of renal function and biomolecular phenotyping. Human and murine WT and CD47-/- renal tubular epithelial cells (rTEC) were studied in vitro.

Results: CD47-/- mice were resistant to AKI, with decreased serum creatinine, and ameliorated histological changes compared to WT animals. CD47-/- mice demonstrated concurrent upregulation of key autophagy genes, including Atg5, Atg7, Beclin-1, and LC3 at baseline and post-AKI. WT mice demonstrated negligible autophagy expression at all time points. rTEC from CD47-/- mice displayed basal upregulation of autophagy that was preserved under hypoxic stress, and correlated with enhanced viability when compared to WT cells. Treatment of WT rTEC with a CD47 antagonist antibody or oligonucleotide to block TSP1-CD47 signalling increased autophagy. Finally, in a syngeneic mouse kidney transplantation model, treatment with a CD47 blocking antibody improved renal function and decreased histologic damage compared to control mice, and this was associated with increased autophagy.

Conclusions: These data suggest activated CD47 is a proximate promoter of AKI through inhibition of autophagy, and point to CD47 as a target to restore renal function following injury.

Transplant Complications

THE EFFECT OF POST-TRANSPLANT LYMPHOPROLIFERATIVE DISEASE (PTLD) ON GRAFT AND PATIENT SURVIVAL IN KIDNEY TRANSPLANT RECIPIENTS

FRANCIS Anna1, CRAIG Jonathan1, JOHNSON David2, and WONG Germaine1

1Centre for Kidney Research, University of Sydney, 2Renal & Transplantation Unit, University of Queensland at the Princess Alexandra Hospital

Aim: The aim was to estimate the excess risk of death and graft loss in kidney transplant recipients due to PTLD, and to determine risk factors for death.

Methods: Patients with PTLD in their first transplant (1990-2015) were identified from ANZDATA and matched to three controls. The risks of mortality and graft loss (with competing risk of death) were estimated using survival analysis, and Cox models explored risk factors for death after PTLD.

Results: There were 395 patients with PTLD, of whom 382 cases (68% male, 88% Caucasian, mean age 43 years) were matched to 1120 controls (58% male, 87% Caucasian, mean age 43 years). Mean follow-up was 7.6 years (SD 5.8 years). 10-year survival (95%CI) was 40.8% (35.8%-46.6%) among recipients with PTLD compared to those without (65.4%, 62.3%-68.7%) (Figure 1). The excess mortality was all in the first year (HR 14.4, 95%CI 10.0-20.6), with no difference in mortality after 1 year (HR 1.17, 95%CI 0.92-1.47). The 10-year graft loss (95%CI) was similar for those with and without PTLD [16.4% (12.8%-21.1%) vs. 20.5% (18.0%-23.4%)]. Increasing age at diagnosis (per 10-year increase) [(adjusted HR:95%CI) 1.46:1.32-1.63], site of disease (brain compared to nodal) [1.98:1.28-3.06] and diagnosis before the year 2000 [2.55:1.59-4.09] were associated with an increased risk of mortality after PTLD.

FIGURE 1

Conclusions: PTLD increased the risk of mortality 14-fold in the first year after diagnosis only, with no effect on graft loss unrelated to death. PTLD site, increased age and diagnosis in an earlier era were associated with increased mortality risk.

CANCER MORTALITY IN KIDNEY TRANSPLANT RECIPIENTS IN AUSTRALIA AND NEW ZEALAND: A COHORT STUDY FROM 1980 TO 2013

ROSALES Brenda1, DE LA MATA Nicole1, KELLY Patrick1, and WEBSTER Angela1,2,3

1Sydney School of Public Health, University of Sydney, 2Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 3Nephrology and Renal Transplant, Westmead Hospital, Sydney

Aims: International guidelines from 2009 suggest post-transplant screening for specific cancers, but any impact on recipient mortality from screened cancers remains unclear. We compared death from all cancers and from screened cancers for kidney transplant recipients versus the general population in Australia and New Zealand.

Methods: We conducted a population-based cohort study, using ANZDATA linked with Australian and New Zealand death registries, in incident kidney transplant recipients from 1980-2013. Cancers were categorised using ICD-10-AM codes. Standardised mortality ratios (SMR) were estimated using indirect standardisation.

Results: We included 17,621 recipients with 160,332 person-years (pys) of follow-up. Of 5,284 deaths, 1,063 (20.1%) were from cancer. Of cancer deaths, 293 (27.6%) were from screened cancers, including 75 colorectal, 72 renal, 69 melanoma, 26 breast, 26 liver, 19 prostate and 6 cervical. The cancer-related mortality rate was 663 per 100,000 pys (95%CI 624-704), and higher in men (727 per 100,000 pys; 95%CI 675-783). Transplant recipients were >3 times more likely to die from cancer (SMR 3.2; 95%CI 3.03-3.4) than the general population. Screened-cancer mortality rates increased with age (Figure 1A). Relative mortality (SMR) decreased in men as age increased (p<0.001), but was unchanged for women (p=0.4) (Figure 1B). Overall, screened-cancer deaths increased over time since 1980 (Figure 1C); however, the SMR remained steady (p=0.2) (Figure 1D).
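
The crude rate and SMR arithmetic follow a standard pattern: the mortality rate is deaths per person-year, and an SMR by indirect standardisation is observed deaths divided by the deaths expected if stratum-specific general-population rates applied to the cohort's person-years. A sketch using the reported totals for the crude rate; the age strata and reference rates below are invented purely for illustration:

```python
# Crude cancer-mortality rate from the reported cohort totals
deaths, person_years = 1_063, 160_332
rate_per_100k = deaths / person_years * 100_000
print(round(rate_per_100k))  # 663 per 100,000 pys, matching the abstract

# Indirect standardisation: SMR = observed / expected deaths.
# The strata below (cohort person-years, reference rate per 100,000) are
# hypothetical values for illustration only.
strata = [(60_000, 80), (70_000, 200), (30_332, 600)]
expected = sum(pys * ref_rate / 100_000 for pys, ref_rate in strata)
smr = deaths / expected
print(round(smr, 2))
```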

FIGURE 1

Conclusions: Kidney transplant recipients are at increased risk of cancer death. Cancer death rates have increased over time, but the SMR remained stable, suggesting similar changes in the general population. There was no evidence of an impact of screening on mortality for kidney transplant recipients.

THE PREVALENCE OF ACQUIRED CYSTIC KIDNEY DISEASE (ACKD) IS NOT INCREASED IN RENAL TRANSPLANT RECIPIENTS WITH RENAL TUMOURS

RHEE Handoo, TAN Ai Lin, GRIFFIN Anthony, PRESTON John, LAWSON Malcolm, and WOOD Simon

Renal Transplant Unit, Princess Alexandra Hospital, Brisbane

Aim: To determine the association between ACKD and renal malignancies in the renal transplant population.

Methods: A literature review was conducted using PubMed and CINAHL according to the PRISMA guideline, using the key words ((*transplant*[Title] AND renal*[Title]) AND (cancer*[Title] OR mass*[Title] OR malignan*[Title] OR carcinoma*[Title] OR lesion*[Title])).

Results: In 13 studies that specifically assessed the presence of ACKD in the context of renal malignancies, 59.95% (n=241/402) of patients with renal tumours had concurrent bilateral and multiple cysts in the native kidneys. There was only 1 case report of ACKD-associated clear cell RCC in the allograft. In renal transplant-population studies, 1% (n=97/9740) developed renal malignancies over 3 years. Of these 97 patients, 67 (69%) had ACKD. Most studies identified ACKD with either routine ultrasound or computed tomography, and only 4 patients presented with symptoms. Four cancer-specific deaths were reported during follow-up. Papillary renal cell carcinoma (RCC) was the most common (43%) subtype.

Discussion: Historically, ACKD has been reported in 40-60% of patients undergoing renal replacement therapy (a figure similar to the prevalence of ACKD found in patients with renal tumours). Although the presence of ACKD has been thought to be a strong predictor/harbinger of future renal tumours, this study raises the question: is ACKD a function of chronic insult to the kidneys or a cause of future malignancies? Understanding the aetiology of ACKD and associated RCC may provide insight into papillary RCC, which currently has a poor prognosis once metastatic.

COMPARISON OF TUMOUR CHARACTERISTICS IDENTIFIED IN THE ALLOGRAFT AND THE NATIVE KIDNEYS OF RENAL TRANSPLANT RECIPIENTS

TAN Ai Lin, WOOD Simon, PRESTON John, LAWSON Malcolm, GRIFFIN Anthony, and RHEE Handoo

Renal Transplant Unit, Princess Alexandra Hospital, Brisbane

Aims: The aim of this study is to determine the difference in the characteristics of renal tumours in the allograft or native kidneys of renal transplant recipients. Elucidating the subtleties may aid in the understanding of the pathophysiology behind cancer development in renal transplant recipients.

Methods: A literature review was conducted using PubMed and CINAHL according to the PRISMA guideline, using the key words ((*transplant*[Title] AND renal*[Title]) AND (cancer*[Title] OR mass*[Title] OR malignan*[Title] OR carcinoma*[Title] OR lesion*[Title])).

Results: Allograft tumours showed a higher proportion of papillary renal cell carcinoma (RCC) (37% vs 31%, p=0.00956) and less clear cell RCC (35% vs 41%, p=0.138). Lesions in the allograft were smaller, but not statistically significantly so (3.23 vs 3.97cm, p=0.433). The time to cancer development from renal transplant was, however, longer in the allograft (10.63 vs 8.63 yrs, p=0.00244). Overall, the risk of cancer recurrence and cancer-specific survival were also similar (88.4% [native] vs 90.8% [allograft]).

Conclusions: Given the similarities between the tumours identified in the allograft and the native kidneys of renal transplant recipients, similar pathophysiology may be behind the development of renal tumours. This hypothesis is supported by previous reports in which papillary RCC is slightly more dominant in the ESRF population without transplant. Other cancer-associated aetiologies such as chronic inflammation may be a consideration; for example, papillary RCC is significantly more common in this population (compared with 15% in the general population). Papillary RCC has been associated with chronic inflammation markers such as IL-8.

FIRST REPORTED CASE OF GANCICLOVIR-RESISTANT POST-TRANSPLANT CYTOMEGALOVIRUS INFECTION DUE TO COMBINED DELETION MUTATION IN CODONS 595–596 OF THE UL97 GENE

LEUNG Po Yee Mia1, TRAN Thomas2, TESTRO Adam3, PAIZIS Kathy1, KWONG Jason4, and WHITLAM John1,5,6

1Department of Nephrology, Austin Hospital, Melbourne, 2Virus Identification Laboratory, Victorian Infectious Diseases Reference Laboratory, 3Liver Transplant Unit Victoria, Austin Hospital, Melbourne, 4Department of Infectious Diseases, Austin Hospital, Melbourne, 5Department of Medicine, University of Melbourne, 6Murdoch Childrens Research Institute, Royal Children's Hospital, Melbourne

The development of antiviral resistant cytomegalovirus (CMV) infection significantly complicates management of transplant patients.

Case: We describe a 65-year-old male who developed breakthrough CMV disease (donor CMV IgG positive, recipient CMV IgG indeterminate) 30 days after combined liver-kidney transplantation for alcoholic cirrhosis and hepato-renal syndrome. After an initial complete response to treatment-dose oral valganciclovir, he developed recurrent CMV viraemia. Resistance testing revealed a UL97 mutation with in-frame deletions of codons 595–596. He was treated successfully with foscarnet and a reduction in immunosuppression. This mutation has not been previously described and was suspected to confer ganciclovir resistance, which was supported by his clinical course.

Discussion: Ganciclovir resistance occurs most commonly due to mutations in the UL97 gene or the UL54 gene, which encode a protein kinase and a DNA polymerase, respectively. The UL97-encoded protein kinase initiates the phosphorylation of ganciclovir, ultimately yielding ganciclovir-triphosphate, which competitively inhibits the viral DNA polymerase. Mutations in the UL97 gene are typically point mutations or deletions that prevent phosphorylation of ganciclovir to its active form.

Conclusion: We describe a del595-596 mutation in the CMV UL97 gene in the context of clinical treatment failure with standard and double-dose ganciclovir; successful virological control was achieved with foscarnet. Given the location of the mutation in the UL97 gene, it is likely to confer ganciclovir resistance, though recombinant phenotyping is required for confirmation.

RENAL TRANSPLANTATION IN THE ELDERLY POPULATION: SURGICAL OUTCOMES IN THE QUEENSLAND NETWORK OVER A 10 YEAR PERIOD

FADAEE Neesa, ROBERTSON Ian, RHEE Handoo, and GRIFFIN Anthony

Renal Transplant Unit, Princess Alexandra Hospital, Brisbane

Aims: To analyse the surgical outcomes of kidney transplantation in patients aged over 70 including postoperative complications and graft function 3 months post-transplantation.

Method: A retrospective analysis of a prospectively maintained database was completed. Patients aged 70 years and older who received a kidney transplant at the Princess Alexandra Hospital over a 10-year period (January 2007-December 2017) were included in the study. Data were collected through patient chart reviews, and results were analysed to determine surgical outcomes including graft function, dialysis requirements and creatinine levels three months post-transplantation.

Results: There were 46 patients included in this study, with a mean age at transplant of 71.8 years; 38 patients received a cadaveric graft. The majority of patients (n=28) had delayed graft function, with only 43.5% (n=20) reaching a normal creatinine (Cr 80–110) by 3 months. 15% (n=7) required dialysis within 72 hours post-op. The most common surgical complications were postoperative anaemia (n=5) and perinephric collection/haemorrhage (n=4). Only 2 patients required return to theatre, with 1 patient requiring a transplant nephrectomy. The two most common post-transplant medical diagnoses were osteoporosis (n=14) and type 2 diabetes (n=12).

Conclusion: An ageing population with end-stage renal disease, combined with advances in transplantation, has led to an increasing number of transplants in the over-70s population. It is crucial to identify complications of transplantation and assess their impact on graft function. Further research through a longitudinal or prospective comparative study is required.

EMPHYSEMATOUS PYELONEPHRITIS IN A DUAL KIDNEY TRANSPLANT RECIPIENT

TANGIRALA Nishanta1, SINGER Julian1, ANDERSON Lyndal2, LAURENCE Jerome3, and GRACEY David1

1Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 2Department of Pathology, Royal Prince Alfred Hospital, Sydney, 3Department of Surgery, Royal Prince Alfred Hospital, Sydney

Case: A 54-year-old diabetic man presented with a one-week history of fevers and abdominal pain. He had received a dual-kidney deceased donor transplant thirteen months prior, achieving excellent graft function at one year (serum creatinine 97μmol/L). At presentation the patient was anuric, with a serum creatinine of 518μmol/L and a white cell count of 25.32 × 10⁹/L. A non-contrast computed tomography scan revealed an enlarged right iliac fossa kidney with gas throughout the renal parenchyma and transplant renal vein, extending into the great saphenous veins (Figure 1). There was no air within the collecting system, and the second allograft was grossly normal. The patient received antibiotics and underwent urgent allograft nephrectomy of the emphysematous kidney; the second renal allograft was left in situ. Escherichia coli was grown in blood and urine cultures. Antibiotics and circulatory support were administered. Haemodialysis was required for fourteen days post-nephrectomy prior to recovery of the remaining allograft. Eight weeks post-nephrectomy the serum creatinine was 180μmol/L.

Figure

Discussion: Emphysematous pyelonephritis (EPN) is a fulminant, necrotizing infection of the renal parenchyma caused by gas-producing organisms. There have been 26 reported cases of EPN in renal allografts; however, to our knowledge this is the only reported case in a patient with dual allografts. It remains a rare but devastating complication following transplant and is associated with a high risk of mortality. Early recognition and prompt nephrectomy can be life-saving, and in this case also enabled retention of the remaining allograft, leading to dialysis-free survival.

GLP1RA SUCCESSFULLY TREATS HYPERGLYCAEMIA IN RENAL TRANSPLANT RECIPIENTS AND ENABLES SUBSTANTIAL REDUCTION IN INSULIN REQUIREMENTS AND WEIGHT

KAMESHWAR Kamya1, FOURLANOS Spiros2,3, HIDAYATI Leny1, CHEONG Jamie1, LEVIDIOTIS Vicki1, and COHNEY Solomon1,3

1Department of Nephrology, Western Health, Victoria, Australia, 2Department of Endocrine and Metabolism, Royal Melbourne Hospital, 3Faculty of Health Sciences, University of Melbourne

Background: While the range of glucose lowering therapies for diabetes mellitus (DM) has recently increased, CKD, ESKD and renal transplant recipients continue to be largely restricted to sulphonylureas and insulin, with potential for weight gain and hypoglycaemia. Of additional concern, obesity now commonly complicates ESKD and transplant management. GLP1 receptor agonists (GLP1RA) facilitate weight loss, carrying minimal risk of hypoglycaemia when used without insulin or sulphonylureas. We studied 19 renal transplant recipients receiving GLP1RA for DM predating transplantation (PEDM) or post-transplant diabetes mellitus (PTDM).

Methods: Nineteen patients taking twice-daily (BD) exenatide (Byetta) or weekly exenatide (Bydureon) were studied prospectively following renal transplantation (11 PEDM, 8 PTDM).

Results: Patients were on average 60 months post-transplant (range 7 to 120); 11 were on insulin (10 PEDM, 1 PTDM); mean weight was 87kg (range 64 to 108), mean HbA1c 7.8% (range 6.1 to 9.7), and mean creatinine 125μmol/L (range 65 to 187). After a median 10.5 months (range 3–36), 3 PEDM patients were insulin-free, while the remainder had a 60% reduction in insulin requirements (IR). Weight fell by a mean of 4kg (median −4, range −29 to +6), with weight loss greatest in those with the highest baseline IR. HbA1c fell on average by 0.1% (range −2.6 to 1.8). Significant gastrointestinal side-effects occurred in 8 patients receiving Byetta; 2 tolerated a half-maximal dose, and a third successfully converted to Bydureon.

Conclusion: GLP1RA in renal transplant recipients with diabetes enabled reduction in IR and weight, with improved glycaemic control in some patients. Further evaluation in larger randomised trials is warranted.

Figure

ONE YEAR INCIDENCE & PREVALENCE OF NEWLY DETECTED ABNORMAL GLUCOSE METABOLISM IN RENAL TRANSPLANT PATIENTS ON MAINTENANCE PREDNISOLONE AND CNI (2004–2009)

PIMENTEL AL1,2, MASTERSON R2, YATES C3,4, HUGHES P2, CAMARGO JL1,5, and COHNEY S6,7,8

1Graduate Program in Endocrinology, Universidade Federal do Rio Grande do Sul (UFRGS), 2Department of Nephrology, Melbourne Health, 3Department of Diabetes and Endocrinology, Melbourne Health, 4Department of Endocrinology, Western Health, 5Department of Endocrinology, Hospital de Clinicas de Porto Alegre (HCPA), 6Department of Nephrology, Western Health, 7Department of Medicine, University of Melbourne, 8Department of Epidemiology, Monash University, Melbourne

Aims: The reported incidence of newly diagnosed diabetes after renal transplantation (PTDM) varies widely according to definitions and immunosuppressive regimen. Data on PTDM incidence amongst patients receiving modest corticosteroid doses and lower maintenance CNI exposure are sparse. This study analysed PTDM incidence within a single-centre cohort of 534 consecutive patients transplanted from 2004 to 2009 and followed to 2017.

Methods: Patients received a single methylprednisolone pulse ≤500mg at transplant, 20 to 25 mg prednisolone for the first month tapered to 5mg by 8 to 12 weeks, and tacrolimus targeted to ≤7ng/ml after week 8 and ≤4ng/ml beyond 12 months. Analysis was based on a combination of prospectively recorded data from an electronic database, medical records and ANZDATA. Pre-existing diabetes mellitus (PEDM) and PTDM diagnosed during the first year post-transplant were based on glucose levels, HbA1c and/or use of glucose lowering therapy (GluLT).

Results: Of 534 patients, 63 had PEDM and 64 (13.6%) developed PTDM during the first year, with 83% of cases occurring in the first 3 months (Table 1). Amongst those with PTDM, 48 started GluLT - predominantly metformin, insulin and/or gliclazide. Glucose levels returned to normal in 6 patients who discontinued GluLT, leaving a PTDM prevalence of 12.3% at one year. Twelve patients had transient hyperglycaemia that normalized within days/weeks without treatment and were not considered PTDM.

TABLE 1

Conclusions: In this single-centre cohort of renal transplant recipients maintained on low-dose tacrolimus, mycophenolate and prednisolone, the 1-year incidence and prevalence of PTDM were 13.6% and 12.3% respectively, with 75% on GluLT.
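
The incidence and prevalence figures can be checked from the counts reported above (534 transplanted, 63 with pre-existing DM, 64 incident PTDM cases, 6 of whom normalised off therapy). The percentages match if the denominator is the 471 recipients without pre-existing DM, which appears to be the authors' convention:

```python
total, pedm = 534, 63
at_risk = total - pedm            # 471 recipients without pre-existing DM
incident_ptdm, resolved = 64, 6   # resolved = discontinued GluLT with normal glucose

incidence = incident_ptdm / at_risk * 100
prevalence = (incident_ptdm - resolved) / at_risk * 100
print(round(incidence, 1), round(prevalence, 1))  # 13.6 12.3, as reported
```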

METFORMIN, GLICLAZIDE AND INSULIN REMAIN THE MOST COMMONLY USED AGENTS FOR POST-TRANSPLANT DIABETES (PTDM) IN A COHORT OF RENAL TRANSPLANT RECIPIENTS

PIMENTEL AL1,2, MASTERSON R2, YATES C3,4, HUGHES P2, and COHNEY S5,6,7

1Graduate Program in Endocrinology, Universidade Federal do Rio Grande do Sul (UFRGS), 2Department of Nephrology, Melbourne Health, 3Department of Diabetes and Endocrinology, Melbourne Health, 4Department of Endocrinology, Western Health, 5Department of Nephrology, Western Health, 6Department of Medicine, University of Melbourne, 7Department of Epidemiology, Monash University, Melbourne

Aims: Given the paucity of literature on treatment of PTDM, this study was undertaken to analyse glucose lowering therapy (GluLT) usage in a sizeable cohort of PTDM patients.

Methods: Review of all renal transplant recipients transplanted between December 2004–2009 followed to December 2017 using data collected prospectively from an electronic database and medical records. HbA1c, glucose levels and/or use of hypoglycemic therapy identified patients with PTDM.

Results: Amongst 534 patients, 86 developed PTDM, with 59 commencing GluLT: 3 insulin monotherapy, 11 metformin, 12 gliclazide. An additional 29 received metformin in combination with other therapies (including 14 taking insulin), 3 insulin in combination with other drugs, and 1 patient gliclazide and linagliptin. Glucose metabolism normalised by 12 months in 6 patients, and GluLT was discontinued. 23/40 patients commencing metformin remained on it at the end of follow-up; cessation of metformin was due to resolution of PTDM in 4 patients, deterioration of renal function in 1, gastrointestinal symptoms in 2, and uncertain reasons in 9 (1 lost to follow-up). No cases of lactic acidosis were reported. Eleven patients commenced newer GluLT (6 GLP1 receptor agonist, 3 SGLT2i, 7 DPP4i). There was no indication of any difference in outcome according to GluLT.

Conclusions: Insulin, metformin and gliclazide were the commonest glucose lowering therapies prescribed, with intolerance to metformin uncommon. Prospective studies on GluLT are needed, preferably with outcome data and randomized if possible.

Back to Top | Article Outline

ABSENT SMOOTH MUSCLE ACTIN IMMUNOREACTIVITY OF THE SMALL BOWEL MUSCULARIS PROPRIA CIRCULAR LAYER – A NOVEL FINDING IN A DYSMOTILE INTESTINAL ALLOGRAFT

HARDIKAR Winita1, BOLIA Rishi1, STARKEY Graham2, TESTRO ADAM2, HOLMES Kathe1, MURPHY Samantha1, and JONES Robert2

1Department of Gastroenterology, Royal Children's Hospital, Melbourne, 2Department of Liver Transplantation, Austin Hospital, Melbourne

Background: Intestinal dysmotility leading to poor graft function is a recognized complication of intestinal transplantation; however, the causes are poorly understood.

Aim: To describe a novel potential cause of dysmotility in an intestinal graft.

Case report: A 14-year-old male underwent combined liver/intestinal transplant for Hirschsprung's disease in March 2012. Three years post-transplant he developed recurrent episodes of increased stoma losses, nausea and abdominal pain requiring repeated hospital admissions. Rejection, viral infection, bacterial overgrowth and PTLD were excluded. Exploratory laparotomy revealed no stricture or obstruction. Around 5 cm of the proximal end of the donor jejunum was resected and a Roux loop jejunojejunostomy was created. The Roux loop was pulled up to create a gastrojejunostomy. This procedure relieved the symptoms. Histopathology of the resected allograft specimen showed an abrupt segmental disappearance of the inner, circular layer of the muscularis propria in the donor jejunum. Immunohistochemical staining of multiple blocks showed almost complete loss of staining for smooth muscle actin in the inner, circular layer of the muscularis propria, with preservation in the outer layer. This finding has been described in chronic intestinal pseudo-obstruction and in megacystis-microcolon. Occurrence of these changes in the jejunum, as seen in our patient, is pathological.

Conclusion: We present a novel finding of acquired loss of alpha-smooth muscle actin immunostaining in circular muscle myocytes in an intestinal allograft which may be a cause of dysmotility. The cause of this acquired change is uncertain but immune mediated destruction of enteric myocytes by host immune cells is worthy of further exploration.

Back to Top | Article Outline

INTRACTABLE ASCITES FOLLOWING RENAL TRANSPLANT IN AUTOSOMAL DOMINANT POLYCYSTIC KIDNEY DISEASE PATIENTS WITH MASSIVE POLYCYSTIC LIVER

MARUI Yuhji1, FUJIMOTO Eisuke1, AIDA Koichiro1, SASAKI Hideo1, KOIZUMI Satoshi2, OTSUBO Takehito2, and CHIKARAISHI Tatsuya1

1Department of Urology, St Marianna University School of Medicine, 2Division of Gastro-enterological and General Surgery, St Marianna University School of Medicine

Aim: To consider the mechanism of the intractable ascites that developed in renal transplant (RTx) recipients with autosomal dominant polycystic kidney disease (ADPKD) and massive polycystic liver.

Case 1: A man in his 60s developed massive ascites 3 years after RTx despite bilateral shrinkage of the native kidneys. MRI revealed multiple liver cysts distributed throughout all hepatic segments, and extrinsic compression of the inferior vena cava (IVC) (Figure). The ascites became symptomatic and resistant to medical treatment with diuretics and paracentesis. As surgical revision was unsuitable, he underwent peritoneovenous shunt placement. Thereafter the ascites was managed with minimal diuretics, with durable relief of symptoms, and renal function improved.

Figure

Case 2: A man in his 50s with ADPKD, who had undergone pre-emptive RTx and simultaneous left nephrectomy 4 years before, developed massive ascites following acute colitis. MRI revealed a massively enlarged polycystic liver and severe stenosis of the intrahepatic part of the IVC. As the ascites became refractory to medical treatment, he underwent hepatic resection and cyst fenestration. After these procedures the ascites resolved, followed by improvement of renal graft function.

Discussion: In these recipients, in addition to enlarging multiple hepatic cysts, the loss of the supporting effect of bilateral renal enlargement buttressing up the liver might have allowed the liver to shift inferiorly, resulting in severe stenosis of the intrahepatic part of the IVC. In case 2, colitis-induced dehydration and the consequent reduction of intracaval blood flow might have led to critical stenosis of the IVC. The improvement of graft function following intervention was noteworthy.

Back to Top | Article Outline

EARLY REMOVAL OF JJ STENTS IN RENAL TRANSPLANT RECIPIENTS: A PILOT STUDY OF FEASIBILITY AND SAFETY

JAMBOTI Jagadish1,2,3, BHANDARI Mayank1,4, GODDARD Kim1,4, NAVADGI Suresh1,4, SWAMINATHAN Ramyasuda1,2, ABRAHAM Abu1,2, IRISH Ashley1,2,4, TAN Andrew1,4, PUTTAGUNTA Harish1, O'BRIEN Orla1, WARGER Anne1, STINNETTE Megan1, and PRADABHAN Salivahane1

1Renal Transplant Unit, Fiona Stanley Hospital, 2School of Medicine & Pharmacology, University of Western Australia, 3School of Medicine, Notre Dame Medical School, Fremantle, 4WA Liver & Kidney Transplant Service

Background: JJ stents are routinely inserted intra-operatively in renal transplant recipients (RTR) to minimize early post-surgical complications including urinary leak and ureteric obstruction. Usual clinical practice is to remove the JJ stents cystoscopically at 4–6 weeks. However, the optimum duration of stenting has not been established. Stent placement can cause dysuria, urinary tract infections, and ureteric epithelial ulceration and inflammation with longer dwell times. Use of JJ stents has been reported to be a risk factor for developing BK virus infection. Studies have looked at reducing the complications of JJ stents by early removal.

Methods: After extensive literature review and discussions, we initiated an early JJ stent removal protocol in RTR. Intra-operatively, the JJ stent is attached to the indwelling catheter (IDC) by tying the distal end of the JJ stent string to the IDC. The IDC and JJ stent are then removed together at the bedside on post-transplant day 5.

Results: In the 4 RTR trialled so far (3 males; age 17–61 years), early stent removal has been achieved satisfactorily, with excellent patient acceptance and no clinical or imaging evidence of any complications. Two RTR received kidneys from live donors and 2 from deceased donors.

Conclusions: In this ongoing safety and feasibility study, we aim to document the safety and efficacy of early removal of JJ stents in a larger cohort. We envisage a prospective randomised study to document the efficacy of this approach in preventing major urological complications and reducing BKV infection in RTR.

Back to Top | Article Outline

OUTCOMES OF EARLY URETERIC STENT REMOVAL IN PAEDIATRIC KIDNEY TRANSPLANTATION

NG Zi Qin1, and HE Bulang1,2

1WA Liver & Kidney Transplant Service, Sir Charles Gairdner Hospital, Perth, 2Department of Surgery, University of Western Australia, Perth

Introduction: In kidney transplantation, the timing of ureteric stent removal is debatable; usually the stent is removed at 4–6 weeks post-surgery by cystoscopy. The aim of this study is to review the outcomes of early removal of the ureteric stent simultaneously with removal of the indwelling urethral catheter in paediatric transplant recipients.

Methods & Materials: A retrospective review was performed of all paediatric transplant recipients from 2009 to 2017. The refinement to kidney transplantation was that the end of the ureteric stent was connected to the tip of the indwelling urethral catheter (IDC) by a suture-string, so the stent could be removed concurrently at the time of IDC removal 5–7 days after transplantation. Data on demographics, episodes of infection, urological complications and kidney function were collected for analysis.

Results: Overall, there were 28 cases of paediatric kidney transplantation, in recipients aged 2 to 18 years (median 10.5). There were 23 male and 5 female patients. Twenty-six patients had early stent removal. There were no cases of urine leakage. One recipient developed distal ureteric stenosis, which resolved after interventional balloon dilatation. Two cases developed BK nephropathy 4–6 months post-transplantation. One case had a urinary tract infection (<3 months).

Conclusion: Early removal of ureteric stent is safe and feasible without increasing the risk of urological complications. It helps significantly in cost-saving and improves patient quality of life.

Back to Top | Article Outline

PARTIAL VERSUS COMPLETE THROMBOSIS MODERATED BY INTRA-OPERATIVE VASOPRESSOR USE IN SPK PATIENTS

SHAHRESTANI Sara1,2, HORT Amy3, SPIKE Erin3, GIBBONS Thomas1, HITOS Kerry3, ROBINSON Paul4, KABLE Kathy4, LAM Vincent3, DE ROO Ronald3, YUEN Lawrence3, PLEASS Henry3, and HAWTHORNE Wayne2

1Western Clinical School, University of Sydney, 2Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 3Department of Surgery, Westmead Hospital, Sydney, 4Department of Renal Medicine, Westmead Hospital, Sydney

Aims: Simultaneous pancreas-kidney (SPK) transplantation is the gold standard treatment for patients with type 1 diabetes and end stage renal failure. Thrombosis is a devastating complication of SPK that can result in graft loss and return to theatre for pancreatectomy.

Methods: We reviewed 235 SPKs performed at Westmead Hospital over the past decade (2008–2017). We examined donor and recipient risk factors and the characteristics of thrombosis in order to ascertain the clinical course for patients.

Results: Forty-one patients (17.4%) experienced a thrombosis. In 85% (35/41) of cases, the thrombosis occurred early, in the first 6 weeks following transplantation. The majority of thromboses (68%, n=28/41) were venous. Importantly, thrombosis associated with graft loss and pancreatectomy occurred in fewer than half of the patients with a thrombosis (n=17; 7.2% of all SPKs). Graft loss was strongly associated with the use of intraoperative vasopressors: 71% (n=12/17) of the patients who lost their graft required intraoperative vasopressors, while only 46% (n=11/24) of those with partial thrombosis required this intervention.

Conclusion: While graft thrombosis is a devastating complication of SPK transplantation that can lead to graft loss, it is reassuring that fewer than half of the grafts that thrombose are lost and require return to theatre for pancreatectomy. A strong risk factor for thrombosis leading to graft loss is the use of intra-operative vasopressors, leading us to believe that careful management of blood pressure may be key to reducing the devastating outcomes of this not uncommon complication.

Back to Top | Article Outline

MORTALITY RATES IN LIVING KIDNEY DONORS: AN AUSTRALIAN AND NEW ZEALAND COHORT STUDY USING DATA LINKAGE

DE LA MATA Nicole1, CLAYTON Philip2,3, MCDONALD Stephen4,5,2, CHADBAN Steven2,6,7, POLKINGHORN Kevan8,9, and WEBSTER Angela1,10

1Sydney School of Public Health, University of Sydney, 2ANZDATA, 3Faculty of Health Sciences, University of Adelaide, 4Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 5Department of Medicine, University of Adelaide, 6Transplantation Services, Royal Prince Alfred Hospital, Sydney, 7Sydney Medical School, University of Sydney, 8Department of Epidemiology and Preventative Medicine, Monash University, Melbourne, 9Department of Nephrology, Monash Medical Centre, Melbourne, 10Centre for Transplant and Renal Research, Westmead Hospital, Sydney

Aims: Living kidney donors are a highly selected group who could be expected to have better than average life expectancy. We aimed to compare deaths in living kidney donors with the general population.

Methods: We included all living kidney donors in Australia and New Zealand from 1996 to 2013. For donors who died, we established the primary cause of death using data linkage between the Australian and New Zealand Living Kidney Donor Registry and the national death registries (Australia, 1996–2013; New Zealand, 2003–2013). Standardized mortality ratios (SMR) were estimated using indirect standardization.

Results: Among 3,374 living kidney donors, there were 35 deaths in 22,551 person-years (pys) of follow-up. The most common cause of death was cancer (n=19), followed by coronary heart disease (n=3) and accidental death (n=3). Donors who had died were generally older than those still alive, with a median age of 61 years [IQR: 57–64]. The crude mortality rate was 148 (95% CI: 62–357) per 100,000 pys during the first year from donation, increasing to 187 (95% CI: 70–498) per 100,000 pys at 5 years since donation. The overall SMR was 0.32 (95% CI: 0.23–0.45); that is, living kidney donors had 68% fewer deaths than expected in the general population of the same age and sex. There were few differences in SMR by sex or age (Fig. 1).
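
For intuition, the SMR arithmetic behind these results can be sketched as follows. This is illustrative only, not the authors' code: the age- and sex-stratified population rates used for indirect standardization are not reported in the abstract, so the expected death count is back-calculated from the reported SMR.

```python
# Illustrative sketch of indirect standardization using the abstract's
# summary figures (not the study's actual stratified calculation).

observed_deaths = 35
person_years = 22_551

# Crude all-cause mortality rate per 100,000 person-years of follow-up
crude_rate = observed_deaths / person_years * 100_000  # ~155 per 100,000 pys

# Indirect standardization: SMR = observed / expected, where "expected" is
# obtained by applying general-population age/sex-specific rates to the
# donors' person-time. Those rates are not in the abstract, so here the
# expected count is back-calculated from the reported SMR of 0.32.
smr = 0.32
expected_deaths = observed_deaths / smr  # ~109 deaths expected

print(round(crude_rate, 1), round(expected_deaths, 1))
```

An SMR of 0.32 therefore corresponds to roughly 109 expected deaths against 35 observed, i.e. 68% fewer deaths than expected.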

FIGURE 1

Conclusion: All-cause mortality was significantly lower among living kidney donors compared to the general population, with no evidence of increased deaths from any cause.

Back to Top | Article Outline

POST-TRANSPLANT ACUTE KIDNEY INJURY AFFECTS LONG-TERM GRAFT FUNCTION

PRAKASH MP1, ZHUO Tally1, HEDLEY James2, WEBSTER Angela1, and ROGERS Natasha1,3

1Westmead Clinical School, Westmead Hospital, Sydney, 2School of Public Health, University of Sydney, 3Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney

Background: Acute kidney injury (AKI) is a common clinical condition affecting the hospitalized population, and is associated with significant morbidity and mortality. Kidney transplant recipients are predisposed to AKI due to surgical complications, medication toxicity and susceptibility to infection. However, existing literature on the frequency of AKI and its impact on graft survival is limited.

Aims: The aims of this study were to look at the incidence, aetiology and outcome of AKI in a kidney transplant population.

Methods: A retrospective cohort analysis was undertaken of recipients of a kidney or simultaneous pancreas-kidney (SPK) transplant. Patients were transplanted at Westmead Hospital between 2000 and 2017 and had graft survival >6 months. Baseline demographic data and information regarding post-transplant AKI were collected from recipient medical records. Episodes of AKI were defined as a >25% elevation in creatinine not caused by rejection or BK virus. The primary outcome was graft dysfunction by measured glomerular filtration rate (GFR).

Results: We identified 1411 transplant recipients matching the initial inclusion criteria, of whom 392 had follow-up. Two hundred and six recipients had at least one episode of AKI (mean 2.1±1.5), with the first episode occurring after a mean of 28±50 months. Compared to those with no AKI, AKI in the first year was associated with a 9.7 ml/min decrease in GFR within 1 year (95% CI 5.1–14.3 ml/min, p<0.001). Conversely, 11 patients (5%) demonstrated no deleterious effect of AKI on GFR.

Conclusion: AKI following kidney transplantation is common and is associated with deleterious changes in GFR.

Table

Back to Top | Article Outline

RESIDUAL RISK OF BLOOD BORNE VIRUS INFECTION WHEN AUSTRALIAN ORGAN DONOR REFERRALS TEST NEGATIVE: A SYSTEMATIC REVIEW AND META-ANALYSIS

WALLER Karen1, DE LA MATA Nicole2, WYBURN Kate3,2, KELLY Patrick2, VIDIYA Ramachandran4, RAWLINSON William4, and WEBSTER Angela2,5

1School of Public Health, 2School of Public Health, University of Sydney, 3Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 4Serology and Virology Division, South Eastern Area Laboratory Services (SEALS) Pathology, Prince of Wales Hospital, Sydney, 5Centre for Transplant and Renal Research, Westmead Hospital, Sydney

Introduction: Donor referrals with increased-risk behaviours who test negative for blood borne viruses are currently submitted to recipient teams as suitable organ donors. However, window period infections (infections in the period between exposure and tests becoming positive) may pose a risk to recipients unless suitable screening is undertaken.

Aim: To estimate the prevalence and incidence of hepatitis B (HBV), hepatitis C (HCV) and human immunodeficiency virus (HIV) among increased-risk groups in Australia, and hence infer the residual risk of window period infection for organ donors with negative testing.

Methods: We performed a systematic review and meta-analysis including studies 2000–2017 reporting original estimates of Australian HIV, HCV or HBV prevalence or incidence in increased risk groups. Pooled prevalence and incidence rates were estimated using random effects. The probability of window period infection was estimated by assuming days since infection followed an exponential distribution.
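
The window-period risk calculation described in the Methods can be sketched as below. This is an illustrative reconstruction, not the authors' code, and the incidence and window-period values are hypothetical examples rather than the study's pooled estimates.

```python
import math

# Hypothetical sketch: residual risk that a donor testing negative was in
# fact infected within the assay's window period, assuming days since
# infection follow an exponential distribution (as stated in the Methods).

def window_period_risk(incidence_per_100k_py: float, window_days: float) -> float:
    """Probability of an undetected window-period infection, given an
    incidence rate (per 100,000 person-years) and a test window (days)."""
    lam = incidence_per_100k_py / 100_000          # infections per person-year
    # P(time since infection <= window) under an exponential model
    return 1 - math.exp(-lam * window_days / 365)

# Example (hypothetical values): a high-incidence group at 10,000 per
# 100,000 person-years and a ~51-day antibody (EIA) window period.
risk = window_period_risk(10_000, 51)
print(round(risk * 10_000))  # undetected infections per 10,000 donors
```

Shortening the effective window (e.g. by adding NAT, which detects infection earlier than EIA) shrinks `window_days` and so drives this residual risk down, which is the mechanism behind the conclusion.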

Results: We included 55 studies (353,846 participants), with most data available for MSM (men who have sex with men), IVDU (intravenous drug users) and prisoners, and for HIV and HCV. The absolute residual risk of HIV infection remained low in all cases; the highest was in MSM, with up to 9 window period infections per 10,000 not detected by negative enzyme immunoassay (EIA) testing alone (Table 1). HCV and HBV incidence was highest in IVDU, with up to 158 window period HCV cases per 10,000 people not detected by EIA testing alone.

TABLE 1

Conclusions: BBV risk estimates inform decisions about increased-risk donor referrals. Negative nucleic acid testing (NAT) substantially reduces window period risks.


Back to Top | Article Outline

Regulatory T cells

HIGH FIBRE DIET INDUCES DONOR SPECIFIC TOLERANCE OF KIDNEY ALLOGRAFT THROUGH SHORT CHAIN FATTY ACID INDUCTION OF TREGS

WU Huiling1,2, KWAN Tony2, LOH Yik Wen2, WANG Chuanmin1,2, MACIA Lanrence3, ALEXANDER Stephen4, and CHADBAN Steven1,2

1Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 2Kidney Node Lab, The Charles Perkins Centre, University of Sydney, 3Nutritional Immunometabolism Lab, The Charles Perkins Centre, University of Sydney, 4Department of Nephrology, The Children's Hospital at Westmead, Sydney

Aim: To investigate the impact of a high fibre diet (HFD) or dietary supplementation with sodium acetate (SA) on kidney allograft rejection and survival in mice.

Methods: Life-sustaining kidney transplants were performed: B6 to B6 isografts, and BALB/c to B6 WT or B6 GPR43−/− mice as allografts. Mice were fed HFD for two weeks prior to and throughout the experiments (Allo+HFD), or received SA 200 mg/kg ip for 14 days post-transplantation then SA 150 mM solution orally (Allo+SA; GPR43−/−+SA). Allograft controls received normal chow only (Allo). To deplete CD4+CD25+ cells, selected groups received anti-CD25 mAb (PC61).

Results: HFD preserved renal allograft function and prolonged allograft survival compared to control allografts (Figure 1, p<0.01). HFD increased the release of SCFAs, particularly acetate. Similarly, Allo+SA allografts were protected from both acute and chronic allograft rejection, with better renal function (p<0.05), less tubulitis (p<0.001) and increased CD4+Foxp3+ Treg accumulation at day 14 post-transplant, and improved renal function (p<0.05) and less proteinuria (p<0.001) at day 100 post-transplant versus control allografts. Allo+SA allografts exhibited superior survival to control allografts (Fig. 1, p<0.05) due to the development of donor antigen-specific tolerance, confirmed by acceptance of donor-strain but rejection of third-party skin grafts. The survival benefit conferred by SA was abrogated by depletion of CD25+ Tregs (p<0.05). SA treatment was ineffective in GPR43−/− allograft recipients (GPR43−/−+SA, p<0.05).

Figure

Conclusions: HFD or supplementation with SA induced donor specific kidney allograft tolerance in a fully MHC mismatched murine model of kidney allograft rejection. Tolerance was dependent on a CD4+CD25+FoxP3+ regulatory mechanism. GPR43 is required for the molecular action of SA induced donor specific tolerance of kidney allografts.

Back to Top | Article Outline

IN VITRO EVALUATION OF HUMAN REGULATORY T-CELLS IN A 3D-PRINTED STRUCTURE

KIM Juewan1, YUE Zhillian2, LIU Xiao2, HOPE Christopher3, ROJAS-CANALES Darling4,5, DROGEMULLER Christopher4,5, CARROLL Robert4,5, BARRY Simon C7, WALLACE Gordon G2, and COATES P. Toby4,5

1The Department of Molecular & Cellular Biology, The School of Biological Sciences, The Faculty of Sciences, University of Adelaide, 2Intelligent Polymer Research Institute, ARC Centre of Excellence for Electromaterials Science, AIIM Facility, University of Wollongong, 3The Department of Paediatrics, Women's and Children's Hospital, 4 Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 5School of Medicine, Faculty of Health Sciences, University of Adelaide, 6Molecular Immunology Group, Robinson Research Institute, University of Adelaide

Introduction: 3D bioprinting allows for the fabrication of complex 3D architectures. 3D bioprinting of regulatory T-cells (Tregs) with islets may overcome inherent immunosuppressive failings in islet transplantation. This project aims to investigate the viability and functionality of bioprinted Tregs and to evaluate the effect of hydrogel modification with IL-2.

Method: Natural Tregs (nTregs) were isolated from human blood by FACS. Induced Tregs (iTregs) were induced from naïve CD4+ T-cells. These cells were either suspended in media ('non-printed') or printed in a disc structure with an alginate-gelMA hydrogel, then photo-crosslinked (400 nm) and chemically crosslinked with CaCl2. These 'printed' cells were recovered by enzymatically dissolving the discs. Viability and Treg functional markers were quantified by flow cytometry using propidium iodide and anti-LAP, CD69, CD39 and CTLA-4 antibodies.

Results: At day 1, the viability of nTregs decreased by 7% (p<0.0001) and 9% (p<0.0001), while iTreg viability decreased by 4% (p=0.0042) and 6% (p<0.0001), with and without IL-2 respectively, compared to non-printed controls. At day 3, modification with IL-2 significantly improved the viability of printed Tregs by 15% (nTreg, p=0.003) and 29% (iTreg, p<0.0001). Furthermore, no decrease in LAP, CD69, CD39 or CTLA-4 expression was observed upon printing.

Conclusion: Firstly, our data suggest that Tregs can be safely bioprinted with minimal impact on viability or functional marker expression. Secondly, we demonstrate that hydrogel modification with IL-2 has a positive impact on the survival of bioprinted Tregs. Finally, this study serves as proof of principle for the capacity of immune cells to survive within printed hydrogel constructs.

Back to Top | Article Outline

Organ Donation and Ethics

CHARACTERISING FAMILY REFUSALS IN SOLID ORGAN DONATION (CREDO) STUDY

SKLIROS Christopher1, DE LA MATA Nicole2, HEDLEY James2, WYBURN Kate1,3, O'LEARY Michael4,5, and WEBSTER Angela2,6

1Sydney Medical School, University of Sydney, 2Sydney School of Public Health, University of Sydney, 3Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 4NSW Organ and Tissue Donation Service, 5Intensive Care Service, Royal Prince Alfred Hospital, Sydney, 6Centre for Transplant and Renal Research, Westmead Hospital, Sydney

Background: Understanding the reasons for and characteristics that differentiate between families that consent and refuse organ donation may provide insights into strategies to reduce family refusals.

Aim: To evaluate the characteristics of organ donor referrals whose family refused consent to donation.

Methods: We included all solid organ donor referrals captured by the NSW Organ and Tissue Donation Service (OTDS), 2010–2016. We used descriptive statistics to summarise and compare characteristics of organ donors according to family consent status. Characteristics included donor age, sex, religion, socioeconomic background, ethnicity, reason for family refusal and referring hospital.

Results: There were 3,824 organ donor referrals; consent was sought for 1,927 referrals (Table 1). Nearly half of those for whom family consent was not sought were aged 65 years or older. Family consent was refused for 831 referrals; the most common reason was the family believing the patient did not want to donate (n=178), followed by the family not being prepared to wait (n=77) and the family being unaware of the patient's wishes (n=57). The majority of referrals whose family consented were of Caucasian background (83%), compared to 56% of referrals whose family refused. Of the families that provided consent, 34% of referrals were listed as having no religion, compared to 21% of families that refused consent. However, religion and ethnicity were only routinely collected from 2014.

Conclusions: There is potential to increase organ donors in NSW by reducing family refusals. Discussing organ donation preferences with family members and better understanding cultural or religious barriers may assist in reducing family refusals.

FIGURE 1

TABLE 1

Back to Top | Article Outline

INTRODUCTION OF SHARE 35 INTERREGIONAL ALLOCATION FOR HIGH MELD LIVER TRANSPLANT WAITING LIST PATIENTS IN AUSTRALIA AND NEW ZEALAND

FINK Michael1,2, GOW Paul2, BALDERSON Glenda3, and JONES Robert2,1

1Department of Surgery, University of Melbourne, 2Liver Transplant Unit Victoria, Austin Hospital, Melbourne, 3Australia and New Zealand Liver Transplant Registry

Aims: Patients with high MELD scores awaiting liver transplantation have a high risk of waiting list mortality and a short window of opportunity for rescue. A voluntary trial of sharing of livers between units in Australia and New Zealand for patients with a MELD score ≥ 35 (Share 35) was undertaken. The aim of this study is to assess the impact of the trial on waiting list mortality.

Methods: The waiting list mortality rate of patients whose MELD score reached 35 prior to commencement of the Share 35 trial ("Share 35 candidates") was compared with that of patients listed as Share 35 patients during the period of the trial ("Share 35 listed") using the Chi-square test. Post-transplant survival of the two groups was compared using Kaplan-Meier curves with the log-rank test.

Results: During the 21-month period of the trial, 24 patients were Share 35 listed, of whom 13 were transplanted with a shipped liver, eight were transplanted with a local donor liver and three died waiting. The waiting list mortality rate of Share 35 listed patients (3 of 24, 13%) was significantly less than that of Share 35 candidates (13 of 27, 48%, P = 0.006). Post-transplant survival was not significantly different between the groups (P = 0.420).
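
The reported waiting-list mortality comparison can be reproduced approximately with a Pearson chi-square test on the 2x2 table of deaths versus survivors. This sketch is not the authors' code; it implements the uncorrected Pearson statistic in pure Python, which matches the reported P = 0.006.

```python
import math

# Waiting-list deaths: 3 of 24 Share 35 listed vs 13 of 27 Share 35
# candidates, compared with a Pearson chi-square test (no continuity
# correction), df = 1.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic and p-value for the 2x2 table
    [[a, b], [c, d]] (rows = groups, columns = died / survived)."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        exp = row * col / n
        stat += (obs - exp) ** 2 / exp
    # Survival function of chi-square with 1 degree of freedom
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

stat, p = chi_square_2x2(3, 21, 13, 14)  # died, survived per group
print(round(stat, 2), round(p, 3))       # p rounds to 0.006, as reported
```

With Yates' continuity correction the p-value would be larger (~0.015), so the reported figure corresponds to the uncorrected statistic.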

Conclusions: The introduction of Share 35 to Australia and New Zealand has improved access to liver transplantation for a group of patients who were previously at high risk of waiting list death, without adversely affecting utility.

Back to Top | Article Outline

RISK INDICES IN DECEASED DONOR ORGAN ALLOCATION FOR TRANSPLANTATION: REVIEW FROM AN AUSTRALIAN PERSPECTIVE

LING Jonathan1,2, FINK Michael3, WESTALL Glen4, MACDONALD Peter5, CLAYTON Philip6, OPDAM Helen7, HOLDSWORTH Rhonda8, POLKINGHORNE Kevan1, and KANELLIS John1

1Department of Nephrology, Monash Medical Centre, Melbourne, 2School of Medicine, Faculty of Health Sciences, Monash University, Melbourne, 3General and Hepato-Pancreato-Biliary Surgery, Austin Hospital, Melbourne, 4Alfred Hospital, Melbourne, 5St Vincent's Hospital, Sydney, 6Department of Nephrology, Royal Adelaide Hospital, 7Organ and Tissue Authority, 8National Laboratory Manager, Australian Red Cross Blood Service

Organ donation and transplantation rates have recently increased both worldwide and in Australia. Concurrently, the Australian software used for donor and recipient data management (NOMS) is being rebuilt as OrganMatch, and organ allocation processes are consequently being reviewed. Worthwhile capabilities of the new software would include the ability to use risk indices to guide organ allocation and help streamline transplantation decisions. Risk indices comprising donor, recipient and transplant factors play an important role in organ allocation policies worldwide by assimilating pertinent data to help guide transplant clinicians.

Aims: To identify risk indices in use worldwide and contrast their use abroad with current Australian organ allocation policies.

Methods and Results: We reviewed risk indices used in organ allocation policies worldwide for kidney, liver, heart, lung and pancreas organs and their predictive capacity for post-transplant outcomes. We collated the Australian organ allocation policies for these organs and have noted the use of similar risk indices where available. Significant donor, recipient and transplant factors used in the scores were summarised.

Conclusions: Risk indices, when used together with other clinical information, can assist the organ allocation process. They should not be used in isolation to make decisions regarding transplantation. Very few risk indices are currently part of the Australian organ allocation process, and any risk index derived abroad needs to be validated in an Australian cohort before use. Modifying or adding variables in a risk index might provide an easier way to update organ allocation policies in the future.

Back to Top | Article Outline

WHAT HAPPENED WHEN THE ‘SOFT OPT-OUT’ TO ORGAN DONATION WAS IMPLEMENTED IN WALES? FAMILY AND PROFESSIONAL VIEWS AND EXPERIENCES, AND CONSENT RATES FOR THE FIRST 18 MONTHS.

NOYES Jane1,2,3, MC LAUGHLIN Leah1,2, MORGAN Karen4, WALTON Phil5, ROBERTS Abigail6, and STEPHENS Michael7

1School of Social Sciences, Bangor University, 2Wales Kidney Research Unit, Bangor University, 3National Centre for Population Health and Well being Research, 4Major Health Conditions Policy Team, Welsh Government, 5Department of Organ Donation, NHS Blood and Transplant, 6North West Regional Office, Liverpool, UK, NHS Blood and Transplant, 7Department of Nephrology and Transplantation, Cardiff and Vale University Health Board, University Hospital of Wales, Cardiff, UK.

Introduction: On 01.12.15 Wales introduced a 'soft opt-out' system of organ donation.

Methods: Co-productive, mixed-methods study in partnership with National Health Service Blood and Transplant and patient and public representatives. Data were collected on all 211 approaches between 01.12.15 and 31.05.17; 182/211 deceased patients came under the Act. In-depth data (62 interviews with 85 family members, plus questionnaires) were collected on 60 patients who were potential/actual organ donors, together with 2 focus group/individual interviews with 19 NHS BT professionals (Figure 1). Organ Donor Register (ODR) activity was monitored.

Results: Welsh consent rates increased by around 10% to 61%, and to 64% when family consent was removed. This was higher than in England and reversed an unexplained drop to 48.5% before implementation. However, family members still overrode the patient's organ donation decision in 31/205 cases. Consent was deemed in 46/205 cases, with a consent rate of 61%. The Act provided a useful framework, but family members did not fully understand deemed consent. Negative personal views of organ donation and health system issues affected support for organ donation. The media campaign missed the changed role of the family: that they were no longer the decision maker about organ donation. ODR 'opt-outs' were 6%, less than anticipated.

Discussion: The media campaign mostly worked but was not memorable and had gaps. More work is needed to inform families about their changed role. Scotland and England are now consulting on a move to an 'opt-out' system. As a result of this study, the Welsh Government commissioned a new campaign, launched 01.11.17.

Figure

Back to Top | Article Outline

PROMOTING DECEASED DONOR ORGAN TRANSPLANTATION IN VIETNAM: WHERE TO START?

ALLEN Richard1,2, PLEASS Henry3,4, KABLE Kathy5, ROBERTSON Paul6, MACKIE Fiona7, THOMAS Gordon8, SINH Tran Ngoc9, PHAM GIA Khanh10, and TRUONG Nguyen11

1Westmead Clinical School, University of Sydney, 2Transplantation Services, Royal Prince Alfred Hospital, Sydney, 3Department of Surgery, University of Sydney, 4National Pancreas Transplant Unit, Westmead Hospital, Sydney, 5Renal & Transplantation Unit, Westmead Hospital, Sydney, 6Renal Transplant Unit, Westmead Hospital, Sydney, 7Department of Pediatrics, Prince of Wales Hospital, Sydney, 8Department of Surgery, Sydney Children's Hospital, 9Renal Transplant Unit, Cho Ray Hospital, 10Hanoi Medical University, Vietnam, 11Transplantation Services, Cho Ray Hospital, HCMC, Vietnam

Aim: Vietnam is a developing country of 93,000,000 people with a central government and wide disparities in wealth, education and healthcare. We review organ transplant activity and describe challenges for deceased organ donation (DD).

Materials and Methods: The national registry relies on self-reporting from 17 kidney, 3 liver and 3 heart units.

Results: 2,249 living donor (LD) transplants, 174 DBD transplants and 3 DCD transplants have been reported. No child has received a deceased adult organ. Only 3 kidney centers presented data at the 2017 Vietnamese Society of Transplantation (VSOT) meeting, reporting unrelated kidney LD activity of 6.3% (Figure), 71.4% and 85.7% respectively, with the latter two relying on police determination that unrelated donors were not rewarded. Barriers to DD exist despite DD legislation and >12,000 annual head-injury deaths. Brain death diagnosis is complex. Family consent for DD is impeded by immense clinical pressures and limited resources. Requests are cursory, without consensus for organ allocation. Results are not published. Wait-listing with stored sera and an environment of trust between ICU and transplant surgeons do not exist. Transplant training from Europe and SE Asia is based on surgical skills for elective procedures. Careers for transplant physicians and nurses in recipient preparation and long-term care are limited.

Conclusions: Potential exists to improve DD activity with the simultaneous introduction of: 1. cost-effective local resourcing of ICUs; 2. transparent allocation guidelines and waiting-list criteria led by VSOT (±TSANZ); and 3. co-ordinated international hospital partnerships. Subsequent growth of heart, liver and paediatric transplantation will enhance community and donation-sector appreciation of DD.

FIGURE 1



THE WEEKEND EFFECT: AN AUSTRALIAN COHORT STUDY ANALYSING TEMPORAL TRENDS IN SOLID ORGAN DONATION

WEBSTER Angela1,2, HEDLEY James1, CHANG Nicholas1, ROSALES Brenda1, WYBURN Kate3,4, KELLY Patrick1, OLEARY Michael5, and CAVAZZONI Elena5

1School of Public Health, University of Sydney, 2Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 3School of Medicine, University of Sydney, 4Renal Unit, Royal Prince Alfred Hospital, Sydney, 5Donate Life, DonateLife

Introduction: A US study suggested donation rates were poorer on weekends. We investigated the effect of day-of-week of donor referral on organ donation in Australia.

Methods: In Australia, potential donor referrals are made to a state-based Donation Service, which then simultaneously seeks family consent for donation and assesses the medical suitability of the referral. Organ retrieval occurs when utilisation is almost certain, so discard rates are extremely low. We retrospectively reviewed all New South Wales referral logs from 2010 to 2016. Our outcomes were actual donation (retrieval), family consent, and medical suitability. We used logistic regression with random effects to adjust for clustering within referral hospitals. We used mortality data from the Australian Institute of Health and Welfare to compare donation referrals with background mortality rates by day of the week, for all-cause mortality and motor vehicle accident (MVA) deaths.

Results: Of 3,383 referrals (potential donors), 692 (20%) became actual donors. We found no evidence of reduced donation (adjusted OR 1.15; 95% CI 0.93-1.42; p=0.2), consent (adjusted OR 1.07; 95% CI 0.85-1.35; p=0.6), or medical suitability (adjusted OR 1.15; 95% CI 0.96-1.39; p=0.1) among weekend referrals. The rate of donor referral relative to background mortality was lower on weekends than on weekdays for both all-cause mortality (p<0.001) and MVA mortality (p=0.03).

Conclusion: There was no association between weekend referral and actual donation, family consent, or medical suitability. There was some indirect evidence that donor referrals may be more selective at weekends. These results contrast with findings from the USA.


TABLE 1



OVERCOMING BARRIERS FOR INDIGENOUS AUSTRALIANS GAINING ACCESS TO THE KIDNEY TRANSPLANT WAITING LIST

ATKINSON Amy, FORD Sharon, GOCK Hilton, IERINO Frank, and GOODMAN David

Department of Nephrology, St Vincent's Hospital, Melbourne

Aims: Aboriginals constitute 10% of the Australian dialysis population but few ever receive a kidney transplant. We studied our dialysis and transplant population to identify the main barriers to Victorian patients being listed for transplantation.

Methods: All Aboriginal patients on dialysis (n=12) or with kidney transplant (n=7) were included in the study. Information was derived from the hospital records and interview by study nurse.

Results: Twelve of 304 current dialysis patients (3.9%), mean age 59 (range 39–80), 6 male & 6 female, 6 living in Melbourne & 6 in country Victoria, had a mean dialysis duration of 4.8 years (range 1–11 years). Only 1 had previously been on the active list. Ten of 12 have diabetes mellitus, 5 ischaemic heart disease, 4 ex-IVDU, 2 mental illness, 2 BMI >35, 1 foot ulceration, 1 osteomyelitis, 1 bacterial endocarditis, 1 recurrent pneumonia and 1 recent colon cancer. Only 1 patient, a smoker/drug user, regularly missed dialysis. One patient has declined transplant work-up and another had previously done so. Seven of the 265 patients receiving a kidney transplant over the past 10 years (2.6%) were Aboriginal; they waited on average 5 years from dialysis commencement to transplantation.

Conclusion: Medical co-morbidities, including heart disease and infection, and psycho-social issues are the main barriers to transplant listing. Concerted efforts to manage medical issues, involving a multidisciplinary team of transplant physicians and nurses, GPs, Aboriginal liaison officers and social workers, may allow more Aboriginals to be listed for transplantation. Once listed, the current organ-matching system appears to provide equal access to kidneys for all Australians.


EXPLORING THE IMPACT OF RECIPIENT AGE WITH KIDNEY DONOR RISK INDEX AND ESTIMATED GLOMERULAR FILTRATION RATE AT ONE YEAR FOLLOWING KIDNEY TRANSPLANT

CHAN Samuel1,2,3, CHATFIELD Mark4, and BABOOLAL Keshwar1,2

1Department of Nephrology, Royal Brisbane Hospital, 2Department of Medicine, University of Queensland, Brisbane, 3ANZDATA, 4Statistics Unit, QIMR Berghofer Medical Research Institute

Background: Various kidney donor risk indices (KDRI) have been developed to predict graft survival using various combinations of donor and recipient characteristics. The aims of this study were to:

1. Explore relationships between Rao KDRI and recipient estimated glomerular filtration rate (eGFR) at 1yr

2. Examine the impact of recipient age on eGFR at 1yr

Methods: A retrospective analysis of deceased donor and recipient data from the Australian, New Zealand, UK and USA Organ Donor Registries was conducted from 2000 to 2015. KDRI and recipient age were categorised into four and six groups, respectively. Median eGFR was calculated for each of the 24 combinations of age and KDRI.

Results: Overall, there were 6,512 Australian, 851 New Zealand, 21,077 UK and 157,664 USA recipients (median age 52yrs, [IQR 41-61]). Recipients aged <30yrs receiving a good-quality kidney (KDRI<1.0) achieved a higher median eGFR at 1yr (87.4ml/min/1.73m2) compared with other age groups (median eGFR range: 56.0-63.3ml/min/1.73m2). Recipients aged <30yrs receiving an average-quality kidney (KDRI 1.0-1.5) yielded a better median eGFR of 67.1ml/min/1.73m2 compared with other age groups (median range: 47.8-52.6 ml/min/1.73m2). Recipients aged <30yrs receiving a marginal-quality kidney (KDRI 1.5-2.0) achieved a median eGFR of 53.7ml/min/1.73m2 compared with other age groups (median range: 39.2-44.4ml/min/1.73m2). Recipients aged <30yrs receiving a poor-quality kidney (KDRI>2.0) yielded a better median eGFR of 45.3ml/min/1.73m2 compared with other ages (range: 35.1-36.2 ml/min/1.73m2).

Conclusions: As KDRI increases, eGFR decreases. Recipients aged <30yrs achieved a substantially higher eGFR at 1yr, independent of donor quality. Only small differences in median eGFR were seen between other age groups.


NORMOTHERMIC MACHINE PERFUSION OF NON-UTILIZED HUMAN KIDNEYS – OUR FIRST TWO CASES

HAMEED Ahmer1,2, ROGERS Natasha1,3, DE ROO Ronald2, LU Bo1, ROBERTSON Paul3, ZHANG Chris3, GASPI Renan3, MIRAZIZ Ray4, NGUYEN Hien2, YUEN Lawrence2,5, ALLEN Richard1,2,5, HAWTHORNE Wayne1,2, and PLEASS Henry2,5

1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Department of Surgery, Westmead Hospital, Sydney, 3Department of Renal Medicine, Westmead Hospital, Sydney, 4Department of Anaesthetics, Westmead Hospital, Sydney, 5School of Medicine, University of Sydney

Aims: Normothermic machine perfusion (NMP) is an emerging modality that may improve graft function of higher KDPI kidneys and/or reduce kidney discard rates. We aimed to test this modality in a series of human kidneys deemed unsuitable for transplantation.

Methods: The first kidney was from a 74-year-old male proceeding down the DBD pathway (KDPI 96%); neither of this donor’s kidneys was utilized because of suspected intra-abdominal malignancy noted during the donation procedure. The second kidney was from a 71-year-old female proceeding down the DCD pathway (KDPI 89%); the right kidney was deemed unsuitable for transplantation due to very poor/patchy perfusion. Both kidneys were transported to our centre in standard cold preservation solution for subsequent NMP. Perfusion parameters, renal function, and histology (haematoxylin & eosin [H&E] sections) were compared over the perfusion period.

Results: NMP was performed for 3 hours using the left kidney from the first donor; the second donor’s kidney underwent NMP for 2 hours. Both kidneys displayed a significant improvement in perfusion parameters over time, with a drop in intra-renal resistance and an increase in renal blood flow. The grossly non-perfused regions in the DCD kidney were no longer evident. Both kidneys produced urine (2 ml/hr, donor 1; 22 ml/hr, donor 2). Creatinine clearance and tubular function (FeNa) improved over time, especially in donor 2. Sequential histology revealed no significant deterioration in renal tubular or glomerular cyto-architecture after 2-3 hrs of NMP.

Conclusions: NMP is a promising modality that has the potential to resuscitate grafts and thereby maximize kidney utilization.


CHALLENGES TO PROCEEDING TO ORGAN DONATION IN THE NORTHERN TERRITORY

MCAULIFFE Kathryn, WOOD Lee, and JONES Sarah

DonateLife NT, DonateLife

Aims: Organ donation in the Northern Territory (NT) remains infrequent despite extensive efforts to improve community education and awareness. We aimed to examine the challenges to donation proceeding.

Methods: All referrals to the DonateLife NT agency between January 2014 and December 2017 were reviewed. Referral numbers increased year on year. We examined consent rates, ethnic group, registration on the Australian Organ Donor Register (AODR) and reasons for referrals not proceeding.

Results: There were 142 referrals over the four-year period. The mean age was 48.6 years (range 2 months to 82 years). Sixty-three (44.3%) of referrals were Indigenous, 54 (38%) were Caucasian and 25 (17.6%) were from other culturally and linguistically diverse (CALD) backgrounds. Of the 142 referrals, only 55 (38.7%) proceeded to the Family Donation Conversation (FDC). Consent for organ donation was obtained in 25 (45%), 20 of whom became organ donors; there were 5 intended donors. Only 14.5% (8) of referrals that proceeded to FDC were registered on the AODR. Of the 87 referrals that did not proceed to FDC, 45 (51.7%) were deemed either medically unsuitable or medically unsupportable. Diabetes, hypertension and hazardous alcohol use were common comorbidities amongst medically unsuitable patients.

Conclusions: Organ donation poses many challenges within the NT which require ongoing attention. Although patients are young, medical suitability issues often prevent conversations about organ donation from taking place. Registration rates on the AODR are also low.


IMPACT OF A DEDICATED LIVING DONOR CLINIC AND ASSESSMENT TEAM: A SINGLE CENTRE EXPERIENCE

SANDIFORD Megan1, COOK Natasha1, WHITLAM John1, CHAN Yee2, KAUSMAN Joshua3, IERINO Frank4, and LEE Darren3,4,1,5

1Department of Nephrology, Austin Hospital, Melbourne, 2Department of Urology, Austin Hospital, Melbourne, 3Department of Nephrology, Royal Children's Hospital, Melbourne, 4Department of Nephrology, St Vincent's Hospital, Melbourne, 5Eastern Health, Melbourne

Aims: International guidelines recommend independent assessment of living kidney donor candidates (LKDC) by nephrologists not involved in the care or evaluation of the intended recipients. Whether this approach might have a negative impact on the determination of suitability and timely living donor (LD) transplantation is unclear. We examined the efficiency of LKDC assessment before and after the establishment of a dedicated LD clinic staffed by an LD coordinator and two nephrologists.

Methods: We retrospectively compared the number of renal clinic appointments attended by LKDC to determine medical suitability, and the proportion of pre-emptive LD transplants, in the pre-LD clinic (January 2006 to October 2009) and post-LD clinic (November 2009 to October 2017) eras at Austin Health. LKDC with part of their assessment performed elsewhere were excluded.

Results: In the post-LD clinic era, fewer clinic appointments were required to determine suitability for both accepted and declined LKDC (Table). For accepted LKDC, a further reduction in clinic appointments was observed in the second versus the first half of the post-LD clinic era (median 2 (IQR 2-3) vs 4 (3-4.75); P<0.001), suggesting a learning curve with ongoing improvement. An increase in the pre-emptive LD transplant rate also occurred (46.7% vs 14.3%). The likelihood of LKDC being accepted and the LD transplant rate did not change significantly over the two eras.

Table


Conclusions: Our experience suggests that an LD clinic staffed by a dedicated assessment team improves the efficiency of LKDC assessment and facilitates pre-emptive LD transplantation. This allows the development of expertise and quality improvement without altering the acceptance threshold for medical suitability.


CLINICIANS’ ATTITUDES AND PERSPECTIVES ON THE ACCEPTABILITY OF ANTE-MORTEM INTERVENTIONS: AN INTERNATIONAL SEMI-STRUCTURED INTERVIEW STUDY

SHAHRESTANI Sara1,2, HAWTHORNE Wayne1,3, PLEASS Henry3, WONG Germaine4, and TONG Allison5

1Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 2School of Medicine, University of Sydney, 3Department of Surgery, Westmead Hospital, Sydney, 4Department of Renal Medicine, Westmead Hospital, Sydney, 5Centre for Kidney Research, The Children's Hospital at Westmead, Sydney

Background: The use of ante-mortem interventions in transplantation remains contentious due to ethical concerns and potential for harm to donors. There is variability in the acceptance and use of ante-mortem interventions across individual centers.

Methods: We conducted semi-structured interviews with 42 clinicians (transplant physicians, surgeons, ICU physicians, and donation specialist nurses), purposively sampled from eight countries: Australia, Italy, Japan, Korea, the United Kingdom, the United States, New Zealand, and Vietnam. We used thematic analysis to analyse the data.

Results: Four themes were identified: respecting the donor family’s experience of grief; optimising ‘the gift’ as a duty to the donor; ambiguity in operationalising ‘informed’ consent; and fears of harming the donor. Participants feared burdening the grieving family with organ donation in often-traumatic circumstances, and the donation specialist role was seen as necessary for sensitive discussion of wishes. Clinicians felt a tension between their duty to enact donor wishes, protect donors as ‘patients in their own right,’ and prevent ‘unsuccessful’ transplantation. Complete dissemination of information for consent in a time-pressured and emotionally charged context was described as unrealistic; instead, the legal concept of ‘authorisation,’ with less onus on information, was raised. The principle of ‘first do no harm’ applied both to the potential harms of interventions and to adhering to the donor’s wishes.

Conclusions: Respect for the rights and wishes of donors, minimisation of harm and optimisation of ‘the gift’ were paramount to clinicians. Clarity around what constitutes ‘benefit’ and ‘harm,’ along with informed discussion with families, will help clinicians resolve tensions regarding the acceptability of interventions in the donation process.


EXTENDED CRITERIA DONATION UNDER EXTENDED CRITERIA CIRCUMSTANCES

THOMPSON Sophie1, PILCHER David2, and IHLE Joshua3

1DonateLife Victoria, Alfred Hospital, Melbourne, 2DonateLife Victoria, Alfred Hospital, Melbourne, 3Intensive Care Unit, Alfred Hospital, Melbourne

Introduction: In Donation after Circulatory Death (DCD), withdrawal of cardiorespiratory support (WCRS) usually begins with extubation. Death is most commonly determined after a 5-minute period of observation of a non-pulsatile invasive arterial pressure trace. We present a case of a patient who wished to be a donor but had neither invasive arterial monitoring nor mechanical ventilation.

Case presentation: A 61-year-old woman was admitted to the Intensive Care Unit for management of respiratory failure following a lung transplant. After a prolonged hospital stay, the patient’s care transitioned to end-of-life treatment. Consent for organ donation was obtained from the senior available next of kin and a rapid-retrieval DCD protocol was activated. The patient was receiving oxygen via High Flow Nasal Prongs (HFNP) and was monitored only via a pulse oximeter. Twenty-three minutes after removal of HFNP, oxygen saturation was no longer recordable; 31 minutes later the treating intensivist examined the patient to confirm death. Time from removal of HFNP to cold perfusion of the organs was 54 minutes. Both kidneys were subsequently transplanted into recipients, who are recovering well.

Discussion: This was a unique situation in which an extended criteria donor was able to donate organs through a combination of being registered and her family being aware of and supportive of her wishes, in a situation where donation would not usually be considered. Dependence on HFNP enabled death to occur in a timely fashion after WCRS, resulting in kidney donation via a rapid-retrieval DCD pathway.


ALLOCATION OF DECEASED DONOR KIDNEYS IN CLINICAL PRACTICE: MATCHING GRAFT LIFE-YEARS AND RECIPIENT LIFE EXPECTANCY

YONG Bryan1,2,3, IERINO Frank2, PAIZIS Kathy1, and POWER David1

1Renal & Transplantation Unit, Austin Hospital, Melbourne, 2Renal & Transplantation Unit, St Vincent's Hospital, Melbourne, 3School of Medicine, University of Melbourne

Introduction: The current allocation of deceased donor kidneys in the Australian National Organ Matching System (NOMS) does not match expected graft life-years to patient life expectancy. Such longevity mismatches between donor grafts and recipients may result in loss of potential graft life-years, which are known to confer improved recipient life expectancy. Evaluating the matching of graft and recipient longevity under the current NOMS allocation algorithm is an essential step towards the formal introduction of donor-recipient matching.

Aims: To determine the correlation between deceased donor kidney graft longevity and recipient life expectancy.

Methods: Adult deceased donor kidney transplants (n = 125) from a single centre from 2011 to 2015 were examined retrospectively. Two validated clinical calculators, the Australian Kidney Donor Profile Index (KDPI) and the Expected Post-Transplant Survival (EPTS) score, were used to predict graft and patient survival, respectively. Data for the KDPI and EPTS parameters were collected from clinical notes and transplant registry databases. The relationship between KDPI and EPTS scores was then assessed using the Spearman rank correlation.

Results: KDPI and EPTS were poorly correlated, with a Spearman rank correlation of 0.179 (95% CI 0.006-0.361, p = 0.046).

Conclusion: Current practices do not optimise the matching of organ expected graft life-years to recipient life expectancy, leading to loss of potential graft life-years. This supports the need to improve longevity matching of the donated kidney to the estimated life expectancy of the potential recipient.


Immunobiology

THE ROLE OF THE CD73/A2A SIGNALLING AXIS IN A HUMANISED MOUSE MODEL OF GRAFT-VERSUS-HOST DISEASE

GERAGHTY Nicholas1,2,3, ADHIKARY Sam1,2,4, SLUYTER Ronald1,2,3, and WATSON Debbie1,2,3

1School of Biological Sciences, University of Wollongong, 2Centre for Medical and Molecular Biosciences, University of Wollongong, 3Illawarra Health and Medical Research Institute, University of Wollongong, 4Illawarra Health and Medical Research Institute

Graft-versus-host disease (GVHD) is a complication that occurs in approximately 50% of bone marrow transplantations, caused by donor leukocytes (predominantly T cells) in the graft mounting an immune response against the patient (host). Extracellular adenosine, generated by the ecto-enzyme CD73, activates the A2A adenosine receptor to limit T cell responses. CD73 or A2A blockade worsens disease, while A2A activation reduces disease, in allogeneic mouse models of GVHD.

Aim: The current study aimed to investigate the role of the CD73/A2A signalling axis in a humanised mouse model of GVHD.

Methods: NOD-SCID-IL2Rγnull (NSG) mice injected with 10 × 10⁶ human (h) peripheral blood mononuclear cells (PBMC) were subsequently injected with α,β-methylene-ADP (APCP, a CD73 antagonist), CGS21680 (an A2A agonist) or control diluent for 14 days. GVHD development was assessed by weight loss, clinical parameters, and survival. The impact of APCP and CGS21680 on immune cells and cytokines was investigated.

Results: CD73 blockade enhanced weight loss but did not alter clinical score or survival. CD73 blockade increased serum human interleukin (IL)-2 concentrations. A2A activation also increased weight loss but did not impact clinical score or survival. CGS21680 led to a decrease in immunosuppressive regulatory T cells; however, serum tumor necrosis factor (TNF)-α and IL-2 were reduced, and IL-6 was increased.

Conclusion: A2A activation represents a potential therapeutic target for GVHD because it reduces inflammation, but should be considered carefully given its negative effects on weight and regulatory T cells. Further investigation of A2A activation is therefore warranted before it can be used as a therapeutic strategy for GVHD in humans.


CHANGES IN THE EXTRACELLULAR MATRIX - SIGNS OF REMODELING LEADING TO CHRONIC REJECTION AFTER LUNG TRANSPLANTATION

MULLER Catharina1, HEINKE Paula1, ANDERSSON-SJÖLAND Annika1, SCHULTZ Hans Henrik2, ANDERSEN Claus3, IVERSEN Martin2, WESTERGREN-THORSSON Gunilla1, and ERIKSSON Leif1

1Experimental Medical Science, Lund University, Sweden, 2Section for lung transplantation, Copenhagen University Hospital, Denmark, 3Department of pathology, Copenhagen University Hospital, Denmark

Background: About 50% of lung transplant recipients develop chronic rejection in the form of bronchiolitis obliterans syndrome (BOS) within 5 years after transplantation. BOS is characterized by a decrease in lung function caused by progressive fibrosis; however, little is known about its initiation. We hypothesize that changes in the distribution of extracellular matrix proteins might be a marker for the disease process.

Methods/Material: Our study aimed to map total collagen, collagen type IV, biglycan and periostin in transbronchial biopsies taken at 3 and 12 months after transplantation, using Masson’s trichrome staining and immunohistochemistry. Staining patterns were quantified and related to patient data (n=58) over a 5-year follow-up.

Results: Compartment-specific patterns were revealed between 3 and 12 months post-transplantation. Alveolar total collagen (p=0.019) and small-airway biglycan (p=0.02) increased in BOS-developing patients. Alveolar collagen type IV increased in BOS-free patients (p=0.01) (3 vs. 12 months). Individual calculation of the change in protein content (12 months minus 3 months for each patient) confirmed the increase in biglycan (p=0.012) and showed a trend towards increased periostin (p=0.057) in the small airways of BOS patients compared to BOS-free patients. Already at 3 months, before onset of BOS, increased total alveolar collagen (p=0.036) and small-airway collagen type IV (p=0.034) could discriminate between patients developing less severe and severe forms of BOS (BOS grade 1+2 vs. 3).

Conclusion: The results show distinct alterations of the extracellular matrix which might be part of the complex remodeling processes that eventually lead to BOS.


MTORC2 DEFICIENCY IN DENDRITIC CELLS PROMOTES ACUTE KIDNEY INJURY

ROGERS Natasha1,2, DAI Helong2, WATSON Alicia2, and THOMSON Angus2

1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Starzl Transplant Institute, University of Pittsburgh

Introduction: The role of the mammalian/mechanistic target of rapamycin (mTOR) in the pathophysiology of acute kidney injury (AKI) is poorly characterized. Furthermore, the influence of dendritic cell (DC)-based alterations in mTOR signalling in AKI has not been investigated.

Methods: Bone marrow-derived mTORC2-deficient (Rictor-/-) or wild-type (WT) DC underwent hypoxia-reoxygenation and were analysed by flow cytometry. Age- and gender-matched DC-specific Rictor-/- mice or littermate controls underwent bilateral renal ischemia-reperfusion injury followed by assessment of renal function, histopathology, renal DC metabolism, bio-molecular and cell infiltration analysis. Adoptive transfer of WT or Rictor-/- DC to C57BL/6 mice was used to assess migratory capacity.

Results: AKI upregulated expression of phospho-S6K (downstream of mTORC1) but downregulated phosphorylated Akt S473 (downstream of mTORC2) in whole kidney tissue. Rictor-/- DC expressed more CD80/CD86 but less programmed death ligand-1 (PDL1), a pattern enhanced by hypoxia-reoxygenation, and demonstrated enhanced migration to the injured kidney. Following AKI, Rictor-/- DC mice developed higher serum creatinine, more severe histologic damage, and greater pro-inflammatory mRNA transcript profiles of IL-1β, IL-6 and TNF-α compared to littermate controls. A greater influx of neutrophils and T cells was seen in Rictor-/- DC mice, in addition to CD11c+MHCII+CD11bhiF4/80+ renal DC that expressed more CD86 but less PDL1. Rictor-/- DC showed increased TNF-α but significantly reduced IL-10 production, and were glycolytically biased compared with WT DC under both basal and AKI conditions.

Conclusions: These data suggest that mTORC2 signaling in DC negatively regulates AKI, highlighting the regulatory roles of both DC and Rictor in the pathophysiology of renal injury.


TISSUE-RESIDENT LYMPHOCYTES IN SOLID ORGAN TRANSPLANTATION

PROSSER Amy1,2, HUANG Wen Hua3, LIU Liu1, LARMA-CORNWALL Irma4, JEFFREY Gary1, GAUDIERI Silvana2, DELRIVIERE Luc5,3, KALLIES Axel6, and LUCAS Michaela1,7

1Medical School, University of Western Australia, Perth, 2School of Anatomy, Physiology and Human Biology, University of Western Australia, Perth, 3School of Surgery, University of Western Australia, Perth, 4Centre for Microscopy, Characterisation and Analysis, University of Western Australia, Perth, 5WA Liver & Kidney Transplant Service, Sir Charles Gairdner Hospital, Perth, 6Department of Microbiology and Immunology, The Peter Doherty Institute for Infection and Immunity, 7Department of Immunology, Sir Charles Gairdner Hospital, Perth

Introduction: Solid organ transplantation is the standard treatment option for many patients with end-stage diseases. Despite improvements in short-term outcomes, long-term organ graft survival has remained poor for the past two decades. Newly characterised tissue-resident lymphocytes are suspected to play a significant role in graft survival and rejection, although their function in transplantation has not yet been tested. Similarly, the contribution of donor- and recipient-derived lymphocytes to allograft survival and rejection has not been investigated.

Methods: We have performed orthotopic liver transplants in either congenic or MHC mismatched mice. At various timepoints up to one month post-surgery, rejection of the organ was scored histologically and donor- and recipient-derived lymphocytes were analysed in the graft and peripheral organs by flow cytometry. The maintenance and differentiation to a tissue-resident phenotype of various cellular subsets was also assessed.

Results: Tissue-resident lymphocytes were successfully transplanted with the liver, with long-term survival of these cells observed only in the congenic transplantation context. MHC mismatch of donor and recipient mice, however, led to severe rejection and rapid depletion of most donor cells. Large numbers of recipient lymphocytes also quickly infiltrated the allograft and upregulated markers associated with tissue residency.

Conclusions: Donor-derived tissue-resident lymphocytes in the murine liver are readily transferrable with whole liver transplantation. Depletion of these cells in MHC mismatched transplants and infiltration of recipient lymphocytes differentiating to a tissue-resident phenotype coincided with severe rejection of the allograft. This suggests tissue-residency of lymphocytes, whether donor- or recipient-derived, is important in the context of solid organ rejection.


Outcome Measures

OUTCOMES OF WESTERN AUSTRALIAN LUNG TRANSPLANT RECIPIENTS – THE FIRST DECADE

DHILLON Sarbroop1, MCKINNON Elizabeth2, MUSK Michael3, WROBEL Jeremy3, LAVENDER Melanie3, and GABBAY Eli4

1Fiona Stanley Hospital, 2Institute for Immunology and Infectious Diseases, Murdoch University, 3Lung Transplant Service, Fiona Stanley Hospital, 4School of Medicine, University of Notre Dame

Background: Lung transplantation has evolved into an effective treatment option for end-stage lung disease. Growing local demand prompted the establishment of the Advanced Lung Disease Unit at Royal Perth Hospital in Western Australia in 2004. Operating now for just over a decade and recently relocated to Fiona Stanley Hospital, we sought to assess our recipient characteristics and outcomes, and compare ourselves to the international standard.

Method: Basic characteristics of all transplant recipients between 2004 and 2015 were collected at the time of transplant. These data were retrospectively augmented from our electronic hospital medical records system. Survival analysis was performed using the Kaplan-Meier method.

Results: A total of 115 lung transplants were performed. Transplant rates have trended upwards over the years, with 20 lung transplants performed in 2015. Half the recipients were over the age of 50. The most common indications for transplant, each accounting for a quarter of total transplants, were Cystic Fibrosis, Interstitial Pulmonary Fibrosis and Chronic Obstructive Pulmonary Disease. Overall survival rates were 96% at 3 months, 93% at 1 year, 84% at 3 years, and 70% at 5 years (Figure 1). This compares well to international survival rates, published by the International Society of Heart and Lung Transplantation, of 89% at 3 months, 80% at 1 year, 65% at 3 years, and 54% at 5 years.

FIGURE 1


Conclusion: Lung transplant rates continue to rise, and our patients enjoy outcomes that meet the international standard.

Back to Top | Article Outline

ALLOGRAFT OUTCOME FOLLOWING RETRANSPLANTATION OF PATIENTS WITH FAILED FIRST KIDNEY ALLOGRAFT ATTRIBUTED TO NON-ADHERENCE

MANICKAVASAGAR Revathy1, WONG Germaine2,3,4, and LIM Wai H5,6

1Renal Transplant Unit, Sir Charles Gairdner Hospital, Perth, 2Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 3Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 4School of Public Health, University of Sydney, 5Renal & Transplantation Unit, Sir Charles Gairdner Hospital, Perth, 6School of Medicine & Pharmacology, University of Western Australia, Perth

Background: It remains unknown whether first allograft failure secondary to non-adherence leads to increased risk of allograft failure following retransplantation.

Aim: To determine the association between causes of first allograft failure and outcomes following retransplantation.

Materials and Methods: Using the ANZDATA Registry, patients who had received a second kidney transplant between 1960-2014 were included. The association between causes of first allograft failure, death censored graft failure (DCGF) and non-adherence-related DCGF following retransplantation were examined using Cox regression and competing risk analyses.
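
Competing-risk analyses of this kind are usually summarised by the cumulative incidence function. A toy sketch of the Aalen-Johansen estimator for one cause, using invented follow-up data rather than the registry cohort:

```python
# event codes: 0 = censored, 1 = graft failure, 2 = death (competing risk)

def cumulative_incidence(times, events, cause):
    """Cumulative incidence of `cause` in the presence of competing events."""
    data = sorted(zip(times, events))
    n = len(data)
    surv = 1.0   # overall event-free survival just before t
    cif = 0.0
    out = []
    i = 0
    while i < n:
        t = data[i][0]
        at_risk = n - i
        d_cause = d_all = 0
        while i < n and data[i][0] == t:
            d_all += data[i][1] != 0
            d_cause += data[i][1] == cause
            i += 1
        cif += surv * d_cause / at_risk
        surv *= 1 - d_all / at_risk
        out.append((t, cif))
    return out

print(cumulative_incidence([1, 2, 3, 4], [1, 2, 0, 1], cause=1))
```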

Results and Discussion: Of 2822 patients who received second kidney allografts, 59 (2%) lost their first allografts from non-adherence. Patients who had non-adherence-related first graft failure were younger at the time of first allograft failure (median 25 vs 38 years, p<0.001) and had significantly longer waiting times for retransplantation (waiting time >5 years: 57% vs. 20%, p<0.001) compared with those who lost their first graft from other causes. The adjusted HR for DCGF was 0.76 (95%CI 0.44, 1.32; p=0.342) for those who had lost their first allograft from non-adherence. Following retransplantation, the adjusted subdistribution HR of second allograft failure attributed to non-adherence for patients who had experienced non-adherence-related first allograft failure was 2.84 (95%CI 0.83, 17.79; p=0.082).

Conclusion: In patients who had experienced non-adherence-related first allograft failure, the long-term risk of DCGF in the second allograft was similar to those who had lost their first allografts from other causes. Non-adherence-related allograft failure should not be considered a contraindication to successful retransplantation.

Back to Top | Article Outline

PROPHYLACTIC PLASMA EXCHANGE IS ASSOCIATED WITH A HIGH INCIDENCE OF AMR IN SENSITISED RECIPIENTS

CHAMBERLAIN AJ1, SNIDER J2, POWER DA2, and WHITLAM JB2

1Austin Hospital, Melbourne, 2Department of Nephrology, Austin Hospital, Melbourne

Aims: Thresholds for peri-operative plasma exchange (PPEX) in recipients with donor specific antibody (DSA) and negative crossmatch are not clear. We sought to review indications for and outcomes following PPEX at our centre.

Methods: All adult kidney transplant recipients who received PPEX between 2012 and 2016 were identified. Demographic, immunologic, clinical and plasma exchange treatment data were collected from the clinical record.

Results: 63/251 (24%) recipients received PPEX. Indications were DSA (78%), ABO incompatibility (14%), DSA+ABO incompatibility (3%), and other (5%). Of 51 recipients with DSA who received PPEX, the number of DSAs was 1 (57%), 2 (31%), 3 (10%) and 4 (2%). DSA target was class I (50%), class II (36%) and class I+II (14%). The median maximum recipient DSA mean fluorescence intensity (MFI) was 1659 (IQR 1090-2789); 57% of maximum DSA MFIs were < 2000. The median number of PPEX treatments was 6 (IQR 4-8). 43% developed antibody-mediated rejection (AMR) at a median of 40 (IQR 9-269) days post-operatively. Development of AMR was not predicted by DSA number, class or MFI. Time to AMR was predicted by DSA class (class I+II = 6 days, IQR 5-9; class II 15, IQR 9-123; class I 40, IQR 19-207; p=0.03), but not DSA number or MFI.

Conclusions: In this cohort of kidney transplant recipients who received PPEX for relatively low risk HLA sensitisation, development of AMR was common and not predicted by traditional indicators for PPEX. The optimal use of PPEX in this setting is yet to be defined.

Back to Top | Article Outline

POST-TRANSPLANT SURVIVAL IN TYPE 1 DIABETICS IN AUSTRALIA AND NEW ZEALAND

WEBSTER Angela1,2, HEDLEY James1, and KELLY Patrick1

1School of Public Health, University of Sydney, 2Centre for Transplant and Renal Research, Westmead Hospital, Sydney

Introduction: We analysed data from the Australian and New Zealand Pancreas and Islet Transplant Registry (ANZIPTR) as well as the Australian and New Zealand Dialysis and Transplant Registry (ANZDATA) to estimate differences in transplant and patient survival by transplant type among recipients with type 1 diabetes.

Materials and Methods: We conducted an inception cohort study from 1984-2012, using data linkage of ANZIPTR and ANZDATA. We compared kidney graft and patient survival from the date of transplant for simultaneous pancreas-kidney (SPK) and deceased-donor kidney transplant alone (KTA) recipients using Cox regression, and censored patients at last known follow-up. We adjusted for age, sex, state/country, previous transplants, age difference between recipient and donor, and immunosuppression used. To meet the proportional hazards assumption we stratified by era (1984-1999, 2000-2012).

Results: We included 1,090 transplant recipients (462 SPK, 493 deceased donor kidney, 135 living donor kidney). SPK had improved kidney survival compared to deceased donor KTA; including death with function (graft loss HR 0.35; 95% CI 0.21-0.57; p<0.001) and censored for death (graft loss HR 0.45; 95% CI 0.22-0.90; p=0.02). Patient survival was also better among SPK recipients compared to deceased donor KTA (death HR 0.48; 95% CI 0.24-0.95; p=0.03).

Conclusion: Overall, patient and kidney transplant survival has improved over time for SPK and KTA recipients. At 5 years, patient survival is >90%, and kidney transplant survival >80%. The diminishing advantage of SPK over KTA may reflect selection bias compared to earlier years when SPK donors were scarcer and PAK was more common.

FIGURE 7

Back to Top | Article Outline

TUMOUR RESECTED KIDNEY GRAFTS FOR TRANSPLANTATION IN WESTERN AUSTRALIA: OUTCOMES OF THE TRK PROGRAM, 10 YEARS ON

APIKOTOA Sharie1, and HE Bulang1,2

1WA Liver & Kidney Transplant Service, Sir Charles Gairdner Hospital, Perth, 2School of Medicine, University of Western Australia, Perth

Introduction: Kidney transplantation is the definitive treatment for end stage renal failure, and chronic organ shortage is a global issue. The Tumour Resected Kidney (TRK) program was implemented in Western Australia in 2007. The aim of this study was to review the outcomes of TRK transplantation in patients enrolled in the program by the Western Australian Kidney Transplant Service (WAKTS).

Materials and Methods: Data were prospectively collected in a registry of all selected patients receiving TRK transplantation between February 2007 and February 2017. Twenty-seven patients received a TRK transplant. Follow-up ranged from 2 to 10 years, with a median of 7 years. Data were analysed for patient and graft survival, surgical complications, kidney graft function and tumour recurrence.

Results: Twenty-seven TRKs were transplanted into patients aged 32-76 years (average 63 years). Tumour size ranged from 1-4 cm (mean 2.7 cm), with histopathology confirming renal cell carcinoma (RCC) in 20 kidneys, 1 chromophobe tumour, 3 papillary RCC and 4 benign tumours. Complications included urine leakage in 3 patients requiring prolonged drainage, 1 non-functional graft, 1 graft loss and 1 pseudoaneurysm formation. Graft function was satisfactory, with an average creatinine of 135 μmol/L. There has been no tumour recurrence during follow-up.

Conclusion: Outcomes of TRK transplantation have been satisfactory. The process of cold perfusion and preservation may help prevent tumour recurrence. TRK transplantation is an option for selected recipients under strict criteria.

Back to Top | Article Outline

EPIDEMIOLOGY AND ESTIMATED COST OF COMPLICATED SIMULTANEOUS PANCREAS KIDNEY TRANSPLANTATION

XU Joshua1, HITOS Kerry2,1, HORT Amy2, SHAHRESTANI Sara2, ROBERTSON Paul2,1, YUEN Lawrence2,1, RYAN Brendan2,1, DE ROO Ronald2,1, HAWTHORNE Wayne3,4, and PLEASS Henry2,1

1Westmead Clinical School, University of Sydney, 2Department of Surgery, Westmead Hospital, Sydney, 3Discipline of Surgery, Sydney Medical School, University of Sydney, 4Centre for Transplant and Renal Research, Westmead Institute of Medical Research

Background: Simultaneous pancreas-kidney transplantation (SPK) is a well-established treatment for type 1 diabetes mellitus and end-stage renal disease. Limited detail on cost-effectiveness exists for transplantation and treatment complexity specifically relating to surgery and complications.

Aims: To examine the cost associated with SPK transplantation for in-hospital admissions based on co-morbidities, complications and procedure complexity.

Methods: 234 SPK transplantations were reviewed at Westmead Hospital (2008-2017). Donor and recipient demographic details, co-morbidities, operative characteristics, hospital and ICU length of stay (LOS), enteric leaks and graft thrombosis were collected. Estimated DRG price weights and national weight activity units (NWAU) were used to calculate in-patient costs (Australian dollars (AUD)).

Results: Median donor and recipient age was 26 years (IQR:19-34) and 39 years (IQR:34-44) respectively. Median donor BMI was 24.2 kg/m2 (IQR:21.9-25.6) and 24.2 kg/m2 (IQR:21.7-27.7) for recipients. Median in-hospital LOS was 9 days (IQR:8-12). Overall, 17% of recipients experienced graft thrombosis and 5.2% enteric leaks. Complications such as sepsis, haemodialysis, enteric leaks and ICU admission increased the in-hospital and surgery costs per patient by more than $49,237 AUD. Greater complexity, such as an increase in ICU LOS, and factors such as volume depletion, infection, thrombosis, hypertension, hyperkalemia, osteoporosis, re-operation and asthma increased the additional in-hospital per-patient cost by more than $153,841 AUD compared to uncomplicated cases.

Conclusions: Treatment complexity, co-morbidities, ICU LOS, enteric leaks, graft thrombosis and re-operation greatly influence in-hospital costs. This economic impact is further amplified when wages, patient assessments, organ retrieval, pharmaceutical needs, monitoring and long-term follow-up costs are added.

Back to Top | Article Outline

LONG-TERM GRAFT SURVIVAL AND FUNCTION IN RECIPIENTS OF DCD COMPARED TO DBD RENAL ALLOGRAFTS: A SINGLE CENTRE REVIEW.

SALTER Sherry1, TAN Sarah1, MULLEY William2,3, CHAMBERLAIN Stacey1, POLKINGHORNE Kevan2,3, SAUNDER Alan1, and KANELLIS John2,3

1Department of Surgery, Monash Medical Centre, Melbourne, 2Department of Nephrology, Monash Medical Centre, Melbourne, 3Centre for Inflammatory Diseases, Department of Medicine, Monash University, Melbourne

Aim: We previously described our short-term outcomes for DCD compared with DBD renal allograft recipients and now sought to extend those comparisons for long-term patient survival, graft survival and graft function between these groups.

Methods: Retrospective cohort study. All patients receiving a renal transplant from a deceased donor at our centre between 1 January 2010 and 30 April 2013 were included. Multi-organ transplant recipients were excluded. Baseline patient characteristics were compared. Graft and patient survival and mean eGFRs were compared between DCD and DBD recipients using the log-rank test and Student's t-test, respectively.

Results: The group comprised 91 DBD and 39 DCD recipients. The median follow-up was 6.2 years (range 4.7 to 8.0 years). There were no differences in donor or recipient age between groups; however, there was more delayed graft function in the DCD group. DCD kidney recipients had a longer admission (DBD 9.5 ± 8.3 days vs DCD 11.6 ± 5.1 days; rank-sum P<0.01). Mean eGFR was significantly lower in the DCD group until 2 months post-transplant, after which there were no differences out to 8 years. There were 16 recipient deaths and 5 graft losses during the study period. Patient and graft survival (Figure) were not different between groups.

Figure

Conclusion: Equivalent outcomes in the longer term can be achieved with DCD and DBD renal allografts with similar donor characteristics. Early differences in the rate of DGF and renal function did not result in differences in graft survival or renal function after the first 2 post-transplant months.

Back to Top | Article Outline

STROKE MORTALITY IN KIDNEY TRANSPLANT RECIPIENTS: A POPULATION-BASED COHORT STUDY USING DATA LINKAGE

DE LA MATA Nicole1, MASSON Philip2, AL-SHAHI SALMAN Rustam3, KELLY Patrick1, WEBSTER Angela1,4

1Sydney School of Public Health, University of Sydney, 2Department of Renal Medicine, Royal Free London NHS Foundation Trust, 3Centre for Clinical Brain Sciences, University of Edinburgh, 4Centre for Transplant and Renal Research, Westmead Hospital, Sydney

Aims: We aimed to compare stroke deaths in kidney transplant recipients with the general population.

Methods: We established the primary cause of death for incident kidney transplant recipients using data linkage between the Australian and New Zealand Dialysis and Transplant Registry (ANZDATA) and national death registries: Australia, 1980-2013 and New Zealand, 1988-2012. We used indirect standardisation to estimate standardised mortality ratios (SMR) with 95% confidence intervals (CI) and a competing risks regression model to identify risk factors for stroke and non-stroke mortality.
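
Indirect standardisation compares observed deaths with the number expected if general-population rates applied to the cohort's person-years in each stratum. A sketch with made-up rates and person-years (the actual analysis used national death-registry rates and finer strata):

```python
GENERAL_POP_RATES = {          # hypothetical stroke deaths per person-year
    "30-49": 0.00005,
    "50-69": 0.0005,
    "70+":   0.003,
}

def smr(observed_deaths, person_years_by_age):
    """Standardised mortality ratio: observed / expected deaths."""
    expected = sum(GENERAL_POP_RATES[band] * py
                   for band, py in person_years_by_age.items())
    return observed_deaths / expected

cohort_py = {"30-49": 40000, "50-69": 90000, "70+": 5000}
# expected = 2 + 45 + 15 = 62 deaths, so SMR ≈ 158 / 62 ≈ 2.55
print(round(smr(158, cohort_py), 2))
```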

Results: Among 17,621 kidney transplant recipients, there were 158 stroke deaths and 5,126 non-stroke deaths over 160,332 person-years of follow-up. Stroke death rates increased steadily from transplantation. All-cause stroke SMRs were higher in younger people, particularly in females (Fig. 1). Kidney transplant recipients aged 30-49 had far more stroke deaths than expected from general-population rates (females: SMR 21.3, 95% CI 13.9-32.7; males: SMR 9.9, 95% CI 6.2-15.9). A higher risk of stroke death was associated with older age at transplant, earlier year of transplant and prior known cerebrovascular disease.

FIGURE 1

Conclusion: Stroke mortality is significantly higher among kidney transplant recipients than in the general population, particularly for young people and females. Cardiovascular risk factor control and acute stroke interventions have reduced stroke mortality in the general population, but their effectiveness in kidney transplant recipients, and the extent to which they are used, are less clear.

Back to Top | Article Outline

THE ASSOCIATION BETWEEN ETHNICITY, ALLOGRAFT FAILURE AND MORTALITY AFTER KIDNEY TRANSPLANTATION IN INDIGENOUS AND NON-INDIGENOUS AUSTRALIANS: IS THIS EXPLAINED BY ACUTE REJECTION?

HOWSON Prue1, IRISH Ashley2, D'ORSOGNA Lloyd3,4, SWAMINATHAN Ramyasuda2, PERRY Gregory5, DE SANTIS Dianne3, WONG Germaine6,7,8, and LIM Wai H1,4

1Department of Renal Medicine, Sir Charles Gairdner Hospital, Perth, 2Department of Renal Medicine, Fiona Stanley Hospital, Perth, 3Department of Immunology, Fiona Stanley Hospital, Perth, 4School of Medicine, University of Western Australia, Perth, 5Department of Renal Medicine, Royal Perth Hospital, 6University of Sydney, 7Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 8Centre for Kidney Research, The Children's Hospital at Westmead, Sydney

Aim: We aimed to determine whether acute rejection (AR) was a mediator between ethnicity (Indigenous/Non-Indigenous), allograft failure and mortality after kidney transplantation and whether ethnicity was a risk factor for allograft failure and mortality in those who had experienced AR.

Materials and Methods: End-stage kidney disease patients who received a kidney-only transplant between 2000 and 2010 in Western Australia were included. Cox proportional hazards modelling was used to determine the association between ethnicity, AR, allograft failure and all-cause mortality. Mediation analysis was conducted to determine whether AR was a causal intermediate between ethnicity and outcomes, and propensity-score analysis was used to examine the association between ethnicity and outcomes in recipients who had experienced AR.

Results and Discussion: Of 618 patients who received a kidney transplant, 59 (9.5%) were Indigenous. During a median (IQR) follow-up time of 7.9 (5.7) years, Indigenous recipients were more likely to experience AR (73% vs. 42%, p<0.001), allograft failure (66% vs. 37%, p<0.001) or death (44% vs. 25%, p=0.002) compared with non-Indigenous recipients, with adjusted hazard ratios (HR) of 1.86 (95%CI 1.28-2.70, p<0.001), 2.17 (1.97-4.00, p<0.001) and 2.35 (1.49-3.71, p<0.001), respectively. Approximately 29% and 2% of the effects of ethnicity on allograft failure and death, respectively, were explained by AR. In the propensity-scored analysis of recipients who had experienced AR (1:1 ratio of Indigenous/non-Indigenous recipients matched for recipient age, donor type, HLA mismatches and diabetes), Indigenous recipients remained at a higher risk of allograft failure and death, with respective adjusted HRs of 1.88 (1.09-3.25, p=0.023) and 2.18 (1.03-4.60, p=0.041).

Conclusions: Acute rejection explained almost 30% of the association between ethnicity and allograft failure. Following rejection, the risk of mortality was 2-fold greater in Indigenous compared with non-Indigenous recipients. A greater understanding of the factors contributing to these adverse outcomes is urgently required.

Back to Top | Article Outline

REVIEW OF THE NEW ZEALAND (NZ) EXPERIENCE WITH DONATION AFTER CIRCULATORY DEATH (DCD) KIDNEY TRANSPLANTATIONS 2008-2016

SUN Tina1, DITTMER Ian2, and MATHESON Philip3

1Department of Renal Medicine, Middlemore Hospital, 2Auckland Renal Transplant Group, Auckland City Hospital, 3Department of Renal Medicine, Wellington Hospital

Aim: To review the NZ national experience and outcomes of DCD kidney transplantation since its introduction in 2008.

Background: Deceased donor kidney donation in NZ has been exclusively from donation after brain death donors for many years. This changed in 2008 with the introduction of DCD transplantations to increase the availability of deceased donors.

Method: A retrospective review of DCD kidney transplantations performed in NZ between January 2008 and December 2016, with follow-up until March 2017. Patients were identified from the ANZDATA registry and the Organ Donation New Zealand database. Data collected were: age, gender, ethnicity, mortality, immediate and long-term graft function, cold ischaemic time, graft number and co-morbidities.

Results: A total of 42 DCD transplantations from 22 donors were performed in NZ during the study period. The majority of recipients were male (71%), with a mean age of 50.1 (±14.4) years. 57% of recipients developed delayed graft function (DGF) requiring renal replacement therapy for a mean duration of 7.25 (±5.7) days after transplantation. There was no primary graft non-function. All-cause graft survival was 90% at 1 year, 86% at 2 years, and 86% at 5 years. Death-censored graft survival was 100% at 1 year, 95% at 2 years, and 95% at 5 years. Mean creatinine was 181, 140, 139, 132 and 150 μmol/L at 1 month, 3 months, 6 months, 1 year and 5 years after transplantation, respectively.

Conclusion: DCD kidney transplantations in NZ had favourable long-term graft survival with good renal function despite high DGF in the initial post-transplantation period.

Back to Top | Article Outline

LONG-TERM OUTCOME OF KIDNEY TRANSPLANTATION IN PATIENTS WITH CONGENITAL ANOMALIES OF THE KIDNEY & URINARY TRACT

MCKAY Ashlene1, KIM Siah1,2, and KENNEDY Sean1,2

1Department of Nephrology, Sydney Children's Hospital, 2School of Women's & Children's Health, University of New South Wales, Sydney

Aim: Congenital anomalies of the kidney and urinary tract (CAKUT) are a leading cause of end stage kidney failure in the young. However, there is limited information on long term outcomes after kidney transplantation in this group. We explored the outcomes of kidney transplant in patients with the 3 most common severe forms of CAKUT; posterior urethral valves (PUV), reflux nephropathy and renal hypoplasia/dysplasia.

Methods: Data were extracted from ANZDATA on all first kidney transplants performed between 1976 and 2015 in recipients with a primary diagnosis of PUV, reflux nephropathy or renal dysplasia, who were younger than 30 years when they received their transplant. Using multivariate Cox regression, we compared death censored graft survival between the three groups.

Results: 142 patients with PUV, 272 with renal dysplasia and 938 with reflux nephropathy were included.10-year graft survival in PUV, renal dysplasia and reflux nephropathy was 67%, 72% and 64% respectively and 20-year graft survival was 32%, 51% and 43%.

After adjusting for age at transplant, era of transplantation, graft source and HLA matching, there was no significant difference in graft survival, although there was a trend to poorer outcome in PUV (HR 1.31, 95% CI 0.93 to 1.84).

Conclusions: Graft survival of first transplant in CAKUT is favourable at 10 years. We report a trend towards poorer graft survival for patients with PUV. Larger studies are required to determine whether the risk of graft failure is increased in patients with PUV.

FIGURE 1

Back to Top | Article Outline

COMPARISON OF KIDNEY ALLOGRAFT SURVIVAL IN THE EUROTRANSPLANT REGION AFTER CHANGING THE ALLOCATION CRITERIA IN 2010 – A SINGLE CENTER EXPERIENCE

MEHDORN Anne-Sophie1, BECKER Felix1, REUTER Stefan2, SUWELACK Barbara3, SENNINGER Norbert1, VOGEL Thomas1, PALMES Daniel3, and BAHDE Ralf3

1General, Visceral and Transplant Surgery, University Hospital Muenster, Germany, 2Department of Nephrology, University Hospital Muenster, Germany, 3University Hospital Muenster, Germany

In 2010 Eurotransplant introduced the European Senior Program (ESP), aiming to avoid waiting-list competition between young and elderly patients suffering from end stage renal disease and thus shorten waiting times for both groups. ESP donors must be older than 65 years, and grafts are preferably allocated regionally in order to shorten cold ischemia time, without primarily taking HLA matching into account. This study aims to compare a historic cohort with a cohort receiving grafts according to the new guidelines.

We stratified 159 eligible patients > 65 years (ESP (n=69) vs. former allocation criteria (n=89)) from the transplant center of Muenster, Germany, and analyzed patient and graft survival as well as surrogate markers of short- and long-term graft function (acute rejection, primary function (PF), delayed graft function (DGF), glomerular filtration rate (GFR)).

While donors were comparable in both groups, recipients in the ESP group were significantly older (69.51 ± 3.42 vs. 67.06 ± 2.59 years, p < 0.05), had a significantly shorter time on dialysis (13.64 ± 20.06 vs. 60.17 ± 28.06 months, p < 0.05) and suffered from more comorbidities. Cold and warm ischemia times were significantly reduced in the ESP group, which also had more grafts with PF. Long-term graft function was similar. Yet, graft survival was significantly better in the ESP group. Overall patient survival was comparable after five years.

Patients receiving grafts from older donors under the new ESP criteria were not disadvantaged compared with patients receiving grafts under the former allocation criteria.

Back to Top | Article Outline

DONATION AFTER CIRCULATORY DEATH COMPARED WITH DONATION AFTER BRAIN DEATH: OUTCOMES FOR ISLET TRANSPLANTATION IN AUSTRALIA

HAWTHORNE Wayne1,2, CHEW YiVee2, WILLIAMS Lindy2, HARON Christian2, HITOS Kerry1, MARIANA Lina3, KAY Tom3, O'CONNELL Philip2,4, and LOUDOVARIS Tom3

1Sydney Medical School, University of Sydney, Westmead Hospital, Sydney, 2Centre for Transplant and Renal Research, The Westmead Institute of Medical Research, 3Tom Mandel Islet Transplant Program, St Vincent's Institute, Melbourne, 4Western Clinical School, University of Sydney

Introduction: Islet cell transplantation provides long-term insulin independence, treating severe hypoglycaemic unawareness in patients with T1D. A significant shortage of organ donors leaves patients on the waitlist for years. Donation after Circulatory Death (DCD) donors may be a resource that could help address this shortage.

Materials and Methods: Islet donor pancreata from the Australian National Islet Transplant program were compared, with multiple donor and isolation variables analysed.

Results: A total of 27 DCD and 73 DBD islet donor pancreata were compared, with no significant differences in donor characteristics between DCD and DBD. Post-purification islet yield (IEQ) was significantly lower in the DCD group (146,518±28,971) than in the DBD group (256,986±17,652; P=0.001). Post-purification yield per gram of pancreas was also significantly lower in DCD (2,154±504 vs. 2,681±372 IEQ/g, P<0.0001). The quality and functionality of DCD and DBD islets also differed significantly in viability (P=0.017, higher in DBD) and purity (P=0.001, higher in DBD). The proportion of DCD islets transplanted (1/27) was significantly lower than DBD (29/73) (OR, 0.1093; 95% CI; P=0.001).

Conclusion: In the Australian setting with vast distances to ship pancreata we have had poorer outcomes from DCD pancreata for islet isolation and have thus far not yielded outcomes comparable to those from our DBD donors. Earlier intervention, the use of ante mortem heparin and faster logistics in transport may not only improve the DCD organs for transplantation but also help alleviate donor shortages allowing treatment of those with T1DM and severe hypoglycaemic unawareness.

Back to Top | Article Outline

IMMUNOSUPPRESSANT PRESCRIBING PRACTICES IN YOUNGER ADULTS COMPARED TO ELDERLY RENAL TRANSPLANT RECIPIENTS ACROSS AUSTRALIA AND NEW ZEALAND

COSSART Amelia1, COTTRELL Neil1, MCSTEA Megan2, ISBEL Nicole3, CAMPBELL Scott3, and STAATZ Christine3

1School of Pharmacy, University of Queensland, Brisbane, 2University of Queensland, Brisbane, 3Department of Nephrology, University of Queensland at the Princess Alexandra Hospital

Background: Kidney transplantation is first-line treatment for patients with end-stage renal failure. Optimising immunosuppressant regimens is crucial; current guidelines make no specific recommendations for elderly patients.

Aim: To evaluate the immunosuppressant medicine prescribing differences of elderly and younger adult renal transplant recipients across Australia and New Zealand.

Methods: A descriptive study of data obtained from the ANZDATA (Australia and New Zealand Dialysis and Transplant) registry, including all patients transplanted from 2000 to 2015, was conducted. Patients were categorised according to age: younger adults (<70 years) and elderly (≥70 years). The types and doses of immunosuppressant medicines prescribed were compared between groups using descriptive statistics (Mann-Whitney test or chi-square test, as appropriate).

Results: A total of 6,930 patients were included in the analysis; 39% of younger adults and 41% of elderly patients were female, with an average age of 48 and 72 years respectively. The three most commonly prescribed immunosuppressant drugs were prednisolone, mycophenolate and tacrolimus; with 87% of younger adults and 89% of elderly patients taking three immunosuppressant medicines. Initial doses of mycophenolate and tacrolimus were significantly lower in elderly patients (p<0.05), and this trend continued at one-year, with doses of mycophenolate, tacrolimus, cyclosporin A and azathioprine significantly lower in elderly recipients (p<0.05; Figure 1). Elderly patients also had greater median reductions from initial to one-year post transplant in their doses of mycophenolate and azathioprine (p<0.05).

FIGURE 1

Conclusions: In our sample, immunosuppressant medicine doses were reduced more in elderly patients. Further investigation of drug levels and patient outcomes in the elderly is warranted.

Back to Top | Article Outline

RENAL TRANSPLANT PATIENT AND GRAFT SURVIVAL UNAFFECTED BY POST-TRANSPLANT DIABETES IN THE ERA OF LOW MAINTENANCE IMMUNOSUPPRESSION

PIMENTEL AL1,2, MASTERSON R2, YATES C3,4, HUGHES P2, and COHNEY S5,6,7

1Graduate Program in Endocrinology, Universidade Federal do Rio Grande do Sul (UFRGS), 2Department of Nephrology, Melbourne Health, 3Department of Diabetes and Endocrinology, Melbourne Health, 4Department of Endocrinology, Western Health, 5Department of Nephrology, Western Health, 6Department of Medicine, University of Melbourne, 7Department of Epidemiology, Monash University, Melbourne

Aims: Preexisting diabetes (PEDM) and newly detected diabetes after transplant (PTDM) have been associated with reduced patient and graft survival; however, outcome data since the adoption of lower maintenance immunosuppression are sparse. This study examined outcomes according to diabetes status in patients undergoing renal transplantation between December 2004 and 2009, with patients receiving prednisolone ≤ 5mg, MMF ≤ 500mg b.d. and tacrolimus ≤ 4ng/ml beyond 12 months.

Methods: All patients transplanted between December 2004-2009 were analyzed using prospectively collected data from an electronic database, patient records and ANZDATA. Diabetes status was determined using HbA1c, blood glucose levels and/or use of glucose lowering therapy.

Results: 534 patients were assessed, 7 of whom received more than 1 kidney transplant. Mean age was 45.2±14.1 years; 64.6% were male; 63 had PEDM and 86 PTDM (64 diagnosed within 12 months, 22 subsequently). After a mean follow-up of 9.2±2.2 years, patient survival was 89.9%, 81% and 90.6%, respectively, in non-DM, PEDM and PTDM diagnosed within the first year. When considering PTDM diagnosed at any time, patient survival was 87.2%. Mean tacrolimus level was 3.7±2.3 ng/mL at 1 year and <4 ng/mL at 4 years. Graft survival was 70.5%, 69.8% and 73.3%, respectively, in non-DM, PEDM and those with PTDM diagnosed within 12 months, and 76.6% when considering patients diagnosed with PTDM at any time (Figure 1).

FIGURE 1

Conclusions: In this large single centre analysis of renal transplant recipients receiving more contemporary immunosuppression, PTDM had no impact on patient or graft survival, though there was a statistically significant reduction in patient survival in patients with PEDM.

Back to Top | Article Outline

RANGE AND CONSISTENCY OF CARDIOVASCULAR OUTCOMES REPORTED IN CONTEMPORARY RANDOMISED TRIALS IN KIDNEY TRANSPLANT PATIENTS: A SYSTEMATIC REVIEW

VAN Kim Linh1,2, O'LONE Emma1,2, TONG Allison1,2, VIECELLI Andrea3, HOWELL Martin1,2, SAUTENET Benedicte4, MANERA Karine1,2, and CRAIG Jonathan1,2

1Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 2School of Public Health, University of Sydney, 3University of Queensland, Brisbane, 4University of Tours

Background: Cardiovascular disease (CVD) is the primary cause of death and a major contributor to graft loss in kidney transplant recipients. However, inconsistent reporting of cardiovascular outcomes may limit assessment of the comparative effect of interventions across trials and the use of trial evidence in decision-making.

Aims: To determine the scope and consistency of cardiovascular outcomes reported in contemporary trials in kidney transplant recipients.

Methods: MEDLINE, Embase, the Cochrane Kidney and Transplant Specialized Register, and ClinicalTrials.gov were searched from 2013 to 2017 to identify randomised trials and trial protocols reporting any cardiovascular outcome. Definitions, measures and timepoints for all CVD outcomes were extracted and analysed.

Results: From 81 trials, 1097 different CVD measures were extracted and categorised into 37 CVD outcomes. The three most frequently reported outcomes were: cardiovascular composites (35 [43%] trials), all-cause mortality (29 [36%] trials), and acute coronary syndrome (28 [35%] trials). Cardiovascular composites were reported in 33 different combinations of components, 29 of which were unique to a single trial.

Conclusions: There is extreme heterogeneity in the reporting of cardiovascular outcomes in trials in kidney transplant patients, and CVD composite outcomes vary widely. Establishing standardised CVD outcomes that are critically important to patients and clinicians may improve the relevance and use of trials to inform decision-making.


DO INDIGENOUS PATIENTS HAVE BETTER SURVIVAL WITH A KIDNEY TRANSPLANT COMPARED TO STAYING ON DIALYSIS? A PROPENSITY MATCHED STUDY

LAWTON Paul1, CUNNINGHAM Joan1, ZHAO Yuejen2, JOSE Matthew3, and CASS Alan1

1Wellbeing & Preventable Chronic Diseases Division, Menzies School of Health Research, Charles Darwin University, 2Innovation & Research Branch, Department of Health, Northern Territory Government, 3Department of Medicine, University of Tasmania

Aim: Indigenous patients are less likely to be wait-listed for or to receive a kidney transplant. Many clinicians are concerned about Indigenous transplant outcomes. We compared survival for Indigenous transplant patients with similar Indigenous dialysis-only patients, contrasting with non-Indigenous patients.

Methods: Using ANZDATA, all Australians commencing renal replacement therapy from 1st April 1995 were followed until 31st December 2015. Transplant recipients were paired by propensity score with similar dialysis-only patients of the same ethnicity within four time cohorts: time at risk for each pair was taken from the transplant date. All-cause survival was compared using unadjusted and stratified Cox proportional hazards models for three time periods post-transplant (accounting for non-proportional hazards), adjusted for demographic and clinical differences and a transplanted-remoteness interaction term.
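The pairing step described above can be sketched in code. The following is an illustrative sketch only (not the authors' analysis code): greedy 1:1 nearest-neighbour matching on a precomputed propensity score within a caliper. The patient IDs, scores and caliper width are hypothetical.

```python
# Greedy 1:1 propensity-score matching within a caliper (illustrative sketch).

def greedy_match(treated, controls, caliper=0.05):
    """Pair each treated score with the closest unused control score.

    treated, controls: lists of (patient_id, propensity_score) tuples.
    Returns a list of (treated_id, control_id) pairs.
    """
    pairs = []
    unused = dict(controls)  # id -> score
    # Match hardest-to-match (highest propensity) treated patients first.
    for tid, ps in sorted(treated, key=lambda x: -x[1]):
        if not unused:
            break
        cid = min(unused, key=lambda c: abs(unused[c] - ps))
        if abs(unused[cid] - ps) <= caliper:
            pairs.append((tid, cid))
            del unused[cid]
    return pairs

# Hypothetical transplanted vs dialysis-only patients with fitted scores.
transplanted = [("T1", 0.81), ("T2", 0.42), ("T3", 0.10)]
dialysis_only = [("D1", 0.80), ("D2", 0.44), ("D3", 0.55)]
print(greedy_match(transplanted, dialysis_only))  # → [('T1', 'D1'), ('T2', 'D2')]
```

T3 finds no control within the caliper and remains unmatched, which is why matched designs like this one may exclude some transplant recipients from the paired analysis.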

Results: Indigenous dialysis-only patients were similar to their transplanted pair at baseline, but paired non-Indigenous patients were less similar. Unadjusted five year survival was better for transplanted patients than their dialysis-only pair for non-Indigenous (p<0.0001) and Indigenous patients (p=0.0005). Adjusted Cox models comparing transplanted with dialysis-only patients showed early (0-0.25 years post-transplant) survival equivalence for both Indigenous and non-Indigenous patients, with improvements in subsequent transplanted survival clearest for all non-Indigenous patients except from very remote areas, and for Indigenous patients in major cities (MC) and inner regional (IR) areas.

Conclusions: Indigenous transplanted patients have similar or better survival compared with matched dialysis-only patients, with long-term benefit in MC/IR areas. Relatively few apparently suitable Indigenous patients received transplants. These data provide direction for future targeted clinical and health services research.

Figure



Cells and Tissues/Donor Specific Antibodies

SURVIVAL AND FUNCTION OF HUMAN ADRENAL CELLS IN AN IMMUNOISOLATION DEVICE IN ADRENALECTOMIZED IMMUNODEFICIENT MICE

CATTERALL T1, KRISHNA MURTHY B1, MARIANA L1, KOS C1, SACHITHANANDAN N2, THOMAS H1, LOUDOVARIS T1, and KAY T1

1Immunology & Diabetes, St Vincent's Institute, Melbourne, 2Department of Endocrine and Metabolism, St Vincent's Hospital, Melbourne

Background: Primary adrenal insufficiency (PAI) is caused by failure of the adrenal gland to produce steroid hormones - glucocorticoids and mineralocorticoids - and is a potentially lethal disease. Synthetic steroid hormones have transformed PAI from a lethal condition to a chronic one. However, management of PAI remains challenging for patients and clinicians, as current regimens do not restore or replicate normal cortisol secretion either in health or during illness and stress. Hence, with current treatment, mortality is not normalized and quality of life is poor.

Aim: To treat PAI patients with adrenocortical cells in immunoisolation devices to restore physiological steroid hormone secretion without needing immunosuppression.

Method: We studied the survival and function of isolated human adrenal cells in vitro and in vivo in NRG (SCID-mutated NOD) immunodeficient mice. About 300×10⁶ adrenocortical cells per human adrenal gland are routinely obtained, with >80% viability and in vitro survival and function for more than 14 days. A cohort of 10 immunodeficient mice was implanted with an immunoisolation device in the epididymal or ovarian fat pad to allow vascularisation to establish. After 4 weeks, mice underwent bilateral adrenalectomy and 5 million human adrenocortical cells were transplanted into the device. One mouse died almost 4 weeks after adrenalectomy; the remaining mice are healthy, have been followed for >10 weeks, and are secreting cortisol and responding to stimulation with synthetic ACTH 1-24 (Synacthen).

Conclusion: Results indicate survival and function of human adrenal cells in the vascularised encapsulation device.


DEVELOPING PHOSPHOLIPASE A2 RECEPTOR ScFv FOR CAR TREGS FOR THE TREATMENT OF AUTOIMMUNE RENAL DISEASE

KARUNIA J1, WANG YM2, ZHANG GY2, WILARAS A2, BAKHTIAR M2, MCCARTHEY H2, and ALEXANDER SI1

1Centre for Kidney Research, Children's Hospital at Westmead, 2Centre for Transplant and Renal Research, Westmead Institute for Medical Research, 3Children's Hospital at Westmead

Background: Idiopathic membranous nephropathy (IMN) is a leading cause of autoimmune renal disease, driven in many cases by the recently described cognate antigen M-type phospholipase A2 receptor (PLA2R) expressed on glomerular epithelium. Chimeric antigen receptor (CAR) T cells use antibody fragments to direct T cells to specific antigens and have achieved clinical success in cancer. This strategy can be translated to treat IMN, as PLA2R is exclusively expressed on the podocyte lining of the kidney.

Aims: In this project, we aim to use PLA2R as a target antigen for treating IMN by designing a single-chain variable fragment (ScFv) for PLA2R-CAR-Tregs, derived from a PLA2R-specific monoclonal antibody that recognises this antigen on human, mouse and rat podocytes.

Method: Using genetic sequence search tools, the PLA2R amino acid sequences of human, mouse and rat were aligned and compared to generate three common peptide immunogens. Using a conditionally-immortalized podocyte cell line (ciPod), we examined M-type PLA2R expression on human podocytes immunohistochemically in vitro as an assay for antibody testing. Mice were immunized with the PLA2R peptides to produce monoclonal antibodies against PLA2R. Hybridomas were established and screened, and the hybridoma antibody sequenced for use in making the ScFv for the CAR construct.

Results: We have confirmed expression of M-type PLA2R on the cell membrane of human podocytes in vitro by immunohistochemical staining. Anti-PLA2R monoclonal antibody (mAb) has been detected in the sera of immunized mice by Western blot and ELISA. The mAb hybridoma is being sequenced, and the anti-PLA2R mAb from these hybridomas is reactive with the human M-type PLA2R.

Conclusion: We have developed hybridomas against a podocyte target antigen that is also a disease antigen in membranous nephropathy, and are developing this as a kidney-targeting strategy.


COMPARISON OF PANCREATA AND ISLET PREPARATIONS FROM HUMAN ORGAN DONORS

MARIANA Lina, LOUDOVARIS Thomas, KOS Cameron, PAPAS Evan, SELCK Claudia, CATTERALL Tara, THOMAS Helen and KAY Thomas WH

Immunology & Diabetes, St Vincent's Institute, Melbourne

Background: In the past ten years we have received over 300 pancreata, most of which were processed into islets for transplant and/or research. Forty-nine of the donors resulted in transplants (into 26 diabetic recipients, 5 as autotransplants); 49 donors were diabetic (T1D and T2D), 24 were donation after circulatory death (DCD) donors, and the remainder were heart-beating brain-dead (BD) donors. Many factors influence the outcome of islet isolation. Here we compare the donor, pancreas and islet-preparation characteristics of transplantable isolations with isolations that failed to meet transplant criteria, including diabetic and DCD donors.

Methods: Islets were isolated based on the Ricordi method. The Edmonton Score was used to indicate donor quality; it incorporates age, CIT, BMI, cause of death, hospital stay, amylase/lipase, procuring team, medical history, pancreas fat content, quality of flush and damage.

Results:

Table


As shown in the table, the quantity of islets/g pancreas in diabetic donors was significantly less than in non-diabetic donors both pre-purification and post-purification. Similar results were found for glucose-stimulated insulin secretion among the groups. T1D pancreata were not only deficient in islet numbers but were also significantly smaller than those of the other groups.

While there was no difference in body weight between the two groups (T2D: 86.02±16 kg vs non-diabetic: 85.07±21 kg), the IEQ/kg body weight was significantly different (T2D: 1907±1410 vs non-diabetic: 4344±2577; t-test p<0.0001).

Conclusion: The quality of donors in the transplant group was significantly higher than in the other groups as measured by the Edmonton Score. T2D donors are insulin resistant and have islet function deficiencies. Our data show that fewer islets can be isolated from T2D than from non-diabetic donors, further validating their exclusion as islet transplant donors.


MACHINE LEARNING PREDICTION FOR DE NOVO DONOR SPECIFIC ANTIBODIES (DNDSA) AND GRAFT LOSS IN SIMULTANEOUS KIDNEY PANCREAS TRANSPLANT (SPK) RECIPIENTS

COOREY Craig1,2, SHARMA Ankit1,2, CHAPMAN Jeremy3, CRAIG Jonathan1,2, O'CONNELL Philip3, LIM Wai4, NANKIVELL Brian3, TAVERNITI Anne2, WONG Germaine1,2,3, and YANG Jean5,6

1School of Public Health, University of Sydney, 2Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 3Centre for Transplant and Renal Research, Westmead Institute for Medical Research, Westmead Hospital, Sydney, 4Department of Renal Medicine, Sir Charles Gairdner Hospital, Perth, 5School of Mathematics and Statistics, University of Sydney, 6Charles Perkins Centre, University of Sydney

Aim: To develop a prediction model for dnDSA and allograft loss based on the location of eplet mismatches in SPK transplant recipients.

Methods: A total of 198 SPK transplant recipients (1990-2017) were assessed using data from the ANZDATA registry and the National Organ Matching System. Machine learning models (random forests) were used to predict dnDSA and allograft loss based on the location of eplet mismatches. The sites of the three most important eplet mismatches were determined using 'mean decrease in accuracy'.
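The 'mean decrease in accuracy' importance measure used above can be sketched as follows. This is a toy illustration, not the study's code: the data, the trivial stand-in "model" and the feature layout are all hypothetical; in a real random forest the drop is averaged over out-of-bag samples across trees.

```python
# Permutation importance ("mean decrease in accuracy"): shuffle one feature
# column, re-score the model, and record how far accuracy drops.
import random

def accuracy(model, X, y):
    return sum(model(row) == label for row, label in zip(X, y)) / len(y)

def mean_decrease_in_accuracy(model, X, y, feature_idx, n_repeats=100, seed=0):
    rng = random.Random(seed)
    base = accuracy(model, X, y)
    drops = []
    for _ in range(n_repeats):
        col = [row[feature_idx] for row in X]
        rng.shuffle(col)  # break the feature/outcome link
        X_perm = [row[:feature_idx] + [v] + row[feature_idx + 1:]
                  for row, v in zip(X, col)]
        drops.append(base - accuracy(model, X_perm, y))
    return sum(drops) / n_repeats

# Toy data: feature 0 perfectly predicts the outcome, feature 1 is noise.
X = [[0, 1], [1, 0], [0, 0], [1, 1]] * 5
y = [row[0] for row in X]
model = lambda row: row[0]  # stand-in "trained" model that reads feature 0
print(mean_decrease_in_accuracy(model, X, y, 0))  # large drop: important
print(mean_decrease_in_accuracy(model, X, y, 1))  # 0.0: unimportant
```

Features whose permutation barely moves accuracy (like feature 1 here) are ranked unimportant; the abstract's three top-ranked eplet mismatch sites are those with the largest average drops.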

Results: The cohort included 111 (56%) males; mean age was 38.5 years (SD 6.9) and median follow-up was 6.6 years (IQR 3.9-11.0). The most common Class I and II eplet mismatches were at 156RA (35%), 82LR (35%) and 76EN (33%); and 70D (55%), 56PD (54%) and 67I (52%), respectively. A total of 38 (20%) and 56 (32%) recipients developed Class I and II dnDSA, and 14 (7%) and 29 (15%) patients experienced kidney and pancreas graft loss, respectively. The random forest model with the location of eplet mismatches as features achieved mean cross-validation errors of 47.6% and 49.8% for Class I and II dnDSA, and 52.3% and 49.1% for kidney and pancreas allograft loss (Table 1). For dnDSA prediction, the three most important Class I eplet mismatches lie within HLA-A antigens, while for Class II eplet mismatches only DQB1*03 is implicated.

TABLE 1


Conclusions: The location of the most important eplet mismatches for prediction differed between dnDSA and allograft loss, but random forest model performance was largely indistinguishable.


HLA EPLET MISMATCH AND DONOR SPECIFIC ANTIBODIES IN KIDNEY TRANSPLANTATION

WAN Susan1,2, ANGEL DE WILDE Sian2, ROSALES Brenda3, CHADBAN Steven1,4, and WYBURN Kate1,2

1Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 2Sydney Medical School, University of Sydney, 3School of Public Health, University of Sydney, 4Other, University of Sydney

Background: Eplet mismatch provides higher-resolution information than HLA antigen matching and has the potential to better predict alloimmune events, including donor-specific antibodies (DSA). However, limited prospective data exist on the association between eplet mismatch and DSA.

Aim: To determine the relationship between eplet mismatch and DSA.

Methods: We characterised the number of HLA-A, B, C, DR and DQ eplet mismatches in kidney transplant recipients from 2010-2017 using HLA Matchmaker. Molecular HLA typing was converted from low-resolution (2-digit) to high-resolution (4-digit) using the HLA Matchmaker Converter where necessary. All patients were prospectively screened for DSA at 0, 3 and 12 months post-transplant. Associations between eplet mismatches, pretransplant DSA (preDSA), de novo DSA (dnDSA) and clinical outcomes were assessed using multivariable analysis.

Results: Of 313 recipients, high-resolution HLA conversion was not possible for 147 (47%) due to the absence of ethnicity (n=83) or haplotype (n=64) data in the Converter database. Eplet mismatch determination was therefore possible for 166 donor-recipient pairs, of whom DSA screening was complete for 150. The mean number of Class I and II eplet mismatches was 14 (±7.7) and 17 (±11.8), respectively (Table 1). DSA were detected in 111 recipients (74%): 64 (43%) had preDSA, 30 (20%) had dnDSA, and 17 (11%) had both. The number of eplet mismatches was associated with preDSA (OR 1.04; 95% CI 1.01-1.07; P=0.007), but not with dnDSA or acute rejection.
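To put the per-eplet odds ratio of 1.04 in perspective: under a logistic model the odds of preDSA multiply by 1.04 for each additional mismatch, so the effect compounds over the mismatch load. This back-of-envelope sketch is ours, not the authors'; the comparison loads below are hypothetical.

```python
# Compounding a per-unit odds ratio over k additional units.
per_eplet_or = 1.04  # fitted OR per eplet mismatch (from the abstract)

def cumulative_or(n_mismatches, or_per_unit=per_eplet_or):
    """Odds ratio implied by n additional mismatches under the logistic model."""
    return or_per_unit ** n_mismatches

# e.g. a hypothetical high Class I load (31 eplets) vs the ~14-eplet mean:
print(round(cumulative_or(31 - 14), 2))  # → 1.95, i.e. roughly doubled odds
```

A seemingly small per-eplet OR can therefore translate into a substantial difference between low- and high-mismatch pairs.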

TABLE 1


Conclusion: Calculated eplet mismatches were not predictive of dnDSA development or acute rejection, raising doubt about the utility of HLA Matchmaker based eplet matching to predict post-transplant alloimmune events.


DONOR SPECIFIC ANTIBODIES AND CLINICAL OUTCOMES IN KIDNEY TRANSPLANT RECIPIENTS

WAN Susan1,2, CHADBAN Steven1,2, and WYBURN Kate1,3

1Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 2Sydney Medical School, University of Sydney, 3Other, University of Sydney

Background: Donor specific antibodies (DSA) are implicated in acute rejection (AR) and graft dysfunction in kidney transplant recipients (KTx). However, limited data exists on their natural history post-transplantation.

Aims: To describe the natural history of pre-transplant DSA (preDSA) and de novo DSA (dnDSA) in KTx.

Methods: We performed a prospective single-centre cohort study in KTx. Patients were screened for DSA at 0, 3 and 12-months post-transplant, and associations between DSA and outcomes were assessed.

Results: 363 KTx between 2010 and 2017 underwent pre- and post-transplant DSA screening. 136 (37%) had preDSA at transplantation; 86 (63%) had Class I and 89 (65%) had Class II. The median MFI of the dominant preDSA was 1230 (IQR 746-2528) at transplantation and declined rapidly, becoming undetectable by 1 month post-transplant. dnDSA were detected in 62 (17%) recipients; 28 (45%) had Class I and 45 (73%) had Class II. The median time to first detection of dnDSA was 58 days (IQR 15-267) and the median MFI of the dominant dnDSA was 1150 (IQR 707-2694) at first detection, increasing over time to reach a median of 14,011 (IQR 931-20,626) at 2 years (Figure 1). 58 (26%) of 220 patients with ≥2 years of follow-up developed AR; 42 (19%) had cell-mediated rejection and 20 (9%) had antibody-mediated rejection. The development of dnDSA was strongly associated with AR (OR 4.48; 95% CI 2.14-9.36; P<0.001) but not with eGFR or graft survival.

FIGURE 1


Conclusion: PreDSA were present in 37% of KTx and were significantly reduced by 1-month post-transplant. dnDSA were detected in 17% and increased in intensity over time. The development of dnDSA was strongly associated with AR.


RISK STRATIFICATION FOR REJECTION BY EPLET MISMATCH AFTER EARLY MYCOPHENOLATE DOSE REDUCTION IN KIDNEY TRANSPLANT RECIPIENTS

COUGHLAN Timothy1, CANTWELL Linda2, and LEE Darren1

1Department of Renal Medicine, Eastern Health, 2Victorian Transplantation and Immunogenetics Service, Australian Red Cross Blood Service

Aims: Eplet mismatch (EpMM) is associated with de novo donor-specific antibodies and long-term graft loss in kidney transplant recipients (KTR), especially with poor adherence and low tacrolimus levels. We investigated whether EpMM predicts rejection after mycophenolate dose reduction within the first year.

Methods: Data on KTR receiving tacrolimus, mycophenolate and prednisolone in a single centre (February 2011 – January 2017) were retrospectively analysed, excluding those with rejection within the first month. We explored the association of conventional HLA mismatches, EpMM (HLAMatchmaker, antibody-verified and unverified) and mycophenolate dosing with acute rejection in those with early mycophenolate reduction (<1.5 g/d within the first year).

Results: Of the 63 eligible patients (median follow-up: 3.1 years, 12 months minimum), 44 had early mycophenolate reduction, mostly due to cytopaenia (68%). There was no difference in rejection rates with or without early dose reduction (27% vs 28%). Within the dose-reduced cohort, there was no significant difference in conventional HLA mismatches (4.3±1.8 vs 3.3±1.9, p=0.12), or total (51±27 vs 55±35, p=0.73), class I (17±8 vs 15±8, p=0.35), class II (34±22 vs 30±29, p=0.51) or HLA-DQ (16±15 vs 21±18, p=0.74) EpMM loads between rejectors and non-rejectors. No difference in rejection rates was observed between those with >17 (n=22) vs ≤17 (n=22) HLA-DQ EpMM (32% vs 23%, p=0.73). There was also no significant difference in the duration of mycophenolate dosing <1.5g/d and <1.0g/d, or the nadir dose between rejectors and non-rejectors.

Conclusions: EpMM was not associated with acute rejection in KTR with early mycophenolate dose reduction within the first year.


IN VIVO DEPLETION OF REACTIVE DONOR HUMAN CELLS REDUCES THE DEVELOPMENT OF GRAFT-VERSUS-HOST DISEASE IN A HUMANISED MOUSE MODEL

ADHIKARY Sam, GERAGHTY Nicholas, SLUYTER Ronald, and WATSON Debbie

Illawarra Health and Medical Research Institute, University of Wollongong

Graft-versus-host disease (GVHD) is a common, life-threatening consequence of allogeneic donor bone marrow transplantation. Reactive donor cells are the main effectors of GVHD, and depletion of these cells reduces GVHD severity in allogeneic mouse models, but data in humanised mouse models are limited.

Aim: This study aimed to investigate the effect of depleting reactive donor human cells on the development of GVHD in a humanised mouse model.

Methods: NOD-SCID-IL2Rγnull (NSG) mice were injected (i.p.) with 20×10⁶ human (h) peripheral blood mononuclear cells (PBMCs) to induce GVHD, and subsequently injected with post-transplant cyclophosphamide (PTCy, 33 mg/kg) or saline on days 3 and 4 post-hPBMC injection. Mice were monitored for GVHD development for 10 weeks, with human cell engraftment examined by flow cytometry at 3 weeks post-hPBMC injection and at end-point.

Results: PTCy did not affect the engraftment of human cells in NSG mice at 3 weeks post-hPBMC injection. PTCy lowered the development of GVHD in humanised mice, significantly reducing weight loss (P=0.0447) and GVHD clinical score (P=0.0478), and increasing survival (MST=52 days) compared to saline-injected mice (MST=28 days) (P=0.0004). Additionally, PTCy significantly increased the proportion of hCD4+ T cells, and significantly lowered the proportions of hCD8+ T cells and hCD4+hCD25+hCD127lo regulatory T cells. Finally, PTCy did not affect relative expression of pro-inflammatory cytokines, including hIFN-γ and hIL-17, in the liver, spleen or small intestine of humanised mice.

Conclusion: Depletion of reactive human cells reduces GVHD development in this humanised mouse model, supporting its use in future studies investigating depletion strategies against GVHD.


VALIDATION OF THE ONE LAMBDA FLOWDSA™ ASSAY FOR LIVING DONOR TRANSPLANT WORKUP

BAZELY Scott, TASSONE Gabriella, D'ORSOGNA Lloyd, MARTINEZ Patricia and DE SANTIS Dianne

Clinical Immunology, PathWest, Fiona Stanley Hospital, Perth

Introduction: Routine flow cytometric crossmatches (FCXM) detect donor-specific HLA IgG alloantibodies in transplant recipients. One constraint of FCXM is false positives, which do not reflect transplant outcomes, arising from autoantibodies, non-HLA antibodies, immune complexes and other interfering factors, including treatment regimens in desensitisation protocols (e.g. rituximab). The new One Lambda FlowDSA™ assay specifically labels recipient IgG alloantibodies bound to the donor cell surface, thereby distinguishing alloantibodies from autoantibodies. In this kit, HLA molecules are separated into three groups: Class I, Class IIa (DQ), and Class IIb (DR, DP). The aim of this validation was to determine whether the assay could overcome the limitations of current flow crossmatch assays. Validation of the FlowDSA™ assay required compensation to correct for spectral overlap between fluorochromes, and establishment of cut-offs to define the crossmatch interpretation.

Methods: FlowDSA™ compensation was performed using a bead-only control and an HLA-positive control to correct PE and PerCP spectral overlap and separate the Class I, Class IIa and Class IIb populations. Sera positive for known donor-specific antibodies (DSA) and negative sera were evaluated using the FlowDSA™ assay, and the results were compared with those obtained by current methods.

Results: All three HLA molecule groups were distinguished as separate populations. The FlowDSA™ assay reported a positive crossmatch in the presence of strong DSA and a negative crossmatch in the absence of DSA.

Conclusions: The FlowDSA™ assay appears to be an alternative to current flow crossmatch methods. Establishing suitable positive and negative crossmatch cut-offs, the ability to detect weak DSA, and the rate of false positives due to interference from non-HLA factors (including rituximab) will be important in determining its suitability for routine clinical use.


IN VITRO SCREENING OF GENES ASSOCIATED WITH KIDNEY FIBROSIS

MA Xiaoqian1,2, SUN Lei1, LU CAO1,2, YI Shounan1, and O'CONNELL Philip1

1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Institute for Cell Transplantation and Gene Therapy, The Third Xiangya Hospital of Central South University

Aims: Chronic injury in kidney transplants remains a major cause of allograft loss. Our GoCAR multicenter study identified a set of 13 genes that was independently predictive of the development of fibrosis at 1 year after kidney transplantation; the predictive capacity of the gene set was superior to that of clinical indicators. The aim of this study was to identify which of these genes are associated with the pathogenesis of kidney fibrosis.

Methods: The murine C1.1 tubular epithelial cell line and the FOXO−/− and TCF−/− C1.1 cell lines were treated with or without TGF-β for 48 h. The cells were then harvested for real-time PCR to detect expression of the 13 genes.

Results: Expression of four genes changed markedly. FJX1 and KLHL13 were expressed at low levels in C1.1 and FOXO−/− cells but were upregulated by TGF-β treatment; in FOXO−/− cells in particular, TGF-β induced more than 10-fold higher expression, suggesting that FJX1 and KLHL13 may play an important role in the profibrotic response. CHCHD10 showed the opposite pattern to FJX1 and KLHL13 in FOXO−/− and TCF−/− cells: it was downregulated in FOXO−/− and upregulated in TCF−/− cells upon TGF-β treatment. ASB15 expression was almost undetectable in C1.1 and FOXO−/− cells with or without TGF-β, but was more than 100-fold higher in TCF−/− cells.

Conclusions: These results suggest that the four genes may be involved in fibrosis signalling pathways; we will further confirm their function using CRISPR/Cas9.


IMMUNE PHENOTYPE BY FLOW CYTOMETRY OF PEDIATRIC KIDNEY TRANSPLANT RECIPIENTS AND HEALTHY ADULT CONTROLS

JIMENEZ-VERA Elvira1, ZHAO Yuanfei1, HU Min1, CHEW Yi Vee1, BURNS Heather1, ANDERSON Patricia2, WILLIAMS Lindy1, DERVISH Suat3, WANG Xin Maggie3, YI Shounan1, HAWTHORNE Wayne1, ALEXANDER Stephen4, and O'CONNELL Philip1

1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 3Flow Cytometry Core Facility, Westmead Millennium Institute, Westmead Hospital, Sydney, 4Centre for Kidney Research, The Children's Hospital at Westmead, Sydney

Aim: To determine the differences in the immunophenotype of pediatric kidney transplant recipients and healthy adult controls.

Methods: Seven leukocyte-profiling panels, each containing 8–10 markers, were used to monitor the immune profiles of 9 paediatric kidney transplant and 8 adult control samples. Whole-blood (1.5 ml) samples were stained and acquired on a BD LSRFortessa, and FlowJo was used for data analysis.

Results: Differences in subpopulations between pediatric patients and healthy adult controls are shown in Table 1. Pediatric patients showed a significant increase in absolute numbers of granulocytes, CD14+ monocytes, double-negative NKT cells and CD56hiCD16+ intermediate NK cells. The B cell panel showed a significant increase in naïve B cells, IgD+IgM+ B cells, and IgM+CD27- B cells. Naïve CD4+ and CD8+ T cells and naïve Tregs were also higher than in adult controls, and naïve Foxp3 Tregs in pediatric transplant patients had higher CD25 expression. Memory CD4+ T cells in pediatric patients had similar HLA-DR expression. Further, we found a significant decrease in the following cell populations in pediatric kidney transplant patients: non-classical monocytes, CD8+ NKT cells, memory B cells, IgD-IgM- B cells, CD27+CD38low class-switched memory B cells, CD27-CD28+ and CD25+ subsets of CD4+ T cells, CD57+CD27-CD28+ and CD25+ subsets of CD8+ T cells, CCR7-/CD62L-CD45RA- subsets of CD4+ and CD8+ T cells, CXCR3+CD45RO+ subsets of CD4+ and CD8+ T cells, and CD127+CD45RO+ and CD25+CD45RO+ CD4+ T cells (effector Foxp3 Tregs).

TABLE 1


Conclusion: Immune profiling of pediatric transplant recipients demonstrated more naïve T cells, B cells and Tregs, and fewer memory and effector memory T cells, compared with healthy adult controls.


COMPARISON OF 3 LYMPHOCYTE SEPARATION METHODS FOR THE FLOW CROSSMATCH ASSAY

TASSONE Gabriella, BAZLEY Scott, D'ORSOGNA Lloyd, MARTINEZ Patricia and DE SANTIS Dianne

Clinical Immunology, Fiona Stanley Hospital, PathWest

Introduction: The Stem Cell EasySep Direct Total Human Lymphocyte Isolation kit™ (DTHLI) uses immunomagnetic bead technology to bind non-lymphocytes within the sample; the beads are then removed with a magnet, leaving the lymphocytes in the supernatant. The Stem Cell SepMate™ gradient centrifugation tubes (SepMate) use a plastic insert to keep the Ficoll at the base of the tube during centrifugation, allowing the lymphocyte layer to be poured off.

Method: The SepMate, DTHLI, and the current routine Ficoll gradient isolation methods were compared. The isolation time, cell yield and suitability for the routine flow crossmatch assay (FCXM) were assessed.

Results: The SepMate and DTHLI methods were more rapid (30 and 45 minutes, respectively) than the current method (3 hours). The cell yields obtained by SepMate and the current method were sufficient to perform the FCXM on untreated and pronase-treated serum, while DTHLI provided sufficient cells to perform the FCXM only on pronase-treated serum. Despite the lower cell yield, the purity of CD3+ and CD19+ cells was superior with the DTHLI isolation method. The lymphocyte preparations were then evaluated in the FCXM using a serum known to contain donor-specific antibody (DSA) to donor mismatches and a negative serum.

Conclusion: Both SepMate and DTHLI were more rapid than the current method. The cell yield of SepMate was comparable to the current method; the yield from DTHLI was lower than both. However, DTHLI isolated a greater proportion of CD3+ and CD19+ cells, so the total number of lymphocytes required to perform the FCXM may be less than currently required. All three methods produced comparable flow crossmatch results.


Surgical

ENHANCED RECOVERY AFTER SURGERY AND THE RENAL TRANSPLANT RECIPIENT – USEFUL OR A WASTE OF TIME?

LAMBERT Virginia, CHANDRA Abhilash, RUSSELL Christine, OLAKKENGIL Santosh and BHATTACHARJYA Shantanu

Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital

Enhanced Recovery After Surgery (ERAS) pathways are an accepted part of modern surgical practice. In renal transplant recipients, however, there is no clear consensus regarding their utility.

The renal transplant unit at the Royal Adelaide Hospital introduced an ERAS protocol for the perioperative management of transplant recipients from June 2017. We present the outcomes of 39 consecutive cases.

Patients and Methods: All patients who had a renal transplant from the introduction of the protocol until the time of writing were enrolled in the ERAS protocol. The protocol included pre-operative weight optimization on dialysis; perioperative carbohydrate loading; goal-directed fluid management prior to reperfusion; fluid balance aiming for a net weight gain ≤3 kg in the first 24 hours; opiate avoidance; and use of regional wound infusers.

There were nine recipients from live donors, 23 from donation after brainstem death (DBD) donors and 7 from donation after circulatory death (DCD) donors.

Results: The mean weight gain on post-operative day 1 was 3.06 kg.

None of the live donor recipients had delayed graft function (DGF) requiring dialysis. DGF was observed in 28.6% of the DCD graft recipients and 39.1% of the DBD graft recipients.

Mean length of stay was 4.6 days.

Conclusion: Our experience challenges the widespread practice of fluid loading post renal transplant. Our re-admission rate has not increased and early results suggest that there are significant gains to be made via reduced length of hospital stay.


TRANSITION FROM LAPAROSCOPY TO RETROPERITONEOSCOPY FOR LIVE DONOR NEPHRECTOMY - A CASE CONTROL STUDY

NG Zi Qin1, REA Alethea2, and HE Bulang1,3

1WA Liver & Kidney Transplant Service, Sir Charles Gairdner Hospital, Perth, 2Centre for Applied Statistics, University of Western Australia, Perth, 3Department of Surgery, University of Western Australia, Perth

Aims: The transperitoneal approach (TLDN) to laparoscopic donor nephrectomy (LDN) is widely adopted in most centres. However, a systematic review has shown that the retroperitoneoscopic approach (RLDN) is associated with fewer complications, owing to the anatomical advantage of avoiding manipulation of the intraperitoneal organs. The aims of this study were to compare the outcomes of RLDN and TLDN in a case-control study and to analyse the learning curve for the transition from TLDN to RLDN.

Methods: A retrospective analysis of all LDNs from 2010 to October 2017 was performed. Data on demographics, peri-operative parameters, analgesia consumption, pain scores and kidney graft function were collected and analysed. A CUSUM analysis was performed to explore the learning curve of RLDN, setting the mean TLDN operative time as the target.
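The CUSUM learning-curve idea used above can be sketched in a few lines. This is an illustrative sketch, not the study's code: the target time and the operative times below are hypothetical (in minutes), and real CUSUM analyses often add boundary limits rather than a simple turning-point rule.

```python
# CUSUM learning curve: accumulate (observed time - target time) case by case.
# The curve rises while the surgeon is slower than the target and flattens or
# falls once proficiency is reached.

def cusum(observed_times, target):
    curve, total = [], 0.0
    for t in observed_times:
        total += t - target
        curve.append(total)
    return curve

target = 180.0  # hypothetical mean TLDN operative time used as the goal
rldn_times = [230, 220, 210, 190, 175, 170, 165]  # hypothetical RLDN cases
curve = cusum(rldn_times, target)

# First case at which the curve stops rising approximates the end of the
# learning curve.
turning_point = next(i for i in range(1, len(curve))
                     if curve[i] <= curve[i - 1])
print(turning_point + 1)  # 1-based case number → 5
```

In the study the same construction, applied to the real case series, placed the flattening point at roughly 30 cases.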

Results: All 122 donor nephrectomies (60 TLDN and 62 RLDN) were completed successfully with no conversion to open surgery. There were no blood transfusions, readmissions or deaths, and no post-operative complications graded above Clavien grade II. Kidney graft function was comparable in both groups. The follow-up period ranged from 4 to 90 months. The CUSUM analysis demonstrated that approximately 30 cases are required for a surgeon to become proficient in the transition from TLDN to RLDN.

Conclusions: RLDN is a safe approach with results comparable to TLDN. It avoids manipulation of the intraperitoneal organs and preserves a virgin abdomen, thereby reducing peri-operative complication risk. The learning curve for transitioning from TLDN to RLDN is acceptable.


FAVOURABLE CARDIAC REMODELING AND FUNCTIONAL CARDIAC BENEFITS ASSESSED WITH CARDIAC MAGNETIC RESONANCE IMAGING FOLLOWING LIGATION OF ARTERIOVENOUS FISTULA IN STABLE RENAL TRANSPLANT RECIPIENTS: A RANDOMIZED, CONTROLLED, OPEN LABEL STUDY

RAO Nitesh1,2, MCDONALD Stephen3, WORTHLEY Matthew4, and COATES Patrick Toby5

1Nephrology and Renal Transplant, Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 2Lyell McEwin Hospital, 3Department of Nephrology, Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 4Department of Cardiology, Royal Adelaide Hospital, 5Centre for Transplant and Renal Research, Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital

Aim: To study the change in left ventricular mass (LVM) following ligation of arteriovenous fistula (AVF) in stable renal transplant recipients (RTR), utilizing cardiac MRI (CMR).

Methods: In this randomized controlled trial, we recruited participants aged 18 years or older with stable kidney function twelve months post kidney transplantation and a functioning AVF, from a tertiary renal transplantation service network in Australia. Participants were randomly assigned (1:1) to AVF ligation or no intervention, with all participants undergoing a baseline CMR followed by a repeat scan six months later. The primary outcome was change in LVM at 6 months, analyzed according to intention-to-treat principles.

Results: We enrolled 93 participants, of whom 63 eligible participants underwent randomization; 54 of the 63 completed assessments after the second CMR. The mean LVM decreased by 22 g (14.7%) in the intervention group (151.2 ± 36.5 g vs 129.1 ± 32.4 g, p < 0.001), with no significant change in the non-intervention group (153.4 ± 47.8 g vs 154.6 ± 43.0 g, p = 0.69). Significant improvements were also noted in the end-diastolic, end-systolic and stroke volumes of both the left and right ventricles, as well as in atrial volumes. No significant complications were noted after AVF ligation.

Conclusion: In this randomized controlled trial of adults with a stable kidney transplant and a functioning AVF, elective ligation of the AVF was associated with a 14.7% decrease in LVM as assessed by CMR, together with improvements in other cardiac parameters.

FIGURE 1



PROPHYLACTIC DRAIN INSERTION IN RENAL TRANSPLANTATION: SURGEON PREFERENCE ACROSS AUSTRALIA AND NEW ZEALAND

MUGINO Miho1, LAM Susanna1, LAURENCE Yuen2, VERRAN Deborah1, ALLEN Richard3, PLEASS Henry3, and LAURENCE Jerome1

1General, Visceral and Transplant Surgery, Royal Prince Alfred Hospital, Sydney, 2General, Visceral and Transplant Surgery, Westmead Hospital, Sydney, 3General, Visceral and Transplant Surgery, University of Sydney

Aims: There are no guidelines on the use of a prophylactic drain at the conclusion of renal transplantation (RT) to prevent post-operative complications such as lymphoceles. We aimed to summarize practice amongst renal transplant surgeons across Australia and New Zealand (ANZ).

Methods: An online survey of surgeons who routinely perform RT across ANZ transplant centres collected respondents’ demographic information, surgical experience, preferences regarding prophylactic drain insertion and post-operative practice.

Results: Of 66 identified surgeons, 43 completed the survey. Of the respondents, 41.9% were general surgeons, with subspecialisation in transplantation (18.6%) or hepatobiliary surgery (18.6%); 37.2% were vascular surgeons, 13.9% were urologists and 7% were transplantation and dialysis access surgeons.

60.5% of surgeons reported inserting a perigraft drain routinely, whereas 20.9% seldom insert drains. The most common reason for drain insertion was “routine practice” (58.1%). 30.2% of respondents were uncertain about the benefit of drain use, whereas 48.8% felt that a drain reduced symptomatic peritransplant fluid collections.

44.2% of respondents consider both volume and time to be important factors in the decision to remove a drain, with less emphasis on fluid composition. The mean post-operative day of drain removal was day 4.56. Some surgeons (16.3%) test drain-fluid creatinine to exclude urine leak. 74.4% of surgeons would consider enrolling their patients in an RCT to determine the benefit of drain insertion.

Conclusion: There is a wide range of practices amongst RT surgeons. Individual surgeons’ experience appears to be the greatest factor in decision making.


AORTIC VERSUS DUAL PERFUSION FOR RETRIEVAL OF THE DBD LIVER – AN ANALYSIS OF RECIPIENT OUTCOMES USING THE ANZ LIVER TRANSPLANT REGISTRY

HAMEED Ahmer1,2, PANG Tony2,3, YOON Peter2, BALDERSON Glenda4, RONALD De Roo2, YUEN Lawrence2,3, LAURENCE Jerome5,3, LAM Vincent2,3, CRAWFORD Michael6, HAWTHORNE Wayne1,2, and PLEASS Henry2,3

1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Department of Surgery, Westmead Hospital, Sydney, 3School of Medicine, University of Sydney, 4Australia and New Zealand Liver Transplant Registry, Princess Alexandra Hospital, Brisbane, 5Institute for Academic Surgery, Royal Prince Alfred Hospital, Sydney, 6Department of Surgery, Royal Prince Alfred Hospital, Sydney

Aims: To compare the impact of aortic-only and dual (aorta and portal vein) perfusion on donation after brain death (DBD) liver transplantation outcomes.

Methods: DBD liver transplants performed in Australia (2007–16) were included in the analyses and stratified by perfusion route (aortic-only or dual). The ANZLTR, ANZOD and a national survey of senior donor surgeons were used to obtain all data points. Only livers preserved in University of Wisconsin solution were included; patients receiving a subsequent liver transplant, or a reduced-size graft, were excluded. Graft and patient survival were compared using Kaplan-Meier curves and Cox proportional hazards models. Causes of graft loss, including primary non-function, hepatic artery and portal vein thrombosis, biliary complications, and acute rejection, were compared using logistic regression.

Results: Aortic-only perfusion was utilized in 957 cases, compared with 425 dual-perfused livers. The dual-perfused group had lower mean cold ischaemia time, secondary warm ischaemic time and MELD score (p < 0.001). Actuarial 5-year graft and patient survival in the aortic-only versus dual-perfused cohorts was 80.1% vs 84.6% (p = 0.066) and 82.6% vs 87.8% (p = 0.026), respectively. After adjusting for confounders, neither graft (HR 0.81, 95% CI 0.60–1.11, p = 0.188) nor patient (HR 0.74, 95% CI 0.52–1.05, p = 0.087) survival differed significantly between the cohorts. There were no differences between the groups in causes of graft loss. Subgroup analyses comparing high-risk donors are being conducted.

Conclusions: The retrieval technique employed does not impact outcomes when all DBD donors are considered together.


USE OF AN ICE BAG TO MINIMIZE THE PERIOD OF SECOND WARM ISCHAEMIC TIME DURING KIDNEY & PANCREAS TRANSPLANTATION – OUR INITIAL EXPERIENCE

YOON Peter1, HAMEED Ahmer1,2, NGUYEN Hien1,3, GASPI Renan4, HAWTHORNE Wayne5, PLEASS Henry1, and YUEN Lawrence1

1Department of Surgery, Westmead Hospital, Sydney, 2Centre for Transplant and Renal Research, The Westmead Institute for Medical Research, 3Department of Urology and Kidney Transplant, Cho Ray Hospital, Vietnam, 4Department of Renal Medicine, Westmead Hospital, Sydney, 5Centre for Transplant and Renal Research, Other

Aims: To assess the safety and feasibility of performing anastomoses for kidney and/or pancreas transplantation after organ immersion in a bag of ice slush.

Methods: Kidneys alone (n = 4) and/or the kidney & pancreas (n = 1) were retrieved from deceased donors and transported to our center using standard cold static storage. After back-table preparation of the graft, each organ was immersed in ice slush within a sterile bowel bag. The bag was sealed superiorly, and a small perforation was made to allow vessel extrusion. During the anastomoses, ice slush was replenished as required; anastomoses were performed in a standard manner to the iliac vessels. The bag was removed prior to reperfusion. (A representative video will be shown during the presentation.)

Results: All transplants were completed safely, without any visual obstruction during the anastomoses. Although the mean anastomotic time was 49 ± 8 min for the kidneys and 32 min for the pancreas, the second warm ischaemic time for all organs was <1 minute. There were two cases of delayed graft function, both in DCD kidneys (KDPI 98 and 38); one-month creatinine in these recipients was 214 and 134 μmol/L, respectively. Both DBD kidney recipients (KDPI 69 and 74) had immediate graft function. The kidney/pancreas recipient also had immediate graft function, with a one-month creatinine of 60 μmol/L, and was off all insulin.

Conclusions: Kidney/pancreas placement in an ice bag is a convenient, simple, and non-obstructive means of minimizing the secondary warm ischaemic insult. A planned RCT will formally test its efficacy.

Figure



EVALUATION OF RISK FACTORS FOR ENTERIC LEAKS FOLLOWING SIMULTANEOUS PANCREAS AND KIDNEY TRANSPLANTATION

HORT Amy1, SHAHRESTANI Sara1, HITOS Kerry1, ROBERTSON Paul1, LAM Vincent1, YUEN Lawrence1, RYAN Brendan1, DE ROO Ronald2, HAWTHORNE Wayne J2,3,4, and PLEASS Henry1

1Westmead Hospital, Sydney, 2Department of Surgery, Westmead Hospital, Sydney, 3Discipline of Surgery, Sydney Medical School, University of Sydney, 4Centre for Transplant and Renal Research, Westmead Institute of Medical Research, Westmead Hospital, Sydney

Introduction: Simultaneous pancreas-kidney transplantation (SPK) is the gold-standard treatment for patients with type 1 diabetes mellitus and end-stage renal failure. Enteric drainage is used to manage the exocrine secretions; however, enteric leaks (ELs) are among its more specific and challenging complications. There remains a lack of published research on risk factors for ELs, particularly those associated with vascular disease.

Methods: SPK transplants performed at Westmead Hospital over ten years (2008–2017, n = 234) were analysed to identify ELs. Donor, recipient and transplantation procedure risk factors for ELs were collected and analysed. A multivariate logistic regression model, adjusting for possible confounders, was used to assess the risk and predictors of ELs.

Results and Discussion: Of the 234 patients, 12 (5%) experienced an EL. Of these recipients, 9 (75%) had vascular disease, 6 (50%) were ex-smokers, 1 (8%) was a current smoker and 3 (25%) were obese (BMI >30 kg/m2). The risk of EL was increased as much as 4.4-fold in recipients with vascular disease (OR 4.4; 95% CI 0.80–24.21; P = 0.088), although this did not reach statistical significance. Other factors, such as recipient BMI >24.2 kg/m2, increased the risk of EL by as much as 1.8-fold (OR 1.8; 95% CI 0.4–9.3; P = 0.46).

Conclusions: Our data suggest a possible association between vascular disease and ELs. These findings also identify other possible risk factors for ELs and highlight the need for further research in this area, including careful screening of recipients for vascular disease.


Xenotransplantation

GENETICALLY MODIFIED PORCINE NEONATAL ISLET XENOGRAFTS PROVIDE LONG-TERM FUNCTION IN BABOONS

HAWTHORNE Wayne1,2, CHEW YiVee3, BURNS Heather3, SALVARIS Evelyn4, HAWKES Joanne3, BRADY Jamie5, BARLOW Helen4, YI Shounan3, HU Min3, LEW Andrew5, O'CONNELL Philip3,6, NOTTLE Mark7, and COWAN Peter4,8

1Discipline of Surgery, Sydney Medical School, University of Sydney, 2Centre for Transplant and Renal Research, The Westmead Institute of Medical Research, 3Centre for Transplant and Renal Research, The Westmead Institute for Medical Research, 4Immunology Research Centre, St Vincent's Hospital, Melbourne, 5Walter and Eliza Hall Institute of Medical Research, Melbourne, 6Discipline of Medicine, University of Sydney, 7Department of Obstetrics and Gynaecology, University of Adelaide, 8Department of Medicine, University of Melbourne

Introduction: Alternative strategies such as xenotransplantation show great promise for providing the organs and tissues required to treat diseases such as type 1 diabetes.

Aims: To achieve long-term normoglycemia in diabetic baboons transplanted with neonatal pig islets, and to investigate the effect of ceasing immunosuppression.

Materials and Methods: Five diabetic baboons received transplants of neonatal islet cell clusters (NICC; 10,000–50,000 IEQ/kg) from GTKO/CD55-CD59-HT piglets. From day −3, recipients were treated with anti-CD2 induction and maintenance with oral tacrolimus, anti-CD154 and belatacept, which were progressively ceased. Graft survival and function were followed by daily blood sugar levels (BSL), IVGTT, OGTT and immunohistochemical analysis of liver biopsies taken at various time points.

Results: No baboon exhibited signs of thrombosis associated with IBMIR. Recipients developed normal fasting BSL and normal IVGTT and OGTT results, with porcine insulin and C-peptide secreted in response to glucose stimulus. All animals became normoglycaemic off all exogenous insulin. Liver biopsies revealed strong positive staining for insulin, glucagon and somatostatin in the xenografts. One recipient receiving 50,000 IEQ/kg was insulin-independent for >7 months, including 7 weeks after the last drug (belatacept) was ceased. A second recipient receiving 10,000 IEQ/kg remained insulin-independent for >18 months, including 6 months off all immunosuppression. The fourth and fifth animals continue to be followed, now past 6 months and 4 months post-transplant, respectively.

Conclusion: We have demonstrated for the first time long-term survival and function of porcine islets in baboons. The costimulation blockade-based immunosuppression permitted maturation of the islets, such that the dose required to achieve normoglycemia (10,000 IEQ/kg) is equivalent to that used in the clinical setting.


HUMAN HLA-DR+CD27+ MEMORY-TYPE REGULATORY T CELLS SHOW POTENT XENOANTIGEN-SPECIFIC SUPPRESSION IN VITRO

LU CAO1,2, MIN Hu1, DANDAN Huang1, XIAOQIAN Ma1,2, LEI Sun1, ELVIRA Jimenez-Vera3, HEATHER Burns1, YUANFEI Zhao1, WAYNE Hawthorne1, SHOUNAN Yi1, and PHILIP O'Connell1

1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Institute for Cell Transplantation and Gene Therapy, The 3rd Xiangya Hospital, Central South University, 3Centre for Transplant and Renal Research, Westmead Hospital, Sydney

Introduction: Strategies to modulate the xenograft rejection response whilst minimizing long-term immunosuppression need to be developed. We have previously shown that xenoantigen stimulation enhances the capacity of human Treg to suppress the xenogeneic response. However, whether xenoantigen-expanded Treg express specific cell surface markers that could be used to isolate a xenoantigen-specific Treg subset for effective Treg therapy remains to be determined.

Materials and Methods: Human CD4+CD25+CD127 Treg isolated from healthy donor PBMC were expanded for 3 weeks with anti-CD3/CD28 beads alone, or combined with irradiated porcine PBMC, to generate polyclonally stimulated (PlTreg) or xenoantigen-stimulated Treg (XnTreg), respectively. FACS was performed to identify candidate cell surface markers, and the corresponding xenoantigen-specific Treg subset was isolated from XnTreg by cell sorting. The sorted Treg subset was assessed for suppressive capacity in mixed lymphocyte reaction (MLR), using irradiated porcine PBMC as xenogeneic stimulator cells, human PBMC as responder cells and autologous XnTreg as suppressor cells.

Results: After 3 weeks of expansion, XnTreg exhibited substantially upregulated expression of HLA-DR and CD27, with a larger proportion being HLA-DR+CD27+. The HLA-DR+CD27+ subset of XnTreg demonstrated significantly enhanced suppression of proliferating xenoreactive responder cells at Treg:responder ratios of 1:4 through 1:64 when compared with HLA-DR+CD27+ cell-depleted XnTreg, and at ratios of 1:32 and 1:64 when compared with unsorted XnTreg.

Conclusion: Our data suggest that human HLA-DR+CD27+ memory-type Treg are xenoantigen-specific and have potential as an effective immunotherapy in xenotransplantation.


ENCAPSULATED PIG CELLS SECRETING ANTI-HUCD2 ANTIBODY REDUCE THE NUMBER OF HUMAN CD2 CELLS LOCALLY BUT NOT SYSTEMICALLY IN HUMANIZED MICE

LOUDOVARIS T1, COWAN P2, HAWTHORNE W3, SALVARIS E2, FISICARO N2, CATTERALL T1, KOS C4, MARIANA L1, LEW A5, and KAY T1

1Immunology & Diabetes, St Vincent's Institute, Melbourne, 2Centre for Immunology, St Vincent's Hospital, Melbourne, 3Islet Transplantation Facility, Westmead Millennium Institute, Westmead Hospital, Sydney, 4Immunology & Diabetes, St Vincent's Hospital, Melbourne, 5Department of Immunology, Walter and Eliza Hall Institute of Medical Research, Melbourne

Background: The TheraCyte™ Implantable System, with outer membranes that induce the development of vasculature, was developed to encapsulate and protect cells secreting insulin or other proteins in which the patient is deficient. The implant system has been shown to be biocompatible and protective of allogeneic tissues in animal and human trials. However, immune protection of xenogeneic tissues (a potentially unlimited source of therapeutic tissue) has so far been unsuccessful, as the intensity of the surrounding inflammatory response suffocates the encapsulated cells.

Method: To mollify or eliminate this local response, the pig kidney cell line PK1 was genetically engineered (pCIneo_CD2_GFP+) to secrete a monoclonal antibody to human CD2, which inhibits and depletes T cells. PK1 cells transfected with vector alone (pCIneo_GFP+) were used as controls. Encapsulated PK1_pCIneo_CD2_GFP+ or PK1_pCIneo_GFP+ cells, co-encapsulated with porcine neonatal islet cell clusters (NICCs), were implanted at two sites into immunodeficient NSG mice, which were then reconstituted with human PBMCs to generate a human anti-pig response.

Results: There was a statistically significant decrease in the number of human T cells around devices containing anti-CD2-secreting cells compared with those containing control pig cells. This occurred at both sites, while the number of huCD2 cells in the spleen was similar in all mice. We could not determine whether protection was improved, as all encapsulated xeno-cells, including the NICCs, survived.

Conclusion: Although the xeno-protective properties of anti-CD2 could not be demonstrated, the local impact of secreted factors is a promising result that warrants further investigation.

Figure


© 2018 The Authors. Published by Wolters Kluwer Health, Inc.