INTRODUCTION OF SHARE 35 INTERREGIONAL ALLOCATION FOR HIGH MELD LIVER TRANSPLANT WAITING LIST PATIENTS IN AUSTRALIA AND NEW ZEALAND
FINK Michael1,2, GOW Paul2, BALDERSON Glenda3, and JONES Robert2,1
1Department of Surgery, University of Melbourne, 2Liver Transplant Unit Victoria, Austin Hospital, Melbourne, 3Australia and New Zealand Liver Transplant Registry
Aims: Patients with high MELD scores awaiting liver transplantation have a high risk of waiting list mortality and a short window of opportunity for rescue. A voluntary trial of sharing livers between units in Australia and New Zealand for patients with a MELD score ≥ 35 (Share 35) was undertaken. The aim of this study was to assess the impact of the trial on waiting list mortality.
Methods: The waiting list mortality rate of patients whose MELD score reached 35 prior to commencement of the Share 35 trial ("Share 35 candidates") was compared with that of patients listed as Share 35 patients during the period of the trial ("Share 35 listed") using the chi-square test. Post-transplant survival of the two groups was compared using Kaplan-Meier curves with the log-rank test.
Results: During the 21-month period of the trial, 24 patients were Share 35 listed, of whom 13 were transplanted with a shipped liver, eight were transplanted with a local donor liver and three died waiting. The waiting list mortality rate of Share 35 listed patients (3 of 24, 13%) was significantly less than that of Share 35 candidates (13 of 27, 48%, P = 0.006). Post-transplant survival was not significantly different between the groups (P = 0.420).
Conclusions: Introduction of Share 35 to Australia and New Zealand has resulted in improved access to liver transplantation for a group of patients that previously were at high risk of waiting list death without adversely affecting utility.
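As a quick check on the headline comparison, the reported P value can be reproduced from the counts given in the Results (3/24 vs 13/27 waiting list deaths). The sketch below uses only the Python standard library and assumes, as the reported P suggests, a Pearson chi-square test without continuity correction:

```python
import math

# 2x2 table from the abstract:
# rows: Share 35 listed vs Share 35 candidates; cols: died waiting vs survived
obs = [[3, 21], [13, 14]]

row = [sum(r) for r in obs]        # 24, 27
col = [sum(c) for c in zip(*obs)]  # 16, 35
n = sum(row)                       # 51

# Pearson chi-square statistic without continuity correction
chi2 = sum((obs[i][j] - row[i] * col[j] / n) ** 2 / (row[i] * col[j] / n)
           for i in range(2) for j in range(2))

# p-value for 1 degree of freedom: P(X > chi2) = erfc(sqrt(chi2 / 2))
p = math.erfc(math.sqrt(chi2 / 2))
print(round(chi2, 2), round(p, 3))  # -> 7.5 0.006
```

The uncorrected test matches the reported P = 0.006; with a Yates continuity correction the P value would be larger (about 0.015).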
RISK INDICES IN DECEASED DONOR ORGAN ALLOCATION FOR TRANSPLANTATION: REVIEW FROM AN AUSTRALIAN PERSPECTIVE
LING Jonathan1,2, FINK Michael3, WESTALL Glen4, MACDONALD Peter5, CLAYTON Philip6, OPDAM Helen7, HOLDSWORTH Rhonda8, POLKINGHORNE Kevan1, and KANELLIS John1
1Department of Nephrology, Monash Medical Centre, Melbourne, 2School of Medicine, Faculty of Health Sciences, Monash University, Melbourne, 3General and Hepato-Pancreato-Biliary Surgery, Austin Hospital, Melbourne, 4Alfred Hospital, Melbourne, 5St Vincent's Hospital, Sydney, 6Department of Nephrology, Royal Adelaide Hospital, 7Organ and Tissue Authority, 8National Laboratory Manager, Australian Red Cross Blood Service
Recently, organ donation and transplantation rates have increased both worldwide and in Australia. Concurrently, the Australian software used for donor and recipient data management (NOMS) is being rebuilt as OrganMatch, and as a consequence organ allocation processes are being reviewed. Worthwhile capabilities of the new software would include the ability to use risk indices to guide organ allocation and help streamline transplantation decisions. Risk indices comprising donor, recipient and transplant factors play an important role in organ allocation policies worldwide by assimilating pertinent data to help guide transplant clinicians.
Aims: To identify risk indices in use worldwide and contrast their use abroad with current Australian organ allocation policies.
Methods and Results: We reviewed risk indices used in organ allocation policies worldwide for kidney, liver, heart, lung and pancreas organs and their predictive capacity for post-transplant outcomes. We collated the Australian organ allocation policies for these organs and have noted the use of similar risk indices where available. Significant donor, recipient and transplant factors used in the scores were summarised.
Conclusions: Risk indices, when used together with other clinical information, can assist the organ allocation process; they should not be used in isolation to make decisions regarding transplantation. Very few risk indices are part of the current Australian organ allocation process. Any risk index derived abroad needs to be validated in an Australian cohort before use. Modifying or adding variables in a risk index might provide an easier way to update organ allocation policies in the future.
WHAT HAPPENED WHEN THE ‘SOFT OPT-OUT’ TO ORGAN DONATION WAS IMPLEMENTED IN WALES? FAMILY AND PROFESSIONAL VIEWS AND EXPERIENCES, AND CONSENT RATES FOR THE FIRST 18 MONTHS.
NOYES Jane1,2,3, MC LAUGHLIN Leah1,2, MORGAN Karen4, WALTON Phil5, ROBERTS Abigail6, and STEPHENS Michael7
1School of Social Sciences, Bangor University, 2Wales Kidney Research Unit, Bangor University, 3National Centre for Population Health and Well being Research, 4Major Health Conditions Policy Team, Welsh Government, 5Department of Organ Donation, NHS Blood and Transplant, 6North West Regional Office, Liverpool, UK, NHS Blood and Transplant, 7Department of Nephrology and Transplantation, Cardiff and Vale University Health Board, University Hospital of Wales, Cardiff, UK.
Introduction: On 01.12.15 Wales introduced a 'soft opt-out' system of organ donation.
Methods: Co-productive, mixed-methods study in partnership with National Health Service Blood and Transplant (NHSBT) and patient and public representatives. Data were collected on all 211 approaches between 01.12.15 and 31.05.17; 182/211 deceased patients came under the Act. In-depth data (62 interviews with 85 family members, plus questionnaires) were collected on 60 patients who were potential/actual organ donors, and 2 focus group/individual interviews were held with 19 NHSBT professionals (Figure 1). Organ Donor Register (ODR) activity was monitored.
Results: Welsh consent rates increased by around 10% to 61%, and to 64% when family consent was removed; this was higher than in England and reversed an unexplained drop to 48.5% before implementation. However, family members still overrode the patient's organ donation decision in 31/205 cases. Consent was deemed in 46/205 cases, with a consent rate of 61%. The Act provided a useful framework, but family members did not fully understand deemed consent. Negative personal views of organ donation and health-system issues affected support for organ donation. The media campaign missed the changed role of the family: that they are no longer the decision maker about organ donation. ODR 'opt-outs' were 6%, less than anticipated.
Discussion: The media campaign mostly worked but was not memorable and had gaps. More work is needed to inform families about their changed role. Scotland and England are now consulting on a move to an 'opt-out' system. As a result of this study, the Welsh Government commissioned a new campaign, launched 01.11.17.
PROMOTING DECEASED DONOR ORGAN TRANSPLANTATION IN VIETNAM: WHERE TO START?
ALLEN Richard1,2, PLEASS Henry3,4, KABLE Kathy5, ROBERTSON Paul6, MACKIE Fiona7, THOMAS Gordon8, SINH Tran Ngoc9, PHAM GIA Khanh10, and TRUONG Nguyen11
1Westmead Clinical School, University of Sydney, 2Transplantation Services, Royal Prince Alfred Hospital, Sydney, 3Department of Surgery, University of Sydney, 4National Pancreas Transplant Unit, Westmead Hospital, Sydney, 5Renal & Transplantation Unit, Westmead Hospital, Sydney, 6Renal Transplant Unit, Westmead Hospital, Sydney, 7Department of Pediatrics, Prince of Wales Hospital, Sydney, 8Department of Surgery, Sydney Children's Hospital, 9Renal Transplant Unit, Cho Ray Hospital, 10Hanoi Medical University, Vietnam, 11Transplantation Services, Cho Ray Hospital, HCMC, Vietnam
Aim: Vietnam is a developing country of 93 million people with a central government and wide disparities in wealth, education and healthcare. We review organ transplant activity and describe challenges for deceased organ donation (DD).
Materials and Methods: The national registry relies on self-reporting from 17 kidney, 3 liver and 3 heart units.
Results: 2,249 living donor (LD) transplants, 174 DBD transplants and 3 DCD transplants have been reported. No child has received a deceased adult organ. Only 3 kidney centres presented data at the 2017 Vietnamese Society of Transplantation (VSOT) meeting, reporting unrelated kidney LD activity of 6.3% (Figure), 71.4% and 85.7% respectively, with the latter two relying on police determination that unrelated donors were not rewarded. Barriers to DD exist despite DD legislation and >12,000 annual head injury deaths. Brain death diagnosis is complex. Family consent for DD is impeded by immense clinical pressures and limited resources. Requests are cursory, without consensus for organ allocation. Results are not published. Wait-listing with stored sera and an environment of trust between ICU and transplant surgeons do not exist. Transplant training from Europe and SE Asia is based on surgical skills for elective procedures. Careers for transplant physicians and nurses in recipient preparation and long-term care are limited.
Conclusions: Potential exists to improve DD activity with simultaneous 1. Cost-effective local resourcing of ICUs; 2. Transparent allocation guidelines and waiting-list criteria led by VSOT (±TSANZ); and, 3. Co-ordinated international hospital partnerships. Subsequent growth of heart, liver and paediatric transplantation will enhance community and donation sector appreciation of DD.
THE WEEKEND EFFECT: AN AUSTRALIAN COHORT STUDY ANALYSING TEMPORAL TRENDS IN SOLID ORGAN DONATION
WEBSTER Angela1,2, HEDLEY James1, CHANG Nicholas1, ROSALES Brenda1, WYBURN Kate3,4, KELLY Patrick1, O'LEARY Michael5, and CAVAZZONI Elena5
1School of Public Health, University of Sydney, 2Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 3School of Medicine, University of Sydney, 4Renal Unit, Royal Prince Alfred Hospital, Sydney, 5DonateLife
Introduction: A US study suggested donation rates were poorer on weekends. We investigated the effect of day-of-week of donor referral on organ donation in Australia.
Methods: In Australia, potential donor referrals are made to a state-based donation service, which then simultaneously seeks family consent for donation and assesses the medical suitability of the referral. Organ retrieval occurs when utilisation is almost certain, hence discard rates are extremely low. We retrospectively reviewed all New South Wales referral logs from 2010 to 2016. Our outcomes were actual donation (retrieval), family consent, and medical suitability. We used logistic regression with random effects to adjust for clustering by referral hospital. We used mortality data from the Australian Institute of Health and Welfare to compare donation referrals to background mortality rates by day of the week, for all-cause mortality and motor vehicle accident (MVA) deaths.
Results: Of 3,383 referrals (potential donors), 692 (20%) became actual donors. We found no evidence of reduced donation (adjusted OR 1.15; 95% CI 0.93-1.42; p=0.2), consent (adjusted OR 1.07; 95% CI 0.85-1.35; p=0.6), or medical suitability (adjusted OR 1.15; 95% CI 0.96-1.39; p=0.1) among weekend referrals. The rate of donor referral relative to background mortality was lower on weekends than on weekdays for both all-cause mortality (p<0.001) and MVA mortality (p=0.03).
Conclusion: There was no association between day-of-the-week or weekend referrals and actual donation, family consent, or medical suitability. There was some indirect evidence that donor referrals may be more selective at weekends. These results contrast findings from the USA.
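The adjusted odds ratios above come from a random-effects logistic model; the basic unadjusted calculation behind such an estimate is a 2x2 odds ratio with a Wald confidence interval. A minimal stdlib sketch, using invented weekend/weekday counts (the abstract reports only adjusted estimates, not the raw split):

```python
import math

# Hypothetical counts for illustration only - not the study data
weekend_donors, weekend_total = 190, 900
weekday_donors, weekday_total = 502, 2483

a, b = weekend_donors, weekend_total - weekend_donors  # weekend: donor / not
c, d = weekday_donors, weekday_total - weekday_donors  # weekday: donor / not

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)           # Wald SE of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A confidence interval spanning 1, as here, corresponds to the study's "no evidence of reduced donation" finding; the published model additionally adjusts for hospital-level clustering.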
OVERCOMING BARRIERS FOR INDIGENOUS AUSTRALIANS GAINING ACCESS TO THE KIDNEY TRANSPLANT WAITING LIST
ATKINSON Amy, FORD Sharon, GOCK Hilton, IERINO Frank, and GOODMAN David
Department of Nephrology, St Vincent's Hospital, Melbourne
Aims: Aboriginal patients constitute 10% of the Australian dialysis population, but few ever receive a kidney transplant. We studied our dialysis and transplant population to identify the main barriers to Victorian patients being listed for transplantation.
Methods: All Aboriginal patients on dialysis (n=12) or with a kidney transplant (n=7) were included in the study. Information was derived from hospital records and interviews by the study nurse.
Results: Twelve of 304 current dialysis patients (3.9%) were Aboriginal: mean age 59 (range 39-80), 6 male and 6 female, 6 living in Melbourne and 6 in country Victoria, with a mean dialysis duration of 4.8 years (range 1-11 years). Only 1 had previously been on the active list. Ten of 12 had diabetes mellitus, 5 ischaemic heart disease, 4 were ex-IVDU, 2 had mental illness, 2 BMI>35, 1 foot ulceration, 1 osteomyelitis, 1 bacterial endocarditis, 1 recurrent pneumonia and 1 recent colon cancer. Only 1 patient, a smoker/drug user, regularly missed dialysis. One patient has declined transplant work-up and another had previously done so. Seven of 265 patients receiving a kidney transplant over the past 10 years (2.6%) were Aboriginal; they waited on average 5 years from dialysis commencement to transplantation.
Conclusion: Medical co-morbidities, including heart disease and infection, and psycho-social issues are the main barriers to transplant listing. Concerted efforts to manage medical issues, involving a multidisciplinary team of transplant physicians and nurses, GPs, Aboriginal liaison officers and social workers, may allow more Aboriginal patients to be listed for transplantation. Once listed, the current organ matching system appears to provide equal access to kidneys for all Australians.
EXPLORING THE IMPACT OF RECIPIENT AGE WITH KIDNEY DONOR RISK INDEX AND ESTIMATED GLOMERULAR FILTRATION RATE AT ONE YEAR FOLLOWING KIDNEY TRANSPLANT
CHAN Samuel1,2,3, CHATFIELD Mark4, and BABOOLAL Keshwar1,2
1Department of Nephrology, Royal Brisbane Hospital, 2Department of Medicine, University of Queensland, Brisbane, 3ANZDATA, 4Statistics Unit, QIMR Berghofer Medical Research Institute
Background: Various kidney donor risk indices (KDRI) have been developed to predict graft survival using various combinations of donor and recipient characteristics. The aims of this study were to:
1. Explore relationships between Rao KDRI and recipient estimated glomerular filtration rate (eGFR) at 1yr
2. Examine the impact of recipient age on eGFR at 1yr
Methods: A retrospective analysis of deceased donor and recipient data from the Australian, New Zealand, UK and USA organ donor registries was conducted for 2000-2015. KDRI and recipient age were categorised into four and six groups, respectively. Median eGFR was calculated for each of the 24 combinations of age and KDRI.
Results: Overall, there were 6,512 Australian, 851 New Zealand, 21,077 UK and 157,664 USA recipients (median age 52yrs, [IQR 41-61]). Recipients aged <30yrs receiving a good-quality kidney (KDRI<1.0) achieved a higher median eGFR at 1yr (87.4ml/min/1.73m2) compared with other age groups (median eGFR range: 56.0-63.3ml/min/1.73m2). Recipients aged <30yrs receiving an average-quality kidney (KDRI 1.0-1.5) yielded a better median eGFR of 67.1ml/min/1.73m2 compared with other age groups (median range: 47.8-52.6 ml/min/1.73m2). Recipients aged <30yrs receiving a marginal-quality kidney (KDRI 1.5-2.0) achieved a median eGFR of 53.7ml/min/1.73m2 compared with other age groups (median range: 39.2-44.4ml/min/1.73m2). Recipients aged <30yrs receiving a poor-quality kidney (KDRI>2.0) yielded a better median eGFR of 45.3ml/min/1.73m2 compared with other ages (range: 35.1-36.2 ml/min/1.73m2).
Conclusions: As KDRI increases, eGFR decreases. Recipients aged <30yrs achieved a substantially higher eGFR at 1yr, independent of donor quality. Only small differences in median eGFR were seen between other age groups.
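The 24-cell age-by-KDRI grid of median eGFR described in the Methods can be reproduced with plain dictionary grouping. The sketch below uses synthetic data (the band labels and effect sizes are invented for illustration; the study used registry records):

```python
import random
from collections import defaultdict
from statistics import median

random.seed(1)
# Invented band labels and effect sizes, for illustration only
kdri_bands = ["<1.0", "1.0-1.5", "1.5-2.0", ">2.0"]
age_bands = ["<30", "30-39", "40-49", "50-59", "60-69", ">=70"]

egfr = defaultdict(list)
for _ in range(5000):
    age = random.choice(age_bands)
    kdri = random.choice(kdri_bands)
    base = 70 if age == "<30" else 55        # young recipients fare better
    penalty = 10 * kdri_bands.index(kdri)    # higher KDRI -> lower eGFR
    egfr[(age, kdri)].append(random.gauss(base - penalty, 8))

# One median per age x KDRI combination (24 cells)
for (age, kdri), values in sorted(egfr.items()):
    print(f"age {age:>5}, KDRI {kdri:>7}: median eGFR {median(values):5.1f}")
```

The synthetic parameters mimic the abstract's pattern: eGFR falls as KDRI rises, with the youngest band sitting above all others at every KDRI level.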
NORMOTHERMIC MACHINE PERFUSION OF NON-UTILIZED HUMAN KIDNEYS – OUR FIRST TWO CASES
HAMEED Ahmer1,2, ROGERS Natasha1,3, DE ROO Ronald2, LU Bo1, ROBERTSON Paul3, ZHANG Chris3, GASPI Renan3, MIRAZIZ Ray4, NGUYEN Hien2, YUEN Lawrence2,5, ALLEN Richard2,1,5, HAWTHORNE Wayne1,2, and PLEASS Henry2,5
1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Department of Surgery, Westmead Hospital, Sydney, 3Department of Renal Medicine, Westmead Hospital, Sydney, 4Department of Anaesthetics, Westmead Hospital, Sydney, 5School of Medicine, University of Sydney
Aims: Normothermic machine perfusion (NMP) is an emerging modality that may improve graft function of higher KDPI kidneys and/or reduce kidney discard rates. We aimed to test this modality in a series of human kidneys deemed unsuitable for transplantation.
Methods: The first kidney was from a 74-year-old male donor proceeding down the DBD pathway (KDPI 96%); both kidneys were not utilized due to suspected intra-abdominal malignancy noted during the donation procedure. The second kidney was from a 71-year-old female donor proceeding down the DCD pathway (KDPI 89%); the right kidney was deemed unsuitable for transplantation due to very poor/patchy perfusion. Both kidneys were transported to our centre in standard cold preservation solution for subsequent NMP. Perfusion parameters, renal function, and histology (haematoxylin & eosin [H&E] sections) were compared over the perfusion period.
Results: NMP was performed for 3 hours using the left kidney from the first donor; the second donor's kidney underwent NMP for 2 hours. Both kidneys displayed a significant improvement in perfusion parameters over time, with a drop in intra-renal resistance and an increase in renal blood flow. The grossly non-perfused regions in the DCD kidney were no longer evident. Both kidneys produced urine (2 ml/hr, donor 1; 22 ml/hr, donor 2). Creatinine clearance and tubular function (FeNa) improved over time, especially in donor 2. Sequential histology revealed no significant deterioration in renal tubular and glomerular cyto-architecture after 2-3 hrs of NMP.
Conclusions: NMP is a promising modality that has the potential to resuscitate grafts and thereby maximize kidney utilization.
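FeNa, the tubular-function measure tracked during perfusion, is the standard fractional excretion of sodium: FeNa(%) = 100 x (urine Na x plasma Cr) / (plasma Na x urine Cr). A minimal sketch with illustrative values (not the case data from this report):

```python
def fena_percent(urine_na, plasma_na, urine_cr, plasma_cr):
    """Fractional excretion of sodium (%). Units cancel, so any
    consistent units work for each sodium and creatinine pair."""
    return 100 * (urine_na * plasma_cr) / (plasma_na * urine_cr)

# Illustrative values only (Na in mmol/L, creatinine in umol/L)
print(round(fena_percent(40, 140, 8000, 90), 2))  # -> 0.32
```

A value below 1% is conventionally read as sodium-avid (pre-renal-pattern) tubular function; during NMP the trend over time matters more than any single value.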
CHALLENGES TO PROCEEDING TO ORGAN DONATION IN THE NORTHERN TERRITORY
MCAULIFFE Kathryn, WOOD Lee, and JONES Sarah
DonateLife NT, DonateLife
Aims: Organ donation in the Northern Territory (NT) remains infrequent despite extensive efforts to improve community education and awareness. We aimed to examine the challenges to donation proceeding.
Methods: All referrals to the DonateLife NT agency between January 2014 and December 2017 were reviewed. Referral numbers increased year on year. We looked at consent rates, ethnic group, registration on the Australian Organ Donor Register (AODR) and reasons for referrals not proceeding.
Results: There were 142 referrals over the four-year period. The mean age was 48.6 years (range 2 months to 82 years). Sixty-three referrals (44.3%) were Indigenous, 54 (38%) were Caucasian and 25 (17.6%) were from another culturally and linguistically diverse (CALD) background. Of the 142 referrals, only 55 (38.7%) proceeded to the Family Donation Conversation (FDC). Consent for organ donation was obtained in 25 cases (45%), 20 of whom became organ donors; there were 5 intended donors. Only 14.5% (8) of referrals that proceeded to FDC were registered on the AODR. Of the 87 referrals that did not proceed to FDC, 45 (51.7%) were deemed either medically unsuitable or medically unsupportable. Diabetes, hypertension and hazardous alcohol use were common comorbidities amongst medically unsuitable patients.
Conclusions: Organ donation poses many challenges within the NT which require ongoing attention. Although patients are young, medical suitability issues often prevent conversations about organ donation from taking place. Registration rates on the AODR are also low.
IMPACT OF A DEDICATED LIVING DONOR CLINIC AND ASSESSMENT TEAM: A SINGLE CENTRE EXPERIENCE
SANDIFORD Megan1, COOK Natasha1, WHITLAM John1, CHAN Yee2, KAUSMAN Joshua3, IERINO Frank4, and LEE Darren3,4,1,5
1Department of Nephrology, Austin Hospital, Melbourne, 2Department of Urology, Austin Hospital, Melbourne, 3Department of Nephrology, Royal Children's Hospital, Melbourne, 4Department of Nephrology, St Vincent's Hospital, Melbourne, 5Eastern Health, Melbourne
Aims: International guidelines recommend independent assessment of living kidney donor candidates (LKDC) by nephrologists not involved in the care or evaluation of the intended recipients. Whether this approach might have a negative impact on the determination of suitability and timely living donor (LD) transplant is unclear. We examined the efficiency of LKDC assessment before and after the establishment of a dedicated LD clinic staffed by a LD coordinator and two nephrologists.
Methods: We retrospectively compared the number of renal clinic appointments attended by LKDC to determine medical suitability, and the proportion of pre-emptive LD transplants, during the pre-LD clinic era (January 2006 to October 2009) and the post-LD clinic era (November 2009 to October 2017) at Austin Health. LKDC with part of their assessment performed elsewhere were excluded.
Results: In the post-LD clinic era, fewer clinic appointments were required to determine suitability for both accepted and declined LKDC (Table). For accepted LKDC, a further reduction in clinic appointments was observed in the second versus first half of the post-LD clinic era (median 2 (IQR 2-3) vs 4 (3-4.75); P<0.001), suggesting a learning curve. An increase in the pre-emptive LD transplant rate also occurred (46.7% vs 14.3%). The likelihood of LKDC being accepted and the LD transplant rate did not significantly change over the two eras.
Conclusions: Our experience suggests that a LD clinic staffed by a dedicated assessment team improves the efficiency of LKDC assessment and facilitates pre-emptive LD transplantation, allowing the development of expertise and quality improvement without altering the acceptance threshold for medical suitability.
CLINICIANS' ATTITUDES AND PERSPECTIVES ON THE ACCEPTABILITY OF ANTE-MORTEM INTERVENTIONS: AN INTERNATIONAL SEMI-STRUCTURED INTERVIEW STUDY
SHAHRESTANI Sara1,2, HAWTHORNE Wayne1,3, PLEASS Henry3, WONG Germaine4, and TONG Allison5
1Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 2School of Medicine, University of Sydney, 3Department of Surgery, Westmead Hospital, Sydney, 4Department of Renal Medicine, Westmead Hospital, Sydney, 5Centre for Kidney Research, The Children's Hospital at Westmead, Sydney
Background: The use of ante-mortem interventions in transplantation remains contentious due to ethical concerns and potential for harm to donors. There is variability in the acceptance and use of ante-mortem interventions across individual centers.
Methods: We conducted semi-structured interviews with 42 clinicians (transplant physicians, surgeons, ICU physicians, and donation specialist nurses), purposively sampled from eight countries: Australia, Italy, Japan, Korea, the United Kingdom, the United States, New Zealand, and Vietnam. We used thematic analysis to analyse the data.
Results: Four themes were identified: respecting the donor family's experience of grief; optimising 'the gift' as a duty to the donor; ambiguity in operationalising 'informed' consent; and fears of harming the donor. Participants feared burdening the grieving family with organ donation in often-traumatic circumstances, and the donation specialist role was seen as necessary for sensitive discussion of wishes. Clinicians felt a tension between their duty to enact donor wishes, to protect donors as 'patients in their own right,' and to prevent 'unsuccessful' transplantation. Complete dissemination of information for consent in a time-pressured and emotionally charged context was described as unrealistic; instead, the legal concept of 'authorisation,' with less onus on information, was raised. The principle of 'first do no harm' applied to the potential harms of interventions and to adhering to the donor's wishes.
Conclusions: Respect for the rights and wishes of donors, minimisation of harm and optimisation of 'the gift' were paramount to clinicians. Clarity around what constitutes 'benefit' and 'harm,' along with informed discussion with families, will help clinicians resolve tensions regarding the acceptability of interventions in the donation process.
EXTENDED CRITERIA DONATION UNDER EXTENDED CRITERIA CIRCUMSTANCES
THOMPSON Sophie1, PILCHER David2, and IHLE Joshua3
1DonateLife Victoria, Alfred Hospital, Melbourne, 2DonateLife Victoria, Alfred Hospital, Melbourne, 3Intensive Care Unit, Alfred Hospital, Melbourne
Introduction: In Donation after Circulatory Death (DCD), withdrawal of cardiorespiratory support (WCRS) usually begins with extubation. Death is most commonly determined after a period of 5 minutes’ observation of a non-pulsatile invasive arterial pressure trace. We present a case of a patient who wished to be a donor but had neither invasive arterial monitoring, nor mechanical ventilation.
Case presentation: A 61-year-old woman was admitted to the Intensive Care Unit for management of respiratory failure following a lung transplant. After a prolonged hospital stay, the patient's care transitioned to end-of-life treatment. Consent for organ donation was obtained from the senior available next of kin and a rapid retrieval DCD protocol was activated. The patient was receiving oxygen via High Flow Nasal Prongs (HFNP) and was monitored only via a pulse oximeter. Twenty-three minutes after removal of HFNP, oxygen saturation was no longer recordable; 31 minutes later the treating intensivist examined the patient to confirm death. Time from removal of HFNP to cold perfusion of the organs was 54 minutes. Both kidneys were subsequently transplanted into recipients, who are recovering well.
Discussion: This was a unique situation in which an extended criteria donor was able to donate organs because she was registered and her family was aware of, and supportive of, her wishes, in a situation where donation would not usually be considered. Dependence on HFNP enabled death to occur in a timely fashion after WCRS, resulting in kidney donation via a rapid retrieval DCD pathway.
ALLOCATION OF DECEASED DONOR KIDNEYS IN CLINICAL PRACTICE: MATCHING GRAFT LIFE-YEARS AND RECIPIENT LIFE EXPECTANCY
YONG Bryan1,2,3, IERINO Frank2, PAIZIS Kathy1, and POWER David1
1Renal & Transplantation Unit, Austin Hospital, Melbourne, 2Renal & Transplantation Unit, St Vincent's Hospital, Melbourne, 3School of Medicine, University of Melbourne
Introduction: The current allocation of deceased donor kidneys in the Australian National Organ Matching System (NOMS) does not match expected graft life-years to patient life expectancy. Such longevity mismatches between donor grafts and recipients may result in loss of potential graft life-years, and longer graft survival is known to confer improved recipient life expectancy. Evaluating the matching of graft and recipient longevity under the current NOMS allocation algorithm is an essential step towards the formal introduction of donor-recipient matching.
Aims: To determine the correlation between deceased donor kidney graft longevity and recipient life expectancy.
Methods: Adult deceased donor kidney transplants (n = 125) from a single centre from 2011 to 2015 were examined retrospectively. Two validated clinical calculators, the Australian Kidney Donor Profile Index (KDPI) and Expected Post-Transplant Survival (EPTS) score, were used to predict graft and patient survival, respectively. Data for the KDPI and EPTS parameters were collected from clinical notes and transplant registry databases. The relationship between KDPI and EPTS scores was then assessed using the Spearman rank correlation.
Results: KDPI and EPTS were poorly correlated, with a Spearman rank correlation of 0.179 (95% CI 0.006-0.361, p = 0.046).
Conclusion: Current practices do not optimise the matching of organ expected graft life-years to recipient life expectancy, leading to loss of potential graft life-years. This supports the need to improve longevity matching of the donated kidney to the estimated life expectancy of the potential recipient.
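Spearman's rank correlation, the statistic used above, is the Pearson correlation computed on ranks. A self-contained stdlib sketch on hypothetical KDPI/EPTS percentile pairs (invented values, not the study data):

```python
from statistics import mean

def ranks(xs):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation of the rank vectors of x and y."""
    rx, ry = ranks(x), ranks(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical donor KDPI / recipient EPTS percentile pairs
kdpi = [20, 85, 40, 60, 95, 10, 70, 30, 55, 80]
epts = [15, 60, 50, 30, 90, 25, 40, 70, 45, 65]
print(round(spearman(kdpi, epts), 3))
```

A coefficient near 0, as the study found (0.179), indicates that low-KDPI (longer-lived) grafts are not preferentially reaching low-EPTS (longer-lived) recipients.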
THE ROLE OF THE CD73/A2A SIGNALLING AXIS IN A HUMANISED MOUSE MODEL OF GRAFT-VERSUS-HOST DISEASE
GERAGHTY Nicholas1,2,3, ADHIKARY Sam1,2,4, SLUYTER Ronald1,2,3, and WATSON Debbie1,2,3
1School of Biological Sciences, University of Wollongong, 2Centre for Medical and Molecular Biosciences, University of Wollongong, 3Illawarra Health and Medical Research Institute, University of Wollongong, 4Illawarra Health and Medical Research Institute
Graft-versus-host disease (GVHD) is a complication that occurs in approximately 50% of bone marrow transplantations, due to donor leukocytes (predominantly T cells) in the graft mounting an immune response against the patient (host). Extracellular adenosine, generated by the ecto-enzyme CD73, activates A2A to limit T cell responses. CD73 or A2A blockade worsens disease, while A2A activation reduces disease in allogeneic mouse models of GVHD.
Aim: The current study aimed to investigate the role of the CD73/A2A signalling axis in a humanised mouse model of GVHD.
Methods: NOD-SCID-IL2Rγnull (NSG) mice injected with 10 × 10⁶ human (h) peripheral blood mononuclear cells (PBMC) were subsequently injected with α,β-methylene ADP (APCP, a CD73 antagonist), CGS21680 (an A2A agonist) or control diluent for 14 days. GVHD development was assessed by weight loss, clinical parameters, and survival. The impact of APCP and CGS21680 on immune cells and cytokines was investigated.
Results: CD73 blockade enhanced weight loss but did not alter clinical score or survival; it also increased serum human interleukin (IL)-2 concentrations. A2A activation increased weight loss but did not impact clinical score or survival. CGS21680 led to a decrease in immunosuppressive regulatory T cells; however, serum tumor necrosis factor (TNF)-α and IL-2 were reduced, and IL-6 was increased.
Conclusion: A2A activation represents a potential therapeutic target for GVHD because it reduces inflammation, but must be weighed against its negative effects on weight and regulatory T cells. Further investigation of A2A activation is therefore warranted before it can be used as a therapeutic strategy for GVHD in humans.
CHANGES IN THE EXTRACELLULAR MATRIX - SIGNS OF REMODELING LEADING TO CHRONIC REJECTION AFTER LUNG TRANSPLANTATION
MULLER Catharina1, HEINKE Paula1, ANDERSSON-SJÖLAND Annika1, SCHULTZ Hans Henrik2, ANDERSEN Claus3, IVERSEN Martin2, WESTERGREN-THORSSON Gunilla1, and ERIKSSON Leif1
1Experimental Medical Science, Lund University, Sweden, 2Section for lung transplantation, Copenhagen University Hospital, Denmark, 3Department of pathology, Copenhagen University Hospital, Denmark
Background: About 50% of lung transplanted patients develop chronic rejection in the form of bronchiolitis obliterans syndrome (BOS) within 5 years after transplantation. BOS is characterized by a decrease in lung function, caused by progressive fibrosis. However, little is known about its initiation. We hypothesize that changes in the distribution of extracellular matrix proteins might be a marker for the disease process.
Methods/Material: Our study aimed to map total collagen, collagen type IV, biglycan and periostin in transbronchial biopsies taken at 3 and 12 months after transplantation, using Masson's trichrome staining and immunohistochemistry. Staining patterns were quantified and related to patient data (n=58) over a 5-year follow-up.
Results: Compartment-specific patterns were revealed between 3 and 12 months post-transplantation. Alveolar total collagen (p=0.019) and small airway biglycan (p=0.02) increased in BOS-developing patients. Alveolar collagen type IV increased in BOS-free patients (p=0.01) (3 vs. 12 months). Individual calculation of the change in protein content (12 minus 3 months for each patient) confirmed the increase in biglycan (p=0.012) and showed a trend for increased periostin (p=0.057) in the small airways of BOS patients compared with BOS-free patients. Already at 3 months, before the onset of BOS, increased total alveolar collagen (p=0.036) and small airway collagen type IV (p=0.034) discriminated between patients developing less severe and severe forms of BOS (BOS grade 1+2 vs. 3).
Conclusion: The results show distinct alterations of the extracellular matrix which might be part of the complex remodeling processes that eventually lead to BOS.
MTORC2 DEFICIENCY IN DENDRITIC CELLS PROMOTES ACUTE KIDNEY INJURY
ROGERS Natasha1,2, DAI Helong2, WATSON Alicia2, and THOMSON Angus2
1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Starzl Transplant Institute, University of Pittsburgh
Introduction: The role of the mammalian/mechanistic target of rapamycin (mTOR) in the pathophysiology of acute kidney injury (AKI) is poorly characterized. Furthermore, the influence of dendritic cell (DC)-based alterations in mTOR signalling in AKI has not been investigated.
Methods: Bone marrow-derived mTORC2-deficient (Rictor-/-) or wild-type (WT) DC underwent hypoxia-reoxygenation and were analysed by flow cytometry. Age- and gender-matched DC-specific Rictor-/- mice or littermate controls underwent bilateral renal ischemia-reperfusion injury followed by assessment of renal function, histopathology, renal DC metabolism, bio-molecular and cell infiltration analysis. Adoptive transfer of WT or Rictor-/- DC to C57BL/6 mice was used to assess migratory capacity.
Results: AKI upregulated expression of phospho-S6K (downstream of mTORC1), but downregulated phosphorylated Akt S473 (downstream of mTORC2) in whole kidney tissue. Rictor-/- DC expressed more CD80/CD86 but less programmed death ligand-1 (PDL1), a difference enhanced by hypoxia-reoxygenation, and demonstrated enhanced migration to the injured kidney. Following AKI, Rictor-/- DC mice developed higher serum creatinine, more severe histologic damage, and greater pro-inflammatory mRNA transcript profiles of IL-1β, IL-6 and TNF-α compared to littermate controls. A greater influx of neutrophils and T cells was seen in Rictor-/- DC mice, in addition to CD11c+MHCII+CD11bhiF4/80+ renal DC, which expressed more CD86 but less PDL1. Rictor-/- DC showed increased TNF-α but significantly reduced IL-10 production, and were glycolytically biased compared to WT DC under both basal and AKI conditions.
Conclusions: These data suggest that mTORC2 signaling in DC negatively regulates AKI, highlighting the regulatory roles of both DC and Rictor in the pathophysiology of renal injury.
TISSUE-RESIDENT LYMPHOCYTES IN SOLID ORGAN TRANSPLANTATION
PROSSER Amy1,2, HUANG Wen Hua3, LIU Liu1, LARMA-CORNWALL Irma4, JEFFREY Gary1, GAUDIERI Silvana2, DELRIVIERE Luc5,3, KALLIES Axel6, and LUCAS Michaela1,7
1Medical School, University of Western Australia, Perth, 2School of Anatomy, Physiology and Human Biology, University of Western Australia, Perth, 3School of Surgery, University of Western Australia, Perth, 4Centre for Microscopy, Characterisation and Analysis, University of Western Australia, Perth, 5WA Liver & Kidney Transplant Service, Sir Charles Gairdner Hospital, Perth, 6Department of Microbiology and Immunology, The Peter Doherty Institute for Infection and Immunity, 7Department of Immunology, Sir Charles Gairdner Hospital, Perth
Introduction: Solid organ transplantation is the standard treatment option for many patients with end-stage diseases. Despite improvements in short-term outcomes, long-term organ graft survival has remained poor for the past two decades. Newly characterised tissue-resident lymphocytes are suspected to play a significant role in graft survival and rejection, although their function in transplantation has not yet been tested. Similarly, the contribution of donor- and recipient-derived lymphocytes to allograft survival and rejection has not been investigated.
Methods: We have performed orthotopic liver transplants in either congenic or MHC mismatched mice. At various timepoints up to one month post-surgery, rejection of the organ was scored histologically and donor- and recipient-derived lymphocytes were analysed in the graft and peripheral organs by flow cytometry. The maintenance and differentiation to a tissue-resident phenotype of various cellular subsets was also assessed.
Results: Tissue-resident lymphocytes were successfully transplanted with the liver, with long-term survival of these cells observed only in a congenic transplantation context. MHC mismatch of donor and recipient mice, however, led to severe rejection and rapid depletion of most donor cells. Vast numbers of recipient lymphocytes also rapidly infiltrated the allograft and upregulated markers associated with tissue residency.
Conclusions: Donor-derived tissue-resident lymphocytes in the murine liver are readily transferrable with whole liver transplantation. Depletion of these cells in MHC mismatched transplants and infiltration of recipient lymphocytes differentiating to a tissue-resident phenotype coincided with severe rejection of the allograft. This suggests tissue-residency of lymphocytes, whether donor- or recipient-derived, is important in the context of solid organ rejection.
OUTCOMES OF WESTERN AUSTRALIAN LUNG TRANSPLANT RECIPIENTS – THE FIRST DECADE
DHILLON Sarbroop1, MCKINNON Elizabeth2, MUSK Michael3, WROBEL Jeremy3, LAVENDER Melanie3, and GABBAY Eli4
1Fiona Stanley Hospital, 2Institute for Immunology and Infectious Diseases, Murdoch University, 3Lung Transplant Service, Fiona Stanley Hospital, 4School of Medicine, University of Notre Dame
Background: Lung transplantation has evolved into an effective treatment option for end-stage lung disease. Growing local demand prompted the establishment of the Advanced Lung Disease Unit at Royal Perth Hospital in Western Australia in 2004. The unit has now operated for just over a decade and recently relocated to Fiona Stanley Hospital; we sought to assess our recipient characteristics and outcomes, and compare them with the international standard.
Method: Basic characteristics of all transplant recipients between 2004 and 2015 were collected at the time of transplant. These data were retrospectively augmented from our electronic hospital medical record system. Survival analysis was performed using the Kaplan-Meier method.
Results: A total of 115 lung transplants were performed. Transplant rates have trended upwards over the years, with 20 lung transplants performed in 2015. Half the recipients were over the age of 50. The most common indications for transplant, each accounting for a quarter of total transplants, were Cystic Fibrosis, Interstitial Pulmonary Fibrosis and Chronic Obstructive Pulmonary Disease. Overall survival rates were 96% at 3 months, 93% at 1 year, 84% at 3 years, and 70% at 5 years (Figure 1). This compares well to international survival rates, published by the International Society of Heart and Lung Transplantation, of 89% at 3 months, 80% at 1 year, 65% at 3 years, and 54% at 5 years.
Conclusion: Lung transplant rates continue to rise and our patients enjoy outcomes that meet the international standard.
ALLOGRAFT OUTCOME FOLLOWING RETRANSPLANTATION OF PATIENTS WITH FAILED FIRST KIDNEY ALLOGRAFT ATTRIBUTED TO NON-ADHERENCE
MANICKAVASAGAR Revathy1, WONG Germaine2,3,4, and LIM Wai H5,6
1Renal Transplant Unit, Sir Charles Gairdner Hospital, Perth, 2Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 3Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 4School of Public Health, University of Sydney, 5Renal & Transplantation Unit, Sir Charles Gairdner Hospital, Perth, 6School of Medicine & Pharmacology, University of Western Australia, Perth
Background: It remains unknown whether first allograft failure secondary to non-adherence leads to increased risk of allograft failure following retransplantation.
Aim: To determine the association between causes of first allograft failure and outcomes following retransplantation.
Materials and Methods: Using the ANZDATA Registry, patients who had received a second kidney transplant between 1960-2014 were included. The association between causes of first allograft failure, death censored graft failure (DCGF) and non-adherence-related DCGF following retransplantation were examined using Cox regression and competing risk analyses.
Results and Discussion: Of 2822 patients who received second kidney allografts, 59 (2%) lost their first allografts from non-adherence. Patients with non-adherence-related first graft failure were younger at the time of first allograft failure (median 25 vs 38 years, p<0.001) and had significantly longer waiting times for retransplantation (waiting time >5 years: 57% vs. 20%, p<0.001) compared with those who lost their first graft from other causes. The adjusted HR for DCGF was 0.76 (95%CI 0.44, 1.32; p=0.342) for those who had lost their first allograft from non-adherence. Following retransplantation, the adjusted subdistribution HR of second allograft failure attributed to non-adherence for patients who had experienced non-adherence-related first allograft failure was 2.84 (95%CI 0.83, 17.79; p=0.082).
Conclusion: In patients who had experienced non-adherence-related first allograft failure, the long-term risk of DCGF in the second allograft was similar to that in patients who had lost their first allografts from other causes. Non-adherence-related allograft failure should not be considered a contraindication to successful retransplantation.
PROPHYLACTIC PLASMA EXCHANGE IS ASSOCIATED WITH A HIGH INCIDENCE OF AMR IN SENSITISED RECIPIENTS
CHAMBERLAIN AJ1, SNIDER J2, POWER DA2, and JB WHITLAM2
1Austin Hospital, Melbourne, 2Department of Nephrology, Austin Hospital, Melbourne
Aims: Thresholds for peri-operative plasma exchange (PPEX) in recipients with donor specific antibody (DSA) and negative crossmatch are not clear. We sought to review indications for and outcomes following PPEX at our centre.
Methods: All adult kidney transplant recipients who received PPEX between 2012 and 2016 were identified. Demographic, immunologic, clinical and plasma exchange treatment data were collected from the clinical record.
Results: 63/251 (24%) recipients received PPEX. Indications were DSA (78%), ABO incompatibility (14%), DSA+ABO incompatibility (3%), and other (5%). Of 51 recipients with DSA who received PPEX, number of DSAs was 1 (57%), 2 (31%), 3 (10%) and 4 (2%). DSA target was class I (50%), class II (36%) and class I+II (14%). The median maximum recipient DSA mean fluorescence intensity (MFI) was 1659 (IQR 1090-2789). 57% of maximum DSA MFI were < 2000. The median number of PPEX treatments was 6 (IQR 4-8). 43% developed antibody mediated rejection (AMR) at median of 40 (IQR 9-269) days post-operatively. Development of AMR was not predicted by DSA number, class or MFI. Time to AMR was predicted by DSA class (class I+II = 6 days, IQR 5-9; class II 15, IQR 9-123; class I 40, IQR 19-207; p=0.03), but not DSA number or MFI.
Conclusions: In this cohort of kidney transplant recipients who received PPEX for relatively low risk HLA sensitisation, development of AMR was common and not predicted by traditional indicators for PPEX. The optimal use of PPEX in this setting is yet to be defined.
POST-TRANSPLANT SURVIVAL IN TYPE 1 DIABETICS IN AUSTRALIA AND NEW ZEALAND
WEBSTER Angela1,2, HEDLEY James1, and Patrick KELLY1
1School of Public Health, University of Sydney, 2Centre for Transplant and Renal Research, Westmead Hospital, Sydney
Introduction: We analysed data from the Australian and New Zealand Pancreas and Islet Transplant Registry (ANZIPTR) as well as the Australian and New Zealand Dialysis and Transplant Registry (ANZDATA) to estimate differences in transplant and patient survival by transplant type among recipients with type 1 diabetes.
Materials and Methods: We conducted an inception cohort study from 1984-2012, using data linkage of ANZIPTR and ANZDATA. We compared kidney graft and patient survival from the date of transplant for SPK and deceased-donor KTA recipients using Cox regression, and censored patients at last known follow-up. We adjusted for age, sex, state/country, previous transplants, age difference between recipient and donor, and immunosuppression used. To meet the proportional hazards assumption we stratified by era (1984-1999, 2000-2012).
Results: We included 1,090 transplant recipients (462 SPK, 493 deceased donor kidney, 135 living donor kidney). SPK recipients had improved kidney survival compared with deceased-donor KTA, both including death with function (graft loss HR 0.35; 95% CI 0.21-0.57; p<0.001) and censored for death (graft loss HR 0.45; 95% CI 0.22-0.90; p=0.02). Patient survival was also better among SPK recipients compared with deceased-donor KTA (death HR 0.48; 95% CI 0.24-0.95; p=0.03).
Conclusion: Overall, patient and kidney transplant survival has improved over time for SPK and KTA recipients. At 5 years, patient survival is >90%, and kidney transplant survival >80%. The diminishing advantage of SPK over KTA may reflect selection bias compared to earlier years when SPK donors were scarcer and PAK was more common.
TUMOUR RESECTED KIDNEY GRAFTS FOR TRANSPLANTATION IN WESTERN AUSTRALIA: OUTCOMES OF THE TRK PROGRAM, 10 YEARS ON
APIKOTOA Sharie1, and Bulang HE1,2
1WA Liver & Kidney Transplant Service, Sir Charles Gairdner Hospital, Perth, 2School of Medicine, University of Western Australia, Perth
Introduction: Kidney transplantation is the definitive treatment for end stage renal failure, and chronic organ shortage is a global issue. The Tumour Resected Kidney (TRK) program was implemented in Western Australia in 2007. The aim of this study was to review the outcomes of TRK transplantation in patients enrolled in the program by the Western Australian Kidney Transplant Service (WAKTS).
Materials and Methods: Data were prospectively collected in a registry of all selected patients receiving TRK transplantation between February 2007 and February 2017. Twenty-seven patients received a TRK transplant. Follow-up ranged from 2 to 10 years, with a median of 7 years. Data were analysed for patient and graft survival, surgical complications, kidney graft function and tumour recurrence.
Results: Twenty-seven TRK were transplanted into patients aged 32-76 years (mean 63 years). Tumour size ranged from 1 to 4 cm (mean 2.7 cm), with histopathology confirming renal cell carcinoma (RCC) in 20 kidneys, chromophobe tumour in 1, papillary RCC in 3 and benign tumours in 4. Complications included urine leakage requiring prolonged drainage in 3 patients, 1 non-functional graft, 1 graft loss and 1 pseudoaneurysm formation. Graft function was satisfactory, with an average creatinine of 135 μmol/L. There has been no tumour recurrence during follow-up.
Conclusion: The outcomes of TRK transplantation have been shown to be satisfactory. The process of cold perfusion and preservation may help prevent tumour recurrence. TRK transplantation is an option for selected recipients under strict criteria.
EPIDEMIOLOGY AND ESTIMATED COST OF COMPLICATED SIMULTANEOUS PANCREAS KIDNEY TRANSPLANTATION
Joshua XU1, HITOS Kerry2,1, HORT Amy2, SHAHRESTANI Sara2, ROBERTSON Paul2,1, YUEN Lawrence2,1, RYAN Brendan2,1, ROO Ronald DE2,1, HAWTHORNE Wayne3,4, and Henry PLEASS2,1
1Westmead Clinical School, University of Sydney, 2Department of Surgery, Westmead Hospital, Sydney, 3Discipline of Surgery, Sydney Medical School, University of Sydney, 4Centre for Transplant and Renal Research, Westmead Institute of Medical Research
Background: Simultaneous pancreas-kidney transplantation (SPK) is a well-established treatment for type 1 diabetes mellitus and end-stage renal disease. Limited detail on cost-effectiveness exists for transplantation and treatment complexity specifically relating to surgery and complications.
Aims: To examine the cost associated with SPK transplantation for in-hospital admissions based on co-morbidities, complications and procedure complexity.
Methods: 234 SPK transplantations were reviewed at Westmead Hospital (2008-2017). Donor and recipient demographic details, co-morbidities, operative characteristics, hospital and ICU length of stay (LOS), enteric leaks and graft thrombosis were collected. Estimated DRG price weights and national weight activity units (NWAU) were used to calculate in-patient costs (Australian dollars (AUD)).
Results: Median donor and recipient age was 26 years (IQR:19-34) and 39 years (IQR:34-44) respectively. Median donor BMI was 24.2 kg/m2 (IQR:21.9-25.6) and 24.2 kg/m2 (IQR:21.7-27.7) for recipients. Median in-hospital LOS was 9 days (IQR:8-12). Overall, 17% of recipients experienced graft thrombosis and 5.2% enteric leaks. Complications such as sepsis, haemodialysis, enteric leaks and ICU admission increased the in-hospital and surgery costs per patient by more than $49,237 AUD. Greater complexity, such as an increase in ICU LOS, and factors such as volume depletion, infection, thrombosis, hypertension, hyperkalemia, osteoporosis, re-operation and asthma increased the additional in-hospital cost per patient by more than $153,841 AUD compared to uncomplicated cases.
Conclusions: Treatment complexity, co-morbidities, ICU LOS, enteric leaks, graft thrombosis and re-operation greatly influence in-hospital costs. This economic impact is further amplified when wages, patient assessments, organ retrieval, pharmaceutical needs, monitoring and long-term follow-up costs are added.
LONG-TERM GRAFT SURVIVAL AND FUNCTION IN RECIPIENTS OF DCD COMPARED TO DBD RENAL ALLOGRAFTS: A SINGLE CENTRE REVIEW.
SALTER Sherry1, TAN Sarah1, MULLEY William2,3, CHAMBERLAIN Stacey1, POLKINGHORNE Kevan2,3, SAUNDER Alan1, and KANELLIS John2,3
1Department of Surgery, Monash Medical Centre, Melbourne, 2Department of Nephrology, Monash Medical Centre, Melbourne, 3Centre for Inflammatory Diseases, Department of Medicine, Monash University, Melbourne
Aim: We previously described our short-term outcomes for DCD compared with DBD renal allograft recipients and now sought to extend those comparisons for long-term patient survival, graft survival and graft function between these groups.
Methods: Retrospective cohort study. All patients receiving a renal transplant from a deceased donor at our centre between 1 January 2010 and 30 April 2013 were included. Multi-organ transplant recipients were excluded. Baseline patient characteristics were compared. Graft and patient survival and mean eGFRs were compared between DCD and DBD recipients using the log-rank test and Student's t-test respectively.
Results: The group comprised 91 DBD and 39 DCD recipients. The median follow-up was 6.2 years (range 4.7 to 8.0 years). There were no differences in donor or recipient age between groups; however, there was more delayed graft function in the DCD group. DCD kidney recipients had a longer length of admission (9.5 ± 8.3 vs 11.6 ± 5.1 days; rank-sum P<0.01). Mean eGFR was significantly lower in the DCD group until 2 months post-transplant, after which there were no differences out to 8 years. There were 16 recipient deaths and 5 graft losses during the study period. Patient and graft survival (Figure) were not different between groups.
Conclusion: Equivalent outcomes in the longer term can be achieved with DCD and DBD renal allografts with similar donor characteristics. Early differences in the rate of DGF and renal function did not result in differences in graft survival or renal function after the first 2 post-transplant months.
STROKE MORTALITY IN KIDNEY TRANSPLANT RECIPIENTS: A POPULATION-BASED COHORT STUDY USING DATA LINKAGE
DE LA MATA Nicole1, MASSON Philip2, AL-SHAHI SALMAN Rustam3, KELLY Patrick1, WEBSTER Angela1,4
1Sydney School of Public Health, University of Sydney, 2Department of Renal Medicine, Royal Free London NHS Foundation Trust, 3Centre for Clinical Brain Sciences, University of Edinburgh, 4Centre for Transplant and Renal Research, Westmead Hospital, Sydney
Aims: We aimed to compare stroke deaths in kidney transplant recipients with the general population.
Methods: We established the primary cause of death for incident kidney transplant recipients using data linkage between the Australian and New Zealand Dialysis and Transplant Registry (ANZDATA) and national death registries: Australia, 1980-2013 and New Zealand, 1988-2012. We used indirect standardisation to estimate standardised mortality ratios (SMR) with 95% confidence intervals (CI) and a competing risks regression model to identify risk factors for stroke and non-stroke mortality.
Results: Among 17,621 kidney transplant recipients, there were 158 stroke deaths and 5,126 non-stroke deaths in 160,332 person-years of follow-up. Stroke death rates increased steadily from transplantation. Stroke SMRs were higher in younger recipients, and particularly in females (Fig. 1). Kidney transplant recipients aged 30-49 had far more stroke deaths than expected from general-population rates (females: SMR 21.3, 95% CI: 13.9-32.7; males: SMR 9.9, 95% CI: 6.2-15.9). A higher risk of stroke death was associated with older age at transplant, earlier year of transplant and prior known cerebrovascular disease.
Conclusion: Stroke mortality is significantly higher among kidney transplant recipients than in the general population, particularly for young people and females. Cardiovascular risk factor control and acute stroke interventions have reduced stroke mortality in the general population, but their effectiveness, and the extent to which they are used, in kidney transplant recipients are less clear.
THE ASSOCIATION BETWEEN ETHNICITY, ALLOGRAFT FAILURE AND MORTALITY AFTER KIDNEY TRANSPLANTATION IN INDIGENOUS AND NON-INDIGENOUS AUSTRALIANS: IS THIS EXPLAINED BY ACUTE REJECTION?
HOWSON Prue1, IRISH Ashley2, D'ORSOGNA Lloyd3,4, SWAMINATHAN Ramyasuda2, PERRY Gregory5, DE SANTIS Dianne3, WONG Germaine6,7,8, and LIM Wai H1,4
1Department of Renal Medicine, Sir Charles Gairdner Hospital, Perth, 2Department of Renal Medicine, Fiona Stanley Hospital, Perth, 3Department of Immunology, Fiona Stanley Hospital, Perth, 4School of Medicine, University of Western Australia, Perth, 5Department of Renal Medicine, Royal Perth Hospital, 6University of Sydney, 7Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 8Centre for Kidney Research, The Children's Hospital at Westmead, Sydney
Aim: We aimed to determine whether acute rejection (AR) was a mediator between ethnicity (Indigenous/Non-Indigenous), allograft failure and mortality after kidney transplantation and whether ethnicity was a risk factor for allograft failure and mortality in those who had experienced AR.
Materials and Methods: End-stage kidney disease patients who have received a kidney-only transplant between 2000-2010 in Western Australia were included. Cox proportional modelling was used to determine the association between ethnicity, AR, allograft failure and all-cause mortality. Mediation analysis was conducted to determine whether AR was a causal intermediate between ethnicity and outcomes, and propensity-scored analysis was used to examine the association between ethnicity and outcomes in recipients who had experienced AR.
Results and Discussion: Of 618 patients who received a kidney transplant, 59 (9.5%) were Indigenous. During a median (IQR) follow-up of 7.9 (5.7) years, Indigenous recipients were more likely to experience AR (73% vs. 42%, p<0.001), allograft failure (66% vs. 37%, p<0.001) or death (44% vs. 25%, p=0.002) compared with non-Indigenous recipients, with adjusted hazard ratios (HR) of 1.86 (95%CI 1.28-2.70, p<0.001), 2.17 (1.97-4.00, p<0.001) and 2.35 (1.49-3.71, p<0.001), respectively. Approximately 29% and 2% of the effects of ethnicity on allograft failure and death, respectively, were explained by AR. In the propensity-score analysis of recipients who had experienced AR (1:1 ratio of Indigenous/non-Indigenous recipients matched for recipient age, donor type, HLA mismatches and diabetes), Indigenous recipients remained at a higher risk of allograft failure and death, with adjusted HR of 1.88 (1.09-3.25, p=0.023) and 2.18 (1.03-4.60, p=0.041) respectively.
Conclusions: Acute rejection explained almost 30% of the association between ethnicity and allograft failure. Following rejection, the risk of mortality was 2-fold greater in Indigenous compared with non-Indigenous recipients. A greater understanding of the factors contributing to these adverse outcomes is urgently required.
REVIEW OF THE NEW ZEALAND (NZ) EXPERIENCE WITH DONATION AFTER CIRCULATORY DEATH (DCD) KIDNEY TRANSPLANTATIONS 2008-2016
SUN Tina1, DITTMER Ian2, and MATHESON Philip3
1Department of Renal Medicine, Middlemore Hospital, 2Auckland Renal Transplant Group, Auckland City Hospital, 3Department of Renal Medicine, Wellington Hospital
Aim: To review the NZ national experience and outcomes of DCD kidney transplantation since its introduction in 2008.
Background: Deceased donor kidney donation in NZ has been exclusively from donation after brain death donors for many years. This changed in 2008 with the introduction of DCD transplantations to increase the availability of deceased donors.
Method: A retrospective review of DCD kidney transplantations performed in NZ between January 2008 and December 2016, with follow-up until March 2017. Patients were identified from the ANZDATA registry and the Organ Donation New Zealand database. Data collected were: age, gender, ethnicity, mortality, immediate and long-term graft function, cold ischaemic time, graft number and co-morbidities.
Results: A total of 42 DCD transplantations from 22 donors were performed in NZ during the study period. The majority of recipients were male (71%), with a mean age of 50.1 (±14.4) years. DGF requiring renal replacement therapy developed in 57% of recipients, for a mean duration of 7.25 (±5.7) days after transplantation. There was no primary graft non-function. All-cause graft survival was 90% at 1 year, 86% at 2 years, and 86% at 5 years. Death-censored graft survival was 100% at 1 year, 95% at 2 years, and 95% at 5 years. Mean creatinine was 181, 140, 139, 132, and 150 µmol/L at 1 month, 3 months, 6 months, 1 year, and 5 years after transplantation respectively.
Conclusion: DCD kidney transplantations in NZ had favourable long-term graft survival with good renal function, despite a high rate of delayed graft function in the early post-transplantation period.
LONG-TERM OUTCOME OF KIDNEY TRANSPLANTATION IN PATIENTS WITH CONGENITAL ANOMALIES OF THE KIDNEY & URINARY TRACT
MCKAY Ashlene1, KIM Siah1,2, and KENNEDY Sean1,2
1Department of Nephrology, Sydney Children's Hospital, 2School of Women's & Children's Health, University of New South Wales, Sydney
Aim: Congenital anomalies of the kidney and urinary tract (CAKUT) are a leading cause of end stage kidney failure in the young. However, there is limited information on long-term outcomes after kidney transplantation in this group. We explored the outcomes of kidney transplantation in patients with the 3 most common severe forms of CAKUT: posterior urethral valves (PUV), reflux nephropathy and renal hypoplasia/dysplasia.
Methods: Data were extracted from ANZDATA on all first kidney transplants performed between 1976 and 2015 in recipients with a primary diagnosis of PUV, reflux nephropathy or renal dysplasia, who were younger than 30 years when they received their transplant. Using multivariate Cox regression, we compared death censored graft survival between the three groups.
Results: 142 patients with PUV, 272 with renal dysplasia and 938 with reflux nephropathy were included. Ten-year graft survival in PUV, renal dysplasia and reflux nephropathy was 67%, 72% and 64% respectively, and 20-year graft survival was 32%, 51% and 43%.
After adjusting for age at transplant, era of transplantation, graft source and HLA matching, there was no significant difference in graft survival, although there was a trend to poorer outcome in PUV (HR 1.31, 95% CI 0.93 to 1.84).
Conclusions: Graft survival of first transplant in CAKUT is favourable at 10 years. We report a trend towards poorer graft survival for patients with PUV. Larger studies are required to determine whether the risk of graft failure is increased in patients with PUV.
COMPARISON OF KIDNEY ALLOGRAFT SURVIVAL IN THE EUROTRANSPLANT REGION AFTER CHANGING THE ALLOCATION CRITERIA IN 2010 – A SINGLE CENTER EXPERIENCE
MEHDORN Anne-Sophie1, BECKER Felix1, REUTER Stefan2, SUWELACK Barbara3, SENNINGER Norbert1, VOGEL Thomas1, PALMES Daniel3, and BAHDE Ralf3
1Department of General, Visceral and Transplant Surgery, University Hospital Muenster, Germany, 2Department of Nephrology, University Hospital Muenster, Germany, 3University Hospital Muenster, Germany
In 2010 Eurotransplant introduced the European Senior Program (ESP), aiming to avoid waiting-list competition between young and elderly patients suffering from end stage renal disease and thus shorten waiting times for both groups. ESP donors must be older than 65 years, and grafts are preferably allocated regionally in order to shorten cold ischaemia time, without primarily taking HLA matching into account. This study aims to compare a historic cohort with a cohort receiving grafts according to the new guidelines.
We stratified 159 eligible patients >65 years (ESP, n=69; former allocation criteria, n=89) from the transplant centre of Muenster, Germany, and analysed patient and graft survival as well as surrogate markers of short- and long-term graft function (acute rejection, primary function (PF), delayed graft function (DGF) and glomerular filtration rate (GFR)).
While donors were comparable in both groups, recipients in the ESP group were significantly older (69.51 ± 3.42 vs. 67.06 ± 2.59 years, p < 0.05), had a significantly shorter time on dialysis (13.64 ± 20.06 vs. 60.17 ± 28.06 months, p < 0.05) and suffered from more comorbidities. Cold and warm ischaemia times were significantly reduced in the ESP group, which also had more grafts with PF. Long-term graft function was similar. Yet, graft survival was significantly better in the ESP group. Overall patient survival was comparable after five years.
Patients receiving grafts from older donors according to the new ESP-criteria did not have disadvantages compared to patients receiving grafts according to the former allocation criteria.
DONATION AFTER CIRCULATORY DEATH COMPARED WITH DONATION AFTER BRAIN DEATH: OUTCOMES FOR ISLET TRANSPLANTATION IN AUSTRALIA
HAWTHORNE Wayne1,2, CHEW YiVee2, WILLIAMS Lindy2, HARON Christian2, HITOS Kerry1, MARIANA Lina3, KAY Tom3, O'CONNELL Philip2,4, and LOUDOVARIS Tom3
1Sydney Medical School, University of Sydney, Westmead Hospital, Sydney, 2Centre for Transplant and Renal Research, The Westmead Institute of Medical Research, 3Tom Mandel Islet Transplant Program, St Vincent's Institute, Melbourne, 4Western Clinical School, University of Sydney
Introduction: Islet cell transplantation provides long-term insulin independence for T1D patients with severe hypoglycaemic unawareness. A significant shortage of organ donors results in patients remaining on the waitlist for years. Donation after Circulatory Death (DCD) donors may be a resource that could help address this shortage.
Materials and Methods: Islet donor pancreata from the Australian National Islet Transplant program were compared, with multiple donor and isolation outcome variables analysed.
Results: A total of 27 DCD and 73 DBD islet donor pancreata were compared, with no significant differences in donor characteristics between DCD and DBD. Post-purification yield (IEQ) was significantly lower in the DCD group (146,518±28,971) than the DBD group (256,986±17,652; P=0.001), as was post-purification yield per gram of pancreas (2,154±504 vs. 2,681±372 IEQ/g, P<0.0001). The quality and functionality of DCD and DBD islets also differed significantly, with both viability (P=0.017) and purity (P=0.001) higher in DBD than DCD. The proportion of DCD islet preparations transplanted (1/27) was significantly lower than DBD (29/73) (OR, 0.1093; 95% CI; P=0.001).
Conclusion: In the Australian setting, with vast distances over which pancreata must be shipped, we have had poorer outcomes from DCD pancreata for islet isolation, which have thus far not been comparable to those from our DBD donors. Earlier intervention, the use of ante-mortem heparin and faster transport logistics may not only improve DCD organs for transplantation but also help alleviate donor shortages, allowing treatment of those with T1D and severe hypoglycaemic unawareness.
IMMUNOSUPPRESSANT PRESCRIBING PRACTICES IN YOUNGER ADULTS COMPARED TO ELDERLY RENAL TRANSPLANT RECIPIENTS ACROSS AUSTRALIA AND NEW ZEALAND
COSSART Amelia1, COTTRELL Neil1, MCSTEA Megan2, ISBEL Nicole3, CAMPBELL Scott3, and STAATZ Christine3
1School of Pharmacy, University of Queensland, Brisbane, 2University of Queensland, Brisbane, 3Department of Nephrology, University of Queensland at the Princess Alexandra Hospital
Background: Kidney transplantation is the first-line treatment for patients with end-stage renal failure. Optimising immunosuppressant regimens is crucial, yet current guidelines make no specific recommendations for elderly patients.
Aim: To evaluate differences in immunosuppressant prescribing between elderly and younger adult renal transplant recipients across Australia and New Zealand.
Methods: A descriptive study was conducted using data from the ANZDATA (Australia and New Zealand Dialysis and Transplant) registry, including all patients transplanted from 2000 to 2015. Patients were categorised by age as younger adults (<70 years) or elderly (≥70 years). The types and doses of immunosuppressant medicines prescribed were compared between groups using descriptive statistics (Mann-Whitney or chi-square test, as appropriate).
Results: A total of 6,930 patients were included in the analysis; 39% of younger adults and 41% of elderly patients were female, with average ages of 48 and 72 years respectively. The three most commonly prescribed immunosuppressant drugs were prednisolone, mycophenolate and tacrolimus, with 87% of younger adults and 89% of elderly patients taking three immunosuppressant medicines. Initial doses of mycophenolate and tacrolimus were significantly lower in elderly patients (p<0.05), and this trend continued at one year, with doses of mycophenolate, tacrolimus, cyclosporin A and azathioprine significantly lower in elderly recipients (p<0.05; Figure 1). Elderly patients also had greater median dose reductions from initiation to one year post-transplant for mycophenolate and azathioprine (p<0.05).
Conclusions: In our sample, immunosuppressant medicine doses were reduced more in elderly patients. Further investigation of drug levels and patient outcomes in the elderly is warranted.
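The two kinds of group comparison used in this abstract (Mann-Whitney U for dose distributions, chi-square for categorical prescribing rates) can be sketched as follows. This is purely illustrative: all numbers below are invented, not ANZDATA registry values.

```python
import numpy as np
from scipy.stats import mannwhitneyu, chi2_contingency

rng = np.random.default_rng(0)
# Invented initial dose distributions (g/day) for the two age groups
young_dose = rng.normal(2.0, 0.5, 200)
elderly_dose = rng.normal(1.6, 0.5, 80)

# Continuous doses: Mann-Whitney U test (non-parametric)
u_stat, p_dose = mannwhitneyu(young_dose, elderly_dose, alternative="two-sided")

# Categorical outcome (triple therapy yes/no) by group: chi-square test
# on a 2x2 contingency table of invented counts
table = np.array([[174, 26],   # younger adults: ~87% on three agents
                  [71, 9]])    # elderly: ~89%
chi2, p_triple, dof, _ = chi2_contingency(table)

print(f"Mann-Whitney P = {p_dose:.2e}; chi-square P = {p_triple:.3f} (dof={dof})")
```

With these simulated groups the dose difference is highly significant while the triple-therapy proportions are not, mirroring the shape (not the values) of the comparison reported above.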
RENAL TRANSPLANT PATIENT AND GRAFT SURVIVAL UNAFFECTED BY POST-TRANSPLANT DIABETES IN THE ERA OF LOW MAINTENANCE IMMUNOSUPPRESSION
PIMENTEL AL1,2, MASTERSON R2, YATES C3,4, HUGHES P2, and COHNEY S5,6,7
1Graduate Program in Endocrinology, Universidade Federal do Rio Grande do Sul (UFRGS), 2Department of Nephrology, Melbourne Health, 3Department of Diabetes and Endocrinology, Melbourne Health, 4Department of Endocrinology, Western Health, 5Department of Nephrology, Western Health, 6Department of Medicine, University of Melbourne, 7Department of Epidemiology, Monash University, Melbourne
Aims: Preexisting diabetes (PEDM) and newly detected diabetes after transplantation (PTDM) have been associated with reduced patient and graft survival; however, outcome data since the adoption of lower maintenance immunosuppression are sparse. This study examined outcomes according to diabetes status in patients undergoing renal transplantation between December 2004 and 2009, with patients receiving prednisolone ≤ 5 mg, MMF ≤ 500 mg b.d. and tacrolimus ≤ 4 ng/mL beyond 12 months.
Methods: All patients transplanted between December 2004-2009 were analyzed using prospectively collected data from an electronic database, patient records and ANZDATA. Diabetes status was determined using HbA1c, blood glucose levels and/or use of glucose lowering therapy.
Results: 534 patients were assessed, 7 of whom received more than one kidney transplant. Mean age was 45.2±14.1 years; 64.6% were male; 63 had PEDM and 86 developed PTDM (64 diagnosed within 12 months, 22 subsequently). After a mean follow-up of 9.2±2.2 years, patient survival was 89.9%, 81% and 90.6%, respectively, in non-DM, PEDM and PTDM diagnosed within the first year; considering PTDM diagnosed at any time, patient survival was 87.2%. Mean tacrolimus level was 3.7±2.3 ng/mL at 1 year and <4 ng/mL at 4 years. Graft survival was 70.5%, 69.8% and 73.3%, respectively, in non-DM, PEDM and those with PTDM diagnosed within 12 months, and 76.6% considering patients diagnosed with PTDM at any time (Figure 1).
Conclusions: In this large single-centre analysis of renal transplant recipients receiving more contemporary immunosuppression, PTDM had no impact on patient or graft survival, though there was a statistically significant reduction in patient survival among patients with PEDM.
RANGE AND CONSISTENCY OF CARDIOVASCULAR OUTCOMES REPORTED IN CONTEMPORARY RANDOMISED TRIALS IN KIDNEY TRANSPLANT PATIENTS: A SYSTEMATIC REVIEW
VAN Kim Linh1,2, O'LONE Emma1,2, TONG Allison1,2, VIECELLI Andrea3, HOWELL Martin1,2, SAUTENET Benedicte4, MANERA Karine1,2, and CRAIG Jonathan1,2
1Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 2School of Public Health, University of Sydney, 3University of Queensland, Brisbane, 4University of Tours
Background: Cardiovascular disease (CVD) is the primary cause of death and a major contributor to graft loss in kidney transplant recipients. However, inconsistent reporting of cardiovascular outcomes may limit assessment of the comparative effect of interventions across trials and the use of trial evidence in decision-making.
Aims: To determine the scope and consistency of cardiovascular outcomes reported in contemporary trials in kidney transplant recipients.
Methods: MEDLINE, Embase, the Cochrane Kidney and Transplant Specialised Register, and ClinicalTrials.gov were searched from 2013 to 2017 to identify randomised trials and trial protocols reporting any cardiovascular outcome. Definitions, measures and timepoints for all CVD outcomes were extracted and analysed.
Results: From 81 trials, 1097 different CVD measures were extracted and categorised into 37 CVD outcomes. The three most frequently reported outcomes were cardiovascular composites (35 [43%] trials), all-cause mortality (29 [36%] trials) and acute coronary syndrome (28 [35%] trials). Cardiovascular composites were reported in 33 different combinations of components, 29 of which were unique to a single trial.
Conclusions: There is extreme heterogeneity in the reporting of cardiovascular outcomes in trials in kidney transplant patients, and CVD composite outcomes vary widely. Establishing a standardised CVD outcome that is critically important to patients and clinicians may improve the relevance and use of trials to inform decision-making.
DO INDIGENOUS PATIENTS HAVE BETTER SURVIVAL WITH A KIDNEY TRANSPLANT COMPARED TO STAYING ON DIALYSIS? A PROPENSITY MATCHED STUDY
LAWTON Paul1, CUNNINGHAM Joan1, ZHAO Yuejen2, JOSE Matthew3, and CASS Alan1
1Wellbeing & Preventable Chronic Diseases Division, Menzies School of Health Research, Charles Darwin University, 2Innovation & Research Branch, Department of Health, Northern Territory Government, 3Department of Medicine, University of Tasmania
Aim: Indigenous patients are less likely than non-Indigenous patients to be wait-listed for or receive a kidney transplant, and many clinicians are concerned about Indigenous transplant outcomes. We compared survival for Indigenous transplant patients with that of similar Indigenous dialysis-only patients, contrasting the results with non-Indigenous patients.
Methods: Using ANZDATA, all Australians commencing renal replacement therapy from 1st April 1995 were followed until 31st December 2015. Transplant recipients were paired by propensity score with similar dialysis-only patients of the same ethnicity within four time cohorts: time at risk for each pair was taken from the transplant date. All-cause survival was compared using unadjusted and stratified Cox proportional hazards models for three time periods post-transplant (accounting for non-proportional hazards), adjusted for demographic and clinical differences and a transplanted-remoteness interaction term.
Results: Indigenous dialysis-only patients were similar to their transplanted pair at baseline, but paired non-Indigenous patients were less similar. Unadjusted five year survival was better for transplanted patients than their dialysis-only pair for non-Indigenous (p<0.0001) and Indigenous patients (p=0.0005). Adjusted Cox models comparing transplanted with dialysis-only patients showed early (0-0.25 years post-transplant) survival equivalence for both Indigenous and non-Indigenous patients, with improvements in subsequent transplanted survival clearest for all non-Indigenous patients except from very remote areas, and for Indigenous patients in major cities (MC) and inner regional (IR) areas.
Conclusions: Indigenous transplanted patients have similar or better survival than comparable dialysis-only patients, with long-term benefit in MC/IR areas. Comparatively few apparently suitable Indigenous patients received transplants. These data provide direction for future targeted clinical and health services research.
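The propensity-score pairing step in the Methods above can be sketched as follows. This is a minimal illustration on simulated data: the covariates, logistic model and greedy nearest-neighbour rule are assumptions for demonstration, not the authors' actual ANZDATA specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
# Invented covariates (e.g. age, comorbidity score, dialysis vintage; standardised)
X = rng.normal(size=(n, 3))
# Treatment (transplant) assignment depends on the first covariate
treated = (rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))).astype(int)

# Propensity score: modelled probability of transplant given covariates
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Greedy 1:1 nearest-neighbour matching on the propensity score,
# without replacement (each dialysis-only control used at most once)
controls = list(np.where(treated == 0)[0])
pairs = []
for t in np.where(treated == 1)[0]:
    if not controls:
        break
    j = min(controls, key=lambda c: abs(ps[c] - ps[t]))
    pairs.append((t, j))
    controls.remove(j)

print(f"{len(pairs)} matched transplant/dialysis-only pairs")
```

In the study, follow-up time for each pair then starts at the transplant date, so that the matched dialysis-only patient inherits the same time origin.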
Cells and Tissues/Donor Specific Antibodies
SURVIVAL AND FUNCTION OF HUMAN ADRENAL CELLS IN AN IMMUNOISOLATION DEVICE IN ADRENALECTOMIZED IMMUNODEFICIENT MICE
CATTERALL T1, KRISHNA MURTHY B1, MARIANA L1, KOS C1, SACHITHANANDAN N2, THOMAS H1, LOUDOVARIS T1, and KAY T1
1Immunology & Diabetes, St Vincent's Institute, Melbourne, 2Department of Endocrine and Metabolism, St Vincent's Hospital, Melbourne
Background: Primary adrenal insufficiency (PAI) is caused by failure of the adrenal gland to produce steroid hormones - glucocorticoids and mineralocorticoids - and is a potentially lethal disease. Synthetic steroid hormones have transformed PAI from a lethal condition into a chronic one. However, management of PAI remains challenging for patients and clinicians, as current regimens do not restore or replicate normal cortisol secretion either under normal conditions or during illness and stress. Hence, with current treatment, mortality is not normalised and quality of life is poor.
Aim: To treat PAI patients with adrenocortical cells in immunoisolation devices to restore physiological steroid hormone secretion without needing immunosuppression.
Method: We studied the survival and function of isolated human adrenal cells in vitro and in vivo in NRG SCID-mutated NOD mice. About 300×10^6 adrenocortical cells per human adrenal gland are routinely obtained, with >80% viability and in vitro survival and function for more than 14 days. A cohort of 10 immunodeficient mice was implanted with an immunoisolation device in the epididymal or ovarian fat pad to allow vascularisation to establish. After 4 weeks, mice underwent bilateral adrenalectomy and 5 million human adrenocortical cells were transplanted into the device. One mouse died almost 4 weeks after adrenalectomy; the remaining mice are healthy and have been followed for >10 weeks, secreting cortisol and responding to stimulation with synthetic ACTH 1-24 (Synacthen).
Conclusion: Results indicate survival and function of human adrenal cells in the vascularised encapsulation device.
DEVELOPING PHOSPHOLIPASE A2 RECEPTOR ScFv FOR CAR TREGS FOR THE TREATMENT OF AUTOIMMUNE RENAL DISEASE
KARUNIA J1, WANG YM2, ZHANG GY2, WILARAS A2, BAKHTIAR M2, MCCARTHEY H2, and ALEXANDER SI1
1Centre for Kidney Research, Children's Hospital at Westmead, 2Centre for Transplant and Renal Research, Westmead Institute for Medical Research, 3Children's Hospital at Westmead
Background: Idiopathic membranous nephropathy (IMN) is a leading cause of autoimmune renal disease, driven in many cases by the recently described cognate antigen M-type phospholipase A2 receptor (PLA2R) expressed on glomerular epithelium. Chimeric antigen receptor (CAR) T cells use antibody fragments to direct T cells to specific antigens and have achieved clinical success in cancer. This strategy can be translated to IMN, in which the target antigen PLA2R is exclusively expressed on the podocyte lining of the kidneys.
Aims: In this project, we aim to use PLA2R as a target antigen for treating IMN: by generating a PLA2R-specific monoclonal antibody reactive with human, mouse and rat podocytes, we aim to design a single-chain variable fragment (scFv) for use in PLA2R-directed CAR-Tregs.
Method: Using genetic sequence search tools, the PLA2R amino acid sequences of three species (human, mouse and rat) were aligned and compared to generate three common peptide immunogens. Using a conditionally immortalised podocyte cell line (ciPod), we examined M-type PLA2R expression on human podocytes in vitro immunohistochemically as an assay for antibody testing. Mice were immunised with the PLA2R peptides to produce monoclonal antibodies against PLA2R. Hybridomas were established and screened, and the hybridoma antibody was sequenced for use in making the scFv for the CAR construct.
Results: We have confirmed cell-membrane expression of M-type PLA2R on human podocytes in vitro by immunohistochemical staining. Anti-PLA2R monoclonal antibody (mAb) was detected in the sera of immunised mice by Western blot and ELISA, and the mAb from these hybridomas is reactive with human M-type PLA2R. The mAb hybridoma is being sequenced.
Conclusion: We have developed hybridomas against a podocyte target antigen that is also a disease antigen in membranous nephritis and are developing this as a kidney targeting strategy.
COMPARISON OF PANCREATA AND ISLET PREPARATIONS FROM HUMAN ORGAN DONORS
MARIANA Lina, LOUDOVARIS Thomas, KOS Cameron, PAPAS Evan, SELCK Claudia, CATTERALL Tara, THOMAS Helen and KAY Thomas WH
Immunology & Diabetes, St Vincent's Institute, Melbourne
Background: Over the past ten years we have received over 300 pancreata, most of which were processed into islets for transplantation and/or research. Forty-nine donors resulted in transplants (into 26 diabetic recipients, 5 as autotransplants), 49 were diabetic (T1D and T2D), 24 were non-heart-beating (DCD) donors, and the remainder were heart-beating brain-dead (DBD) donors. Many factors influence the outcome of islet isolation. Here we compare donor, pancreas and islet-preparation characteristics of transplantable isolations with isolations that failed to meet transplant criteria, including diabetic and DCD donors.
Methods: Islets were isolated using the Ricordi method. The Edmonton score was used as an indicator of donor quality; it incorporates age, CIT, BMI, cause of death, hospital stay, amylase/lipase, procuring team, medical history, pancreas fat content, quality of flush, and damage.
Results: As shown in the table, islet yield per gram of pancreas in diabetic donors was significantly lower than in non-diabetic donors, both pre- and post-purification. Similar results were found for glucose-stimulated insulin secretion across the groups. T1D pancreata were not only deficient in islet numbers; the pancreata themselves were significantly smaller than in the other groups.
While there was no difference in body weight between the two groups, T2D: 86.02±16 kg vs non-diabetic: 85.07±21 kg, the IEQ/kg body weight was significantly different, T2D: 1907±1410 vs non-diabetic: 4344±2577 (t-test p<0.0001).
Conclusion: Donor quality in the transplant group was significantly higher than in the other groups as measured by the Edmonton score. T2D donors are insulin resistant and have deficient islet function, and our data show that fewer islets can be isolated from T2D than from non-diabetic donors, further validating their exclusion as islet transplant donors.
MACHINE LEARNING PREDICTION FOR DE NOVO DONOR SPECIFIC ANTIBODIES (DNDSA) AND GRAFT LOSS IN SIMULTANEOUS KIDNEY PANCREAS TRANSPLANT (SPK) RECIPIENTS
COOREY Craig1,2, SHARMA Ankit1,2, CHAPMAN Jeremy3, CRAIG Jonathan1,2, O'CONNELL Philip3, LIM Wai4, NANKIVELL Brian3, TAVERNITI Anne2, WONG Germaine1,2,3, and YANG Jean5,6
1School of Public Health, University of Sydney, 2Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 3Centre for Transplant and Renal Research, Westmead Institute for Medical Research, Westmead Hospital, Sydney, 4Department of Renal Medicine, Sir Charles Gairdner Hospital, Perth, 5School of Mathematics and Statistics, University of Sydney, 6Charles Perkins Centre, University of Sydney
Aim: To develop a prediction model for dnDSA and allograft loss based on the location of eplet mismatches in SPK transplant recipients.
Methods: A total of 198 SPK transplant recipients (1990-2017) were assessed using data from the ANZDATA registry and the National Organ Matching System. Machine learning models (random forests) were used to predict dnDSA and allograft loss based on the location of eplet mismatches. The three most important eplet mismatch sites were determined using mean decrease in accuracy.
Results: The cohort included 111 (56%) males, with a mean age of 38.5 years (SD 6.9) and a median follow-up of 6.6 years (IQR 3.9-11.0). The most common Class I and II eplet mismatches were at 156RA (35%), 82LR (35%) and 76EN (33%), and at 70D (55%), 56PD (54%) and 67I (52%), respectively. A total of 38 (20%) and 56 (32%) recipients developed Class I and II dnDSA, and 14 (7%) and 29 (15%) patients experienced kidney and pancreas graft loss, respectively. The random forest models using the location of eplet mismatches as features achieved mean cross-validation errors of 47.6% and 49.8% for Class I and II dnDSA, and 52.3% and 49.1% for kidney and pancreas allograft loss (Table 1). For dnDSA prediction, the three most important Class I eplet mismatches were in HLA-A antigens, while among Class II eplet mismatches only DQB1*03 was implicated.
Conclusions: The most important eplet mismatch locations for prediction differed between dnDSA and allograft loss, but random forest model performance was largely indistinguishable between outcomes.
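The random-forest approach described above can be sketched on simulated data. This is not the authors' pipeline: binary present/absent mismatch features, the toy outcome and scikit-learn's permutation importance (a common stand-in for "mean decrease in accuracy") are all assumptions for illustration; the eplet names are borrowed from the Results only as labels.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(42)
n = 198
features = ["156RA", "82LR", "76EN", "70D", "56PD", "67I"]
# Each feature: eplet mismatch present (1) or absent (0) for a recipient
X = rng.integers(0, 2, size=(n, len(features)))
# Toy binary outcome loosely tied to the first feature (156RA)
y = (X[:, 0] + rng.random(n) > 1.2).astype(int)

rf = RandomForestClassifier(n_estimators=200, random_state=0)
# Mean 5-fold cross-validation error (1 - accuracy)
cv_error = 1 - cross_val_score(rf, X, y, cv=5).mean()

# Permutation importance: accuracy drop when each feature is shuffled
rf.fit(X, y)
imp = permutation_importance(rf, X, y, n_repeats=20, random_state=0)
top = sorted(zip(features, imp.importances_mean), key=lambda t: -t[1])[:3]
print(f"CV error {cv_error:.2f}; top eplets: {[f for f, _ in top]}")
```

Ranking features by the accuracy drop under permutation is what makes the "most important eplet mismatch" lists in the Results possible even when overall discrimination is modest.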
HLA EPLET MISMATCH AND DONOR SPECIFIC ANTIBODIES IN KIDNEY TRANSPLANTATION
WAN Susan1,2, ANGEL DE WILDE Sian2, ROSALES Brenda3, CHADBAN Steven1,4, and WYBURN Kate1,2
1Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 2Sydney Medical School, University of Sydney, 3School of Public Health, University of Sydney, 4University of Sydney
Background: Eplet mismatch provides higher-resolution information than conventional HLA matching and has the potential to better predict alloimmune events, including donor specific antibody (DSA) development. However, limited prospective data exist on the association between eplet mismatch and DSA.
Aim: To determine the relationship between eplet mismatch and DSA.
Methods: We characterised the number of HLA-A, -B, -C, -DR and -DQ eplet mismatches in kidney transplant recipients from 2010-2017 using HLAMatchmaker. Molecular HLA typing was converted from low resolution (2-digit) to high resolution (4-digit) using the HLAMatchmaker Converter where necessary. All patients were prospectively screened for DSA at 0, 3 and 12 months post-transplant. Associations between eplet mismatches, pretransplant DSA (preDSA), de novo DSA (dnDSA) and clinical outcomes were assessed using multivariable analysis.
Results: Of 313 recipients, high-resolution HLA conversion was not possible for 147 (47%) due to the absence of ethnicity (n=83) or haplotype (n=64) data in the Converter database. Eplet mismatch determination was therefore possible for 166 donor-recipient pairs, for 150 of whom DSA screening was complete. The mean numbers of Class I and II eplet mismatches were 14 (±7.7) and 17 (±11.8) respectively (Table 1). DSA were detected in 111 recipients (74%): 64 (43%) had preDSA, 30 (20%) had dnDSA, and 17 (11%) had both. The number of eplet mismatches was associated with preDSA (OR 1.04; 95% CI 1.01-1.07; P=0.007), but not with dnDSA or acute rejection.
Conclusion: Calculated eplet mismatches were not predictive of dnDSA development or acute rejection, raising doubt about the utility of HLAMatchmaker-based eplet matching for predicting post-transplant alloimmune events.
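The per-mismatch odds ratio reported above (OR 1.04 for preDSA) is the exponentiated coefficient of a logistic model. A hedged sketch on simulated data, with an invented effect size and cohort, shows the computation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 166
# Simulated total eplet mismatch counts per donor-recipient pair
mismatches = rng.poisson(31, n)
# Simulate preDSA with a true log-odds of 0.04 per additional mismatch
logit = -1.5 + 0.04 * mismatches
pre_dsa = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Large C ~ effectively unpenalised logistic regression;
# exp(coefficient) is the odds ratio per additional eplet mismatch
model = LogisticRegression(C=1e6).fit(mismatches.reshape(-1, 1), pre_dsa)
odds_ratio = float(np.exp(model.coef_[0, 0]))
print(f"Estimated OR per eplet mismatch: {odds_ratio:.3f}")
```

An OR close to 1 per single mismatch can still be clinically meaningful across a mismatch load spanning tens of eplets, since the odds scale multiplicatively with the count.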
DONOR SPECIFIC ANTIBODIES AND CLINICAL OUTCOMES IN KIDNEY TRANSPLANT RECIPIENTS
WAN Susan1,2, CHADBAN Steven1,2, and WYBURN Kate1,3
1Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 2Sydney Medical School, University of Sydney, 3University of Sydney
Background: Donor specific antibodies (DSA) are implicated in acute rejection (AR) and graft dysfunction in kidney transplant recipients (KTx). However, limited data exists on their natural history post-transplantation.
Aims: To describe the natural history of pre-transplant DSA (preDSA) and de novo DSA (dnDSA) in KTx.
Methods: We performed a prospective single-centre cohort study in KTx. Patients were screened for DSA at 0, 3 and 12-months post-transplant, and associations between DSA and outcomes were assessed.
Results: 363 KTx between 2010 and 2017 underwent pre- and post-transplant DSA screening. 136 (37%) had preDSA at transplantation; 86 (63%) had Class I and 89 (65%) had Class II. The median MFI of the dominant preDSA was 1230 (IQR 746-2528) at transplantation and declined rapidly, becoming undetectable by 1 month post-transplant. dnDSA were detected in 62 (17%) recipients; 28 (45%) had Class I and 45 (73%) had Class II. The median time to first detection of dnDSA was 58 days (IQR 15-267) and the median MFI of the dominant dnDSA was 1150 (IQR 707-2694) at first detection, increasing over time to a median of 14,011 (IQR 931-20,626) at 2 years (Figure 1). 58 (26%) of 220 patients with ≥2 years of follow-up developed AR; 42 (19%) had cell-mediated rejection and 20 (9%) had antibody-mediated rejection. The development of dnDSA was strongly associated with AR (OR 4.48; 95% CI 2.14-9.36; P<0.001) but not with eGFR or graft survival.
Conclusion: PreDSA were present in 37% of KTx and were significantly reduced by 1-month post-transplant. dnDSA were detected in 17% and increased in intensity over time. The development of dnDSA was strongly associated with AR.
RISK STRATIFICATION FOR REJECTION BY EPLET MISMATCH AFTER EARLY MYCOPHENOLATE DOSE REDUCTION IN KIDNEY TRANSPLANT RECIPIENTS
COUGHLAN Timothy1, CANTWELL Linda2, and LEE Darren1
1Department of Renal Medicine, Eastern Health, 2Victorian Transplantation and Immunogenetics Service, Australian Red Cross Blood Service
Aims: Eplet mismatch (EpMM) is associated with de novo donor-specific antibodies and long-term graft loss in kidney transplant recipients (KTR), especially with poor adherence and low tacrolimus levels. We investigated whether EpMM predicts rejection after mycophenolate dose reduction within the first year.
Methods: Data on KTR receiving tacrolimus, mycophenolate and prednisolone in a single centre (February 2011 – January 2017) was retrospectively analysed, excluding those with rejection within the first month. We explored the association of conventional HLA mismatches, EpMM (HLAMatchmaker, antibody verified and unverified) and mycophenolate dosing with acute rejection in those with early mycophenolate reduction (<1.5g/d within the first year).
Results: Of the 63 eligible patients (median follow-up: 3.1 years, 12 months minimum), 44 had early mycophenolate reduction, mostly due to cytopaenia (68%). There was no difference in rejection rates with or without early dose reduction (27% vs 28%). Within the dose-reduced cohort, there was no significant difference in conventional HLA mismatches (4.3±1.8 vs 3.3±1.9, p=0.12), or total (51±27 vs 55±35, p=0.73), class I (17±8 vs 15±8, p=0.35), class II (34±22 vs 30±29, p=0.51) or HLA-DQ (16±15 vs 21±18, p=0.74) EpMM loads between rejectors and non-rejectors. No difference in rejection rates was observed between those with >17 (n=22) vs ≤17 (n=22) HLA-DQ EpMM (32% vs 23%, p=0.73). There was also no significant difference in the duration of mycophenolate dosing <1.5g/d and <1.0g/d, or the nadir dose between rejectors and non-rejectors.
Conclusions: EpMM was not associated with acute rejection in KTR with early mycophenolate dose reduction within the first year.
IN VIVO DEPLETION OF REACTIVE DONOR HUMAN CELLS REDUCES THE DEVELOPMENT OF GRAFT-VERSUS-HOST DISEASE IN A HUMANISED MOUSE MODEL
ADHIKARY Sam, GERAGHTY Nicholas, SLUYTER Ronald, and WATSON Debbie
Illawarra Health and Medical Research Institute, University of Wollongong
Graft-versus-host disease (GVHD) is a common, life-threatening consequence of allogeneic donor bone marrow transplantation. Reactive donor cells are the main effectors of GVHD, and depletion of these cells reduces GVHD severity in allogeneic mouse models, but data in humanised mouse models are limited.
Aim: This study aimed to investigate the effect of depleting reactive donor human cells on the development of GVHD in a humanised mouse model.
Methods: NOD-SCID-IL2Rγnull (NSG) mice were injected (i.p.) with 20×10^6 human (h) peripheral blood mononuclear cells (PBMCs) to induce GVHD, and subsequently injected with post-transplant cyclophosphamide (PTCy) (33 mg/kg) or saline on days 3 and 4 post-hPBMC injection. Mice were monitored for GVHD development for 10 weeks, with human cell engraftment examined by flow cytometry at 3 weeks post-hPBMC injection and at end-point.
Results: PTCy did not affect the engraftment of human cells in NSG mice at 3 weeks post-hPBMC injection. PTCy reduced the development of GVHD in humanised mice, significantly reducing weight loss (P=0.0447) and GVHD clinical score (P=0.0478), and increasing survival (MST=52 days) compared to saline-injected mice (MST=28 days) (P=0.0004). Additionally, PTCy significantly increased the proportion of hCD4+ T cells, and significantly lowered the proportions of hCD8+ T cells and hCD4+hCD25+hCD127lo regulatory T cells. Finally, PTCy did not affect relative expression of pro-inflammatory cytokines, including hIFN-γ and hIL-17, in the liver, spleen or small intestine of humanised mice.
Conclusion: Depletion of reactive human cells reduces GVHD development in this humanised mouse model, supporting its use in future studies investigating depletion strategies against GVHD.
VALIDATION OF THE ONE LAMBDA FLOWDSA™ ASSAY FOR LIVING DONOR TRANSPLANT WORKUP
BAZELY Scott, TASSONE Gabriella, D'ORSOGNA Lloyd, MARTINEZ Patricia and DE SANTIS Dianne
Clinical Immunology, PathWest, Fiona Stanley Hospital, Perth
Introduction: Routine flow cytometric crossmatches (FCXM) detect donor-specific HLA IgG alloantibodies in transplant recipients. One constraint of the FCXM is false positives that do not reflect patient transplant outcomes, caused by non-HLA-mediated reactivity from autoantibodies, non-HLA antibodies and immune complexes, and by other interfering factors including treatments used in desensitisation protocols (e.g. rituximab). The new One Lambda FlowDSA™ assay specifically labels recipient IgG alloantibodies bound to the donor cell surface, thereby distinguishing alloantibodies from autoantibodies. In this kit, HLA molecules are separated into three groups: Class I, Class IIa (DQ) and Class IIb (DR, DP). The aim of this validation was to determine whether the assay could overcome the limitations of current flow crossmatch assays. Validation of the FlowDSA™ assay required compensation to correct for spectral overlap between fluorochromes, and the establishment of cut-offs to define the crossmatch interpretation.
Methods: FlowDSA™ compensation was performed using a bead-only control and an HLA-positive control to correct PE and PerCP spectral overlap and to separate the Class I, Class IIa and Class IIb populations. Sera with known donor-specific antibodies (DSA) and negative sera were evaluated using the FlowDSA™ assay, and the results were compared with those obtained by current methods.
Results: All three HLA molecule groups were distinguished as separate populations. The FlowDSA™ assay reported a positive crossmatch in the presence of strong DSA and a negative crossmatch in the absence of DSA.
Conclusions: The FlowDSA™ assay appears to be an alternative to current flow crossmatch methods. Suitable positive and negative crossmatch cut-offs, the ability to detect weak DSA, and the rate of false positives due to interference from non-HLA factors, including rituximab, will be important in determining its suitability for routine clinical use.
IN VITRO SCREENING OF GENES ASSOCIATED WITH KIDNEY FIBROSIS
MA Xiaoqian1,2, SUN Lei1, LU CAO1,2, YI Shounan1, and O'CONNELL Philip1
1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Institute for Cell Transplantation and Gene Therapy, The Third Xiangya Hospital of Central South University
Aims: Chronic injury in kidney transplants remains a major cause of allograft loss. Our GoCAR multicentre study identified a set of 13 genes that were independently predictive of the development of fibrosis at 1 year after kidney transplantation, with predictive capacity superior to clinical indicators. The aim of this study was to identify which of these genes are associated with the pathogenesis of kidney fibrosis.
Methods: The murine C1.1 tubular epithelial cell line and the FOXO−/− C1.1 and TCF−/− C1.1 cell lines were treated with or without TGF-β for 48 h. Cells were then harvested for real-time PCR to detect the expression of the 13 genes.
Results: Expression of four genes changed markedly. FJX1 and KLHL13 were expressed at low levels in C1.1 and FOXO−/− cells but were upregulated by TGF-β treatment; in FOXO−/− cells in particular, TGF-β induced more than 10-fold higher expression, suggesting that FJX1 and KLHL13 may play an important role in the profibrotic effect. CHCHD10 showed the opposite pattern to FJX1 and KLHL13 in FOXO−/− and TCF−/− cells: with TGF-β treatment it was downregulated in FOXO−/− but upregulated in TCF−/− cells. ASB15 expression was almost undetectable in C1.1 and FOXO−/− cells with or without TGF-β, but was more than 100-fold higher in TCF−/− cells.
Conclusions: These results suggest that the four genes may be involved in fibrosis signalling pathways; we will further confirm their function using the CRISPR/Cas9 technique.
IMMUNE PHENOTYPE BY FLOW CYTOMETRY OF PEDIATRIC KIDNEY TRANSPLANT RECIPIENTS AND HEALTHY ADULT CONTROLS
JIMENEZ-VERA Elvira1, ZHAO YUANFEI1, HU Min1, CHEW Yi Vee1, BURNS Heather1, ANDERSON Patricia2, WILLIAMS Lindy1, DERVISH Suat3, WANG Xin Maggie3, YI Shounan1, HAWTHORNE Wayne1, ALEXANDER Stephen4, and O'CONNELL Philip1
1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 3Flow Cytometry Core Facility, Westmead Millennium Institute, Westmead Hospital, Sydney, 4Centre for Kidney Research, The Children's Hospital at Westmead, Sydney
Aim: To determine the differences in the immunophenotype of pediatric kidney transplant recipients and healthy adult controls.
Methods: Seven leukocyte-profiling panels, each containing 8-10 markers, were used to monitor the immune profiles of 9 paediatric kidney transplant recipients and 8 adult control samples. Whole-blood samples (1.5 mL) were stained and acquired on a BD LSRFortessa, and FlowJo was used for data analysis.
Results: Differences in subpopulations between paediatric patients and healthy adult controls are shown in Table 1. Paediatric patients showed a significant increase in absolute numbers of granulocytes, CD14+ monocytes, double-negative NKT cells and CD56hiCD16+ intermediate NK cells. The B cell panel showed significant increases in naïve B cells, IgD+IgM+ B cells and IgM+CD27- B cells. Naïve CD4+ and naïve CD8+ T cells and naïve Tregs were also higher than in adult controls, and naïve Foxp3 Tregs in paediatric transplant patients had higher CD25 expression. Memory CD4+ T cells in paediatric patients had similar HLA-DR expression. We further found significant decreases in the following cell populations in our paediatric kidney transplant patients: non-classical monocytes, CD8+ NKT cells, memory B cells, IgD-IgM- B cells, CD27+CD38low class-switched memory cells, CD27-CD28+, CD25+ of CD4+, CD57+CD27-CD28+ and CD25+ of CD8+, CCR7-/CD62L-CD45RA- of CD4 and CD8, CXCR3+CD45RO+ of CD4 and CD8, and CD127+CD45RO+ and CD25+CD45RO+ on CD4 T cells (effector Foxp3 Tregs).
Conclusion: Immune profiling of paediatric transplant recipients demonstrated more naïve T cells, B cells and Tregs, and fewer memory and effector memory T cells, compared with healthy adult controls.
COMPARISON OF 3 LYMPHOCYTE SEPARATION METHODS FOR FLOW CROSSMATCH ASSAY
TASSONE Gabriella, BAZLEY Scott, D'ORSOGNA Lloyd, MARTINEZ Patricia and DE SANTIS Dianne
Clinical Immunology, Fiona Stanley Hospital, PathWest
Introduction: The Stem Cell EasySep Direct Total Human Lymphocyte Isolation kit™ (DTHLI) uses immunomagnetic bead technology to bind non-lymphocytes within the sample. The beads are then removed with a magnet, leaving the lymphocytes in the supernatant. The Stem Cell SepMate gradient centrifugation tubes™ (SepMate) use a plastic insert to keep the Ficoll at the base of the tube during centrifugation, allowing the lymphocyte layer to be poured off.
Method: The SepMate, DTHLI, and the current routine Ficoll gradient isolation methods were compared. The isolation time, cell yield and suitability for the routine flow crossmatch assay (FCXM) were assessed.
Results: The SepMate and DTHLI methods were more rapid, at 30 and 45 minutes respectively, compared with 3 hours for the current method. The cell yields obtained by the SepMate and the current method were sufficient to perform the FCXM on both untreated and pronase-treated serum, while the DTHLI provided sufficient cells to perform the FCXM only on the pronase-treated serum. Despite the lower cell yield, the purity of CD3+ and CD19+ cells was superior with the DTHLI method compared with the other methods. The lymphocyte preparations were then evaluated in the FCXM using a serum known to contain donor-specific antibody (DSA) to donor mismatches and a negative serum.
Conclusion: Both SepMate and DTHLI were more rapid than the current method. The cell yield of SepMate was comparable to that of the current method; however, the cell yield from the DTHLI was lower than both. Nevertheless, the DTHLI isolated a greater proportion of CD3+ and CD19+ cells, so the total number of lymphocytes required to perform the FCXM may be less than currently required. All three methods produced comparable flow crossmatch results.
ENHANCED RECOVERY AFTER SURGERY AND THE RENAL TRANSPLANT RECIPIENT – USEFUL OR A WASTE OF TIME?
LAMBERT Virginia, CHANDRA Abhilash, RUSSELL Christine, OLAKKENGIL Santosh and BHATTACHARJYA Shantanu
Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital
Enhanced Recovery After Surgery (ERAS) pathways are an accepted part of modern surgical practice. In renal transplant recipients, however, there is no clear consensus regarding their utility.
The renal transplant unit at the Royal Adelaide Hospital introduced an ERAS protocol for the perioperative management of transplant recipients from June 2017. We present the outcomes of 39 consecutive cases.
Patients and Methods: All patients who had a renal transplant from introduction of the protocol until the time of writing were enrolled in the ERAS protocol. The protocol included pre-operative weight optimization on dialysis; perioperative carbohydrate loading; goal-directed fluid management prior to reperfusion; fluid balance aiming for a net weight gain ≤ 3 kg in the first 24 hours; opiate avoidance; and use of regional wound infusers.
There were nine live donor recipients, 23 recipients of grafts from donation after brainstem death (DBD) and seven from donation after circulatory death (DCD).
Results: The mean weight gain on post-operative day 1 was 3.06 kg.
None of the live donor recipients had delayed graft function (DGF) requiring dialysis. DGF was observed in 28.6% of the DCD graft recipients and 39.1% of the DBD graft recipients.
Mean length of stay was 4.6 days.
Conclusion: Our experience challenges the widespread practice of fluid loading post renal transplant. Our re-admission rate has not increased and early results suggest that there are significant gains to be made via reduced length of hospital stay.
TRANSITION FROM LAPAROSCOPY TO RETROPERITONEOSCOPY FOR LIVE DONOR NEPHRECTOMY - A CASE CONTROL STUDY
NG Zi Qin1, REA Alethea2, and HE Bulang1,3
1WA Liver & Kidney Transplant Service, Sir Charles Gairdner Hospital, Perth, 2Centre for Applied Statistics, University of Western Australia, Perth, 3Department of Surgery, University of Western Australia, Perth
Aims: The transperitoneal approach (TLDN) to laparoscopic donor nephrectomy (LDN) is widely adopted in most centres. However, a systematic review has shown that the retroperitoneoscopic approach (RLDN) is associated with fewer complications owing to the anatomical advantage of avoiding manipulation of the intraperitoneal organs. The aims of this study were to compare the outcomes of RLDN and TLDN in a case-control study and to analyze the learning curve for the transition from TLDN to RLDN.
Methods: A retrospective analysis of all LDNs from 2010 to Oct 2017 was performed. Data on demographics, peri-operative parameters, analgesia consumption, pain scores and kidney graft function were collected and analyzed. A CUSUM analysis was performed to explore the learning curve of RLDN, setting the mean operative time of TLDN as the target.
Results: All 122 donor nephrectomies (60 TLDN and 62 RLDN) were completed successfully with no conversion to open surgery. There were no blood transfusions, readmissions or deaths, and no post-operative complications graded above Clavien-Dindo grade II. Kidney graft function was comparable in both groups. The follow-up period ranged from 4 to 90 months. The CUSUM analysis demonstrated that approximately 30 cases are required for a surgeon to become proficient in the transition from TLDN to RLDN.
Conclusions: RLDN is a safe approach with comparable results to TLDN. It avoids manipulating the intraperitoneal organs and retains a virgin abdomen and hence reduces peri-operative complication risk. The learning curve of transitioning from TLDN to RLDN is acceptable.
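The CUSUM learning-curve analysis described above can be sketched as a running sum of deviations of each case's operative time from the TLDN target; a minimal illustration follows. All case times and the 180-minute target below are hypothetical, not the study's data.

```python
# CUSUM learning-curve sketch: cumulative deviation of operative time from a target.
# Hypothetical data only; the study's real operative times are not published here.

def cusum(times, target):
    """Return the cumulative sum of (time - target) for successive cases (minutes)."""
    total, curve = 0.0, []
    for t in times:
        total += t - target
        curve.append(total)
    return curve

# Hypothetical RLDN case times; target assumed to be the mean TLDN operative time.
rldn_times = [230, 220, 215, 200, 195, 185, 180, 175, 170, 165]
curve = cusum(rldn_times, target=180.0)

# Proficiency is suggested where the curve peaks and turns downward,
# i.e. where cases start taking less time than the target.
peak_case = curve.index(max(curve)) + 1
print(peak_case)
```

In the study, the same idea applied to 62 consecutive RLDN cases placed the turning point at approximately case 30.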
FAVOURABLE CARDIAC REMODELING AND FUNCTIONAL CARDIAC BENEFITS ASSESSED WITH CARDIAC MAGNETIC RESONANCE IMAGING FOLLOWING LIGATION OF ARTERIOVENOUS FISTULA IN STABLE RENAL TRANSPLANT RECIPIENTS: A RANDOMIZED, CONTROLLED, OPEN LABEL STUDY
RAO Nitesh1,2, MCDONALD Stephen3, WORTHLEY Matthew4, and COATES Patrick Toby5
1Nephrology and Renal Transplant, Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 2Lyell McEwin Hospital, 3Department of Nephrology, Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 4Department of Cardiology, Royal Adelaide Hospital, 5Centre for Transplant and Renal Research, Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital
Aim: To study the change in left ventricular mass (LVM) following ligation of arteriovenous fistula (AVF) in stable renal transplant recipients (RTR), utilizing cardiac MRI (CMR).
Methods: In this randomized controlled trial, we recruited participants aged 18 years or older with stable kidney function twelve months post kidney transplantation and a functioning AVF, from a tertiary renal transplantation service network in Australia. Participants were randomly assigned (1:1) to AVF ligation or no intervention, with all participants undergoing a baseline CMR followed by a repeat scan six months later. The primary outcome was change in LVM at 6 months, analyzed according to intention-to-treat principles.
Results: We enrolled 93 participants, of whom 63 were eligible and underwent randomization; 54 of the 63 completed assessments after the second CMR. The mean LVM decreased by 22 g (14.7%) in the intervention group (151.2 ± 36.5 g vs. 129.1 ± 32.4 g, p < 0.001) but not in the non-intervention group (153.4 ± 47.8 g vs. 154.6 ± 43.0 g, p = 0.69). Significant improvements were also noted in end-diastolic, end-systolic and stroke volumes of both left and right ventricles, and in atrial volumes. No significant complications were noted after AVF ligation.
Conclusion: In this randomized controlled trial of adults with stable kidney transplants and a functioning AVF, elective ligation of the AVF was associated with a 14.7% decrease in LVM as assessed by CMR, together with improvements in other cardiac parameters.
PROPHYLACTIC DRAIN INSERTION IN RENAL TRANSPLANTATION: SURGEON PREFERENCE ACROSS AUSTRALIA AND NEW ZEALAND
MUGINO Miho1, LAM Susanna1, YUEN Lawrence2, VERRAN Deborah1, ALLEN Richard3, PLEASS Henry3, and LAURENCE Jerome1
1General, Visceral and Transplant Surgery, Royal Prince Alfred Hospital, Sydney, 2General, Visceral and Transplant Surgery, Westmead Hospital, Sydney, 3General, Visceral and Transplant Surgery, University of Sydney
Aims: There are no guidelines concerning the use of a prophylactic drain at the conclusion of renal transplantation (RT) to prevent post-operative complications such as lymphoceles. We aimed to summarise practice amongst renal transplant surgeons across Australia and New Zealand (ANZ).
Methods: An online survey of surgeons who routinely perform RT across ANZ transplant centres collected respondents’ demographic information, surgical experience, preferences regarding prophylactic drain insertion and post-operative practice.
Results: 43 out of 66 identified surgeons completed the survey. 41.9% were general surgeons with subspecialisation in transplantation (18.6%) and hepatobiliary surgery (18.6%); 37.2% were vascular surgeons; 13.9% were urologists and 7% were transplantation and dialysis access surgeons.
60.5% of surgeons reported inserting a perigraft drain routinely, whereas 20.9% seldom insert drains. The most common reason for drain insertion (58.1%) was “routine practice”. 30.2% of respondents were uncertain about the benefit of drain use, whereas 48.8% felt that drains reduced symptomatic peritransplant fluid collections.
44.2% of respondents considered both volume and time to be important factors for drain removal, with less emphasis on fluid composition. Drains were removed at a mean of 4.56 days post-operatively. Some surgeons (16.3%) test drain fluid creatinine to exclude urine leak. 74.4% of surgeons would consider enrolling their patients in an RCT to determine the benefit of drain insertion.
Conclusion: There is a wide range of practices amongst RT surgeons. Individual surgeons’ experience appears to be the greatest factor in decision making.
AORTIC VERSUS DUAL PERFUSION FOR RETRIEVAL OF THE DBD LIVER – AN ANALYSIS OF RECIPIENT OUTCOMES USING THE ANZ LIVER TRANSPLANT REGISTRY
HAMEED Ahmer1,2, PANG Tony2,3, YOON Peter2, BALDERSON Glenda4, DE ROO Ronald2, YUEN Lawrence2,3, LAURENCE Jerome5,3, LAM Vincent2,3, CRAWFORD Michael6, HAWTHORNE Wayne1,2, and PLEASS Henry2,3
1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Department of Surgery, Westmead Hospital, Sydney, 3School of Medicine, University of Sydney, 4Australia and New Zealand Liver Transplant Registry, Princess Alexandra Hospital, Brisbane, 5Institute for Academic Surgery, Royal Prince Alfred Hospital, Sydney, 6Department of Surgery, Royal Prince Alfred Hospital, Sydney
Aims: To compare the impact of aortic-only and dual (aorta and portal vein) perfusion on donation after brain death (DBD) liver transplantation outcomes.
Methods: DBD liver transplants performed in Australia (2007–16) were included in the analyses and stratified by aortic-only or dual perfusion. The ANZLTR, ANZOD and a national survey of senior donor surgeons were used to obtain all data points. Only livers preserved in University of Wisconsin solution were included; patients receiving a subsequent liver transplant or a reduced-size graft were excluded. Graft and patient survival were compared using Kaplan-Meier curves and Cox proportional hazards models. Causes of graft loss, including primary non-function, hepatic artery and portal vein thrombosis, biliary complications and acute rejection, were compared using logistic regression.
Results: Aortic-only perfusion was utilized in 957 cases, compared with 425 dual-perfused livers. The dual-perfused group had lower mean cold ischaemia time, secondary warm ischaemic time and MELD score (p < 0.001). Actuarial 5-year graft and patient survival in the aortic-only vs dual-perfused cohorts was 80.1% vs 84.6% (p = 0.066) and 82.6% vs 87.8% (p = 0.026), respectively. After adjusting for confounders, neither graft (HR 0.81, 95% CI 0.60–1.11, p = 0.188) nor patient (HR 0.74, 95% CI 0.52–1.05, p = 0.087) survival differed significantly between the cohorts. There were no differences between the groups with respect to causes of graft loss. Subgroup analyses are being conducted to compare high-risk donors.
Conclusions: The retrieval technique employed does not impact outcomes when all DBD donors are considered together.
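The actuarial survival figures above come from the Kaplan-Meier product-limit estimator, which steps the survival probability down at each event time by the fraction of at-risk grafts lost. A minimal sketch follows; the follow-up times and events below are hypothetical, not the registry data.

```python
# Kaplan-Meier product-limit estimator (minimal sketch of the standard formula;
# the event times below are hypothetical, not the ANZLTR data).

def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = graft loss, 0 = censored.
    Returns (time, survival) pairs at each time where a loss occurred."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv, curve = 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        losses = ties = 0
        # Group all subjects sharing this follow-up time.
        while i < len(order) and times[order[i]] == t:
            losses += events[order[i]]
            ties += 1
            i += 1
        if losses:
            surv *= 1 - losses / at_risk   # step down by the at-risk loss fraction
            curve.append((t, surv))
        at_risk -= ties                    # losses and censorings leave the risk set
    return curve

times = [3, 6, 6, 12, 24, 36, 60, 60]
events = [1, 1, 0, 1, 0, 1, 0, 0]
km = kaplan_meier(times, events)
print(km)
```

Censored subjects (events = 0) leave the risk set without stepping the curve down, which is what distinguishes this estimator from a naive survival fraction.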
USE OF AN ICE BAG TO MINIMIZE THE PERIOD OF SECOND WARM ISCHAEMIC TIME DURING KIDNEY & PANCREAS TRANSPLANTATION – OUR INITIAL EXPERIENCE
YOON Peter1, HAMEED Ahmer1,2, NGUYEN Hien1,3, GASPI Renan4, HAWTHORNE Wayne5, PLEASS Henry1, and YUEN Lawrence1
1Department of Surgery, Westmead Hospital, Sydney, 2Centre for Transplant and Renal Research, The Westmead Institute for Medical Research, 3Department of Urology and Kidney Transplant, Cho Ray Hospital, Vietnam, 4Department of Renal Medicine, Westmead Hospital, Sydney, 5Centre for Transplant and Renal Research
Aims: To assess the safety and feasibility of performing anastomoses for kidney and/or pancreas transplantation after organ immersion in a bag of ice slush.
Methods: Kidneys alone (n = 4) and/or the kidney & pancreas (n = 1) were retrieved from deceased donors and transported to our center using standard cold static storage. After back-table preparation of the graft, each organ was immersed in ice slush within a sterile bowel bag. The bag was sealed superiorly, and a small perforation was made to allow vessel extrusion. During anastomoses, ice slush was replenished as required; anastomoses were performed in a standard manner to the iliac vessels. The bag was removed prior to reperfusion. (Representative video will be shown during presentation).
Results: All transplants were completed safely, without any visual obstruction during anastomoses. Whilst mean anastomotic time for kidneys and pancreas was 49 ± 8 mins and 32 mins, respectively, the second warm ischaemic time for all organs was <1 minute. There were two cases of delayed graft function, both in DCD kidneys (KDPI 98 & 38). One-month creatinine in these recipients was 214 and 134 μmol/L, respectively. Both DBD kidney recipients (KDPI 69 & 74) had immediate graft function. The kidney/pancreas recipient also had immediate graft function, a one-month creatinine of 60 μmol/L, and was off all insulin.
Conclusions: Kidney/pancreas placement in an ice bag is a convenient, simple, and non-obstructive means of minimizing the secondary warm ischaemic insult. A planned RCT will formally test its efficacy.
EVALUATION OF RISK FACTORS FOR ENTERIC LEAKS FOLLOWING SIMULTANEOUS PANCREAS AND KIDNEY TRANSPLANTATION
HORT Amy1, SHAHRESTANI Sara1, HITOS Kerry1, ROBERTSON Paul1, LAM Vincent1, YUEN Lawrence1, RYAN Brendan1, DE ROO Ronald2, HAWTHORNE Wayne J2,3,4, and PLEASS Henry1
1Westmead Hospital, Sydney, 2Department of Surgery, Westmead Hospital, Sydney, 3Discipline of Surgery, Sydney Medical School, University of Sydney, 4Centre for Transplant and Renal Research, Westmead Institute of Medical Research, Westmead Hospital, Sydney
Introduction: Simultaneous pancreas-kidney transplantation (SPK) is the gold standard treatment for patients with Type 1 Diabetes Mellitus and end-stage renal failure. Enteric drainage is utilised to handle the exocrine secretions; however, enteric leaks (ELs) are one of its more specific and challenging complications. There remains a lack of published research on risk factors for ELs, particularly their association with vascular disease.
Methods: SPK transplants performed at Westmead Hospital over ten years (2008–2017; n = 234) were analysed to identify ELs. Donor, recipient and transplantation procedure risk factors for ELs were collected and analysed. A multivariable logistic regression model, adjusting for possible confounders, was used to assess the risk and predictors of ELs.
Results and Discussion: Of the 234 patients, 12 (5%) experienced an EL. Of these recipients, 9 (75%) had vascular disease, 6 (50%) were ex-smokers, 1 (8%) was a current smoker and 3 (25%) were obese (BMI > 30 kg/m2). The risk of EL increased by as much as 4.4-fold in recipients with vascular disease (OR 4.4; 95% CI 0.80–24.21; P = 0.088). Recipient BMI > 24.2 kg/m2 increased the risk of EL by as much as 1.8-fold (OR 1.8; 95% CI 0.4–9.3; P = 0.46).
Conclusions: We identified a possible association between vascular disease and ELs. These findings also identify other possible risk factors for ELs and highlight the need for further research in this area, including careful screening of recipients for vascular disease.
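The odds ratios above come from a standard 2x2 exposure/outcome calculation (refined here by logistic regression with confounder adjustment). A minimal unadjusted sketch follows. Only the 9-of-12 and 3-of-12 EL counts come from the abstract; the non-EL column counts are hypothetical, so the result below is illustrative and will not reproduce the adjusted OR of 4.4.

```python
# Unadjusted odds ratio for a 2x2 exposure/outcome table.
# Hypothetical non-EL counts; only the EL row comes from the abstract.

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table: a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    return (a * d) / (b * c)

a, c = 9, 3       # ELs with / without vascular disease (from the abstract)
b, d = 100, 122   # non-EL recipients with / without vascular disease (hypothetical)

print(round(odds_ratio(a, b, c, d), 1))
```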
GENETICALLY MODIFIED PORCINE NEONATAL ISLET XENOGRAFTS PROVIDE LONG-TERM FUNCTION IN BABOONS
HAWTHORNE Wayne1,2, CHEW Yi Vee3, BURNS Heather3, SALVARIS Evelyn4, HAWKES Joanne3, BRADY Jamie5, BARLOW Helen4, YI Shounan3, HU Min3, LEW Andrew5, O'CONNELL Philip3,6, NOTTLE Mark7, and COWAN Peter4,8
1Discipline of Surgery, Sydney Medical School, University of Sydney, 2Centre for Transplant and Renal Research, The Westmead Institute of Medical Research, 3Centre for Transplant and Renal Research, The Westmead Institute for Medical Research, 4Immunology Research Centre, St Vincent's Hospital, Melbourne, 5Walter and Eliza Hall Institute of Medical Research, Melbourne, 6Discipline of Medicine, University of Sydney, 7Department of Obstetrics and Gynaecology, University of Adelaide, 8Department of Medicine, University of Melbourne
Introduction: Alternative strategies such as xenotransplantation show great promise for providing the organs and tissues required to treat diseases such as Type 1 Diabetes.
Aims: To achieve long-term normoglycemia in diabetic baboons transplanted with neonatal pig islets, and to investigate the effect of ceasing immunosuppression.
Materials and Methods: Five diabetic baboons received transplants of neonatal islet cell clusters (NICC; 10,000–50,000 IEQ/kg) from GTKO/CD55-CD59-HT piglets. From day −3, recipients were treated with anti-CD2 induction and maintained on oral tacrolimus, anti-CD154 and belatacept, which were progressively ceased. Graft survival and function were followed by daily blood sugar levels (BSL), IVGTT, OGTT and immunohistochemical analysis of liver biopsies taken at various time points.
Results: No baboon exhibited signs of thrombosis associated with IBMIR. Recipients developed normal fasting BSL and normal IVGTT and OGTT results, with porcine insulin and C-peptide secreted in response to glucose stimulus. All animals became normoglycaemic off all exogenous insulin. Liver biopsies revealed strong positive staining for insulin, glucagon and somatostatin in the xenografts. One recipient receiving 50,000 IEQ/kg was insulin-independent for >7 months, including 7 weeks after the last drug (belatacept) was ceased. A second recipient receiving 10,000 IEQ/kg remained insulin-independent for >18 months, including 6 months off all immunosuppression. The fourth and fifth animals continue to be followed at 6 and 4 months post-transplant.
Conclusion: We have demonstrated for the first time long-term survival and function of porcine islets in baboons. The costimulation blockade-based immunosuppression permitted maturation of the islets such that the dose required to achieve normoglycaemia (10,000 IEQ/kg) is equivalent to that used in the clinical setting.
HUMAN HLA-DR+CD27+ MEMORY-TYPE REGULATORY T CELLS SHOW POTENT XENOANTIGEN-SPECIFIC SUPPRESSION IN VITRO
CAO Lu1,2, HU Min1, HUANG Dandan1, MA Xiaoqian1,2, SUN Lei1, JIMENEZ-VERA Elvira3, BURNS Heather1, ZHAO Yuanfei1, HAWTHORNE Wayne1, YI Shounan1, and O'CONNELL Philip1
1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Institute for Cell Transplantation and Gene Therapy, The 3rd Xiangya Hospital, Central South University, 3Centre for Transplant and Renal Research, Westmead Hospital, Sydney
Introduction: Strategies for immunomodulation of the xenograft rejection response whilst minimizing long-term immunosuppression need to be developed. We have previously shown that xenoantigen stimulation enhances the capacity of human Treg to suppress the xenogeneic response. However, whether xenoantigen-expanded Treg express specific cell surface markers that could be used to isolate a xenoantigen-specific Treg subset for effective Treg therapy remains to be determined.
Materials and Methods: Human CD4+CD25+CD127− Treg isolated from healthy donor PBMC were expanded for 3 weeks with anti-CD3/CD28 beads alone or combined with irradiated porcine PBMC, as polyclonally stimulated (PlTreg) or xenoantigen-stimulated (XnTreg) Treg, respectively. FACS was performed to identify candidate cell surface markers, and the corresponding xenoantigen-specific Treg subset was isolated from XnTreg by cell sorting. The sorted Treg subset was assessed for suppressive capacity by mixed lymphocyte reaction (MLR), using irradiated porcine PBMC as xenogeneic stimulating cells, human PBMC as responder cells and autologous XnTreg as suppressing cells.
Results: After 3 weeks of expansion, XnTreg exhibited substantially upregulated expression of HLA-DR and CD27, with a larger proportion being HLA-DR+CD27+. The HLA-DR+CD27+ subset of XnTreg demonstrated significantly enhanced potency in suppressing proliferating xenoreactive responder cells at Treg:responder ratios of 1:4 through 1:64 when compared with HLA-DR+CD27+ cell-depleted XnTreg, and at 1:32 and 1:64 when compared with unsorted XnTreg.
Conclusion: Our data suggest that human HLA-DR+CD27+ memory-type Treg are xenoantigen-specific and have potential as an effective immunotherapy in xenotransplantation.
ENCAPSULATED PIG CELLS SECRETING ANTI-HUCD2 ANTIBODY REDUCE THE NUMBER OF HUMAN CD2 CELLS LOCALLY BUT NOT SYSTEMICALLY IN HUMANIZED MICE
LOUDOVARIS T1, COWAN P2, HAWTHORNE W3, SALVARIS E2, FISICARO N2, CATTERALL T1, KOS C4, MARIANA L1, LEW A5, and KAY T1
1Immunology & Diabetes, St Vincent's Institute, Melbourne, 2Centre for Immunology, St Vincent's Hospital, Melbourne, 3Islet Transplantation Facility, Westmead Millennium Institute, Westmead Hospital, Sydney, 4Immunology & Diabetes, St Vincent's Hospital, Melbourne, 5Department of Immunology, Walter and Eliza Hall Institute of Medical Research, Melbourne
Background: The TheraCyte™ Implantable System, with outer membranes that induce the development of vasculature, was developed to encapsulate and protect cells that secrete insulin or other proteins in which the patient is deficient. This implant system has been shown to be biocompatible and protective of allogeneic tissues in animal and human trials. However, immune protection of xenogeneic tissues (a potentially unlimited source of therapeutic tissue) has so far been poor, as the intensity of the surrounding inflammatory response suffocates the encapsulated cells.
Method: To mollify or eliminate this local response, the pig kidney cell line PK1 was genetically engineered (pCIneo_CD2_GFP+) to secrete a monoclonal antibody against human CD2, which inhibits and depletes T cells. PK1 cells transfected with vector alone (pCIneo_GFP+) were used as controls. Encapsulated PK1_pCIneo_CD2_GFP+ or PK1_pCIneo_GFP+ cells, co-encapsulated with porcine neonatal islet cell clusters (NICCs), were implanted at two sites into immunodeficient NSG mice, which were then reconstituted with human PBMCs to generate a human anti-pig response.
Results: There was a statistically significant decrease in the number of human T cells around devices containing anti-CD2-secreting cells compared with those containing control pig cells. This occurred at both sites, while the number of huCD2 cells in the spleen was similar in all mice. We could not determine whether protection was improved, as all encapsulated xeno-cells, including the NICCs, survived.
Conclusion: Although the xeno-protective properties of anti-CD2 could not be demonstrated, the local impact of secreted factors is a promising result that warrants further investigation.
© 2018 The Authors. Published by Wolters Kluwer Health, Inc.