To mitigate the widening gap between available donor livers and the number of patients awaiting a life-saving transplant, there have been ongoing efforts to maximize utilization of all available organs, including marginal grafts.1 One of the most common marginal grafts encountered in routine clinical practice is a steatotic liver, which may be present in as many as 59% of all deceased donors.2,3 In light of the ongoing obesity epidemic, in which as many as 37.7% of adults in the United States are obese (body mass index [BMI] ≥ 30), steatotic donor livers (SDLs) are likely to become even more commonplace.4 Thus, liver transplant candidates will increasingly be faced with the decision to accept an SDL or wait for a “more acceptable” offer.
A number of studies have shown that SDLs are associated with an increased risk of primary nonfunction and graft failure.5-10 For example, an analysis of national registry data showed that transplantation with an SDL was associated with a 71% increased risk of posttransplant graft loss.5 However, these comparisons assume that a candidate can choose between simultaneous offers of an SDL and a non-SDL, which does not accurately reflect the real-world scenario. Instead, the decision most commonly faced is whether a candidate should accept an SDL, or decline that offer and wait, potentially indefinitely, for another (presumably better, non-SDL) offer. While awaiting a “more acceptable” offer, candidates who decline an SDL may die or become too sick for transplant, and thus may obtain survival benefit from accepting an SDL that outweighs the associated risks. An improved understanding of the trade-offs associated with accepting or declining an SDL would better inform the real-time decision liver transplant candidates often face.
Therefore, we used national registry data to understand the trade-offs associated with declining or accepting an SDL. The goals of our study were to (1) characterize the natural history of candidates who declined an SDL and (2) compare postdecision survival after declining versus accepting an SDL, to inform decision-making for liver transplant candidates and providers.
MATERIALS AND METHODS
This study used data from the Scientific Registry of Transplant Recipients (SRTR). The SRTR data system includes data on all donors, waitlisted candidates, and transplant recipients in the United States, submitted by the members of the Organ Procurement and Transplantation Network (OPTN), and has been described elsewhere.11 The Health Resources and Services Administration (HRSA), US Department of Health and Human Services, provides oversight to the activities of the OPTN and SRTR contractors.
Definition of Macrosteatosis
To determine whether there was steatosis in a given donor liver, we used the liver biopsy macrosteatosis variable in SRTR, which reports the percent of macrosteatosis present on a pretransplant liver biopsy. No values are recorded for livers that are not biopsied. We defined an SDL as any liver with ≥30% macrosteatosis on biopsy.
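The study definition above can be illustrated with a minimal sketch; the function name and data layout are hypothetical, and only the ≥30% threshold and the unrecorded-when-unbiopsied behavior come from the text.

```python
def classify_sdl(macrosteatosis_pct):
    """Classify a donor liver under the study definition of a steatotic
    donor liver (SDL): >=30% macrosteatosis on pretransplant biopsy.

    macrosteatosis_pct is None when no biopsy was performed, mirroring
    SRTR, which records no value for unbiopsied livers.
    """
    if macrosteatosis_pct is None:
        return None  # not biopsied: steatosis status unknown
    return macrosteatosis_pct >= 30
```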
Study Population
We included all adult (age ≥ 18 y) liver transplant candidates who were offered an SDL that was eventually used for transplant between December 25, 2009, and January 6, 2015. To address potentially misclassified acceptances or declines, offers reported as accepted in the match-run data that did not have an associated transplant record in the SRTR standard analytical files were considered declines (n = 34), and offers reported as declines in the match-run data that did have an associated transplant record were considered acceptances (n = 57). We compared characteristics of those who accepted an SDL to those who declined using the χ2 test for categorical variables, Student’s t-test for normally distributed continuous variables, and the Kruskal–Wallis test for nonnormally distributed continuous variables. This study was approved by the Johns Hopkins University Institutional Review Board and complies with the ethical principles outlined by the Declaration of Helsinki and the Declaration of Istanbul.
Outcomes After Declining an SDL
To understand the natural history of declining an SDL, we determined the postdecline cumulative incidence of death, waitlist removal, remaining waitlisted, and receipt of a living donor liver transplant, transplant with a different SDL, or transplant with a non-SDL. Since any one of these outcomes generally precludes the others (ie, informative censoring), we calculated the 6-year cumulative incidence of these outcomes under a competing risks framework using the method of Fine and Gray.12
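The competing-risks idea can be sketched in code. The study used the Fine and Gray subdistribution model; the sketch below instead implements the simpler nonparametric (Aalen–Johansen-style) cumulative incidence estimator, which captures the same principle that each outcome's incidence accrues only from candidates still at risk of all outcomes. All names and data are hypothetical.

```python
def cumulative_incidence(times, events, cause):
    """Nonparametric cumulative incidence for one cause under competing
    risks. times: follow-up times; events: 0 = censored, k > 0 = event
    of cause k. Returns (time, CIF) steps for the requested cause.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0  # overall event-free survival just before each time
    cif = 0.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        tied = [e for (tt, e) in data if tt == t]  # subjects tied at t
        d_cause = sum(1 for e in tied if e == cause)
        d_any = sum(1 for e in tied if e != 0)
        cif += surv * d_cause / n_at_risk      # incidence added at t
        surv *= (1.0 - d_any / n_at_risk)      # update overall survival
        n_at_risk -= len(tied)
        steps.append((t, cif))
        i += len(tied)
    return steps
```

Because the estimator partitions risk across causes, the cause-specific incidences plus the event-free survival always sum to 1, which is why naive Kaplan–Meier curves (which treat competing events as censoring) would overstate each outcome.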
Survival Benefit of Accepting an SDL
Candidates who accepted and were transplanted with an SDL were matched to all candidates within ±2 model for end-stage liver disease (MELD) points who had declined the same liver. We used allocation MELD because it best reflects the likelihood that a candidate who declines will receive a subsequent offer, and it includes exception points for candidates whose biologic MELD does not accurately capture their likelihood of waitlist mortality or dropout. Candidates were then followed until the date of death or the date of administrative censoring (December 31, 2016), irrespective of subsequent transplant. In other words, if a candidate declined and later received a transplant (with either an SDL or a non-SDL), they were still followed from the date of decline until death or administrative censoring. To account for differences in organ quality between offers, we compared survival for candidates who accepted a given liver to survival for candidates who declined that same liver. Candidates who accepted an SDL at the beginning of the match run were excluded (n = 373), as there were no matched controls who declined on that specific match run. To determine whether these candidates had acceptable posttransplant outcomes, we performed a post hoc survival analysis on these 373 candidates.
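The matching rule within a match run can be sketched as follows; the data layout (one acceptor and a list of decliner records per liver) is hypothetical, while the ±2 allocation-MELD window comes from the study.

```python
def match_controls(acceptor_meld, decliners):
    """Match a candidate who accepted a given liver to all candidates who
    declined that same liver and whose allocation MELD is within +/-2
    points of the acceptor's.

    decliners: list of (candidate_id, allocation_meld) tuples drawn from
    the same match run, so donor quality is identical by construction.
    """
    return [cid for cid, meld in decliners
            if abs(meld - acceptor_meld) <= 2]
```

Matching within the same match run is what holds donor quality fixed; the MELD window then handles broad differences in disease severity between acceptor and controls.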
We then used Cox regression to quantify the survival benefit of accepting an SDL compared to declining and waiting for another offer, adjusting for candidate age, race, sex, and cause of end-stage liver disease. Since candidates who accepted an SDL faced an increased perioperative mortality risk that candidates who declined did not, the proportional hazards assumption was not met (ie, the relative hazard of mortality after accepting an SDL varied over time). Therefore, we divided our study period into 2 distinct periods: the first month postdecision (to account for this perioperative risk period) and beyond 1 month postdecision. In doing so, we calculated the relative hazard of mortality after accepting an SDL separately during each period. These periods were determined empirically.
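The period-specific hazard ratios above are conventionally obtained by episode splitting: each candidate's follow-up is cut at the boundary so a Cox model in counting-process form can estimate a separate coefficient per period. A minimal sketch, with 30 days standing in for "1 month" and a hypothetical record layout:

```python
def split_followup(time, event, cut=30):
    """Split one candidate's follow-up at `cut` days into counting-process
    records (start, stop, event_in_interval, period), so a Cox model can
    fit separate hazard ratios before and after the cut. An event is
    attributed only to the interval in which it occurs.
    """
    if time <= cut:
        return [(0, time, event, "early")]
    return [(0, cut, 0, "early"), (cut, time, event, "late")]
```

A candidate who died at day 10 contributes one "early" record ending in death; a candidate followed to day 100 contributes an event-free "early" record plus a "late" record carrying the final status.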
Finally, we tested whether the survival benefit of accepting an SDL varied by candidate BMI ≥ 30, MELD (6–21, 22–28, 29–34, or 35–40), indication for transplant (hepatocellular carcinoma [HCC] versus non-HCC), or age ≥ 60 years by including an interaction term for each in our model.
To determine whether candidates with MELD < 15 were confounding our initial observations, we performed a sensitivity analysis excluding any candidate with a MELD < 15, since there may not be a survival benefit with transplantation for these candidates, or their lower acuity may confound posttransplant outcomes. To determine whether including candidates who declined multiple SDLs as independent controls biased our results, we also performed a sensitivity analysis with 1:1 matching without replacement. Additionally, it is possible that some SDLs were declined for donor/recipient size mismatch, or for the candidate being too sick for transplant, rather than based on the degree of steatosis itself. To account for this, we performed a sensitivity analysis excluding candidates who declined for either of these 2 reasons.
Confidence intervals are reported per the method of Louis and Zeger13 and are written here as the point estimate with its 95% confidence interval (CI). All analyses were performed using Stata/IC 15.0 for Windows (StataCorp, College Station, TX).
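The Louis and Zeger convention places the interval bounds directly around the point estimate; flattened to plain text, the reporting format used throughout can be produced by a small helper like this (a hypothetical formatting function, not part of the analysis):

```python
def format_estimate(lo, est, hi):
    """Render a point estimate with its 95% confidence interval, eg an
    adjusted hazard ratio of 3.49 with CI 2.49-4.89."""
    return f"{est:.2f}; 95% CI, {lo:.2f}-{hi:.2f}"
```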
RESULTS
We identified 759 candidates who accepted an SDL and 13 362 matched candidates who declined. Compared to candidates who declined an SDL, candidates who accepted were of similar age (56.4 versus 56.5 y, P = 0.7) but were less likely to be female (24.8% versus 34.0%, P < 0.001) or black (7.9% versus 8.0%, P < 0.001) (Table 1). However, they were more likely to have a higher MELD (median [interquartile range (IQR)]: 23 [20–28] versus 19 [16–23], P < 0.001) and to have hepatocellular carcinoma as the indication for transplant (17.9% versus 12.1%, P < 0.001). The median (IQR) percent macrosteatosis of accepted SDLs was 30% (30%–40%) (Figure 1). Only 10% of accepted SDLs had ≥60% macrosteatosis. The median donor risk index for accepted livers was 1.61 (IQR: 1.33–1.92), and the median cold ischemia time was 6.4 hours (IQR: 5.1–8.0).
Outcomes After Declining an SDL
At 6 years postdecline, only 53.1% of candidates had been transplanted: 44.3% received a non-SDL transplant, 6.6% received a different SDL transplant, and 2.2% received a living donor liver transplant (Figure 2). In contrast, 23.8% died while waiting, 19.4% were removed from the waitlist, and 5.9% remained waitlisted. Among candidates who declined, median time to transplant with a non-SDL was 4.3 months.
Survival Benefit of Accepting an SDL
Survival for candidates who accepted an SDL versus candidates who declined was 88.8% versus 82.5% at 1 year postdecision and 77.6% versus 60.1% at 5 years postdecision (P < 0.001) (Figure 3). After adjusting for candidate characteristics, accepting an SDL was associated with a 1-month perioperative period of increased mortality risk (adjusted hazard ratio [aHR]: 3.49; 95% CI, 2.49–4.89; P < 0.001), but a 62% reduced mortality risk beyond 1 month postdecision (aHR: 0.38; 95% CI, 0.31–0.46; P < 0.001) (Table 2). Although accepting an SDL was associated with an increased mortality risk in the first month postdecision, this translated to a crude mortality incidence of 5.7% for those who accepted compared to 1.7% for those who declined.
The magnitude of the long-term survival benefit of accepting an SDL did not vary for candidates with a BMI ≥ 30, age ≥ 60 years, HCC as their indication for transplant, or by MELD (P for all interactions > 0.1). However, the magnitude of the brief period of increased mortality risk in the first month postdecision did vary by candidate MELD (Figure 4). For candidates with a MELD 6–21, accepting an SDL was associated with a 7.88-fold increased mortality risk (aHR: 7.88; 95% CI, 4.80–12.93; P < 0.001) in the first month postdecision (crude incidence of mortality: 8.1% versus 1.3%). Conversely, candidates with MELD 22–28 (aHR: 1.57; 95% CI, 0.90–2.73; P = 0.1; crude incidence of mortality: 3.9% versus 3.0%) or MELD 29–34 (aHR: 2.19; 95% CI, 0.80–5.99; P = 0.1; crude incidence of mortality: 4.5% versus 2.9%) did not have a statistically significant increased mortality risk during the first month postdecision. Candidates with MELD 35–40 had a significantly reduced mortality risk (aHR: 0.32; 95% CI, 0.11–0.90; P = 0.03) during the first month postdecision (crude incidence of mortality: 9.3% versus 27.7%). However, these differences were not likely driven by large differences in donor quality between MELD groups, since the median donor risk index was similar between lower and higher MELD candidates (1.63 in MELD 6–21 versus 1.61 in MELD 22+, P = 0.1).
There were 373 candidates who accepted an SDL at the beginning of the match run and were not included in our primary analysis. Compared to candidates who accepted a previously declined SDL, candidates who accepted at the beginning of the match run were of similar age (55.6 versus 56.4 y, P = 0.2), gender (28.7% female versus 24.8%, P = 0.2), and race (77.2% white versus 75.5%, P = 0.6). However, these candidates had a significantly higher MELD (median 29 versus 23, P < 0.001). Posttransplant survival was similar for candidates who accepted an SDL at the beginning of the match run versus those who accepted a previously declined SDL (1 y: 89.3% versus 88.8%; 5 y: 80.3% versus 77.6%, P = 0.4). After adjustment, candidates who accepted an SDL at the beginning of the match run had an equivalent mortality risk compared to candidates who accepted a previously declined SDL (aHR: 0.79; 95% CI, 0.58–1.06; P = 0.1).
In a sensitivity analysis excluding candidates with a MELD < 15 (n = 2885, 20.6%), accepting an SDL continued to be associated with a 1-month period of increased mortality risk (aHR: 3.05; 95% CI, 2.17–4.29; P < 0.001) but a substantially reduced long-term mortality risk (aHR: 0.38; 95% CI, 0.31–0.46; P < 0.001) (Table 2). In other words, candidates with a MELD < 15 did not confound our results.
When we excluded candidates who declined an SDL because they were deemed too sick for transplant (n = 222, 1.6%) or there was a donor/recipient size mismatch (n = 528, 3.8%), accepting an SDL continued to be associated with a 1-month period of increased mortality risk (aHR: 4.81; 95% CI, 3.40–6.80; P < 0.001) but a substantially reduced long-term mortality risk (aHR: 0.43; 95% CI, 0.35–0.52; P < 0.001). In other words, when we excluded candidates who might be declining for reasons other than steatosis, our inferences remained unchanged.
Finally, we performed a sensitivity analysis, where we selected our controls with 1:1 matching without replacement. This deemphasizes the impact of low MELD controls, since there were significantly more matched controls available for low MELD candidates who accepted, all of whom were included in our primary analysis. In this sensitivity analysis, accepting an SDL was associated with a reduced mortality risk in the first month postdecision (aHR: 0.54; 95% CI, 0.41–0.70; P < 0.001), as well as a substantially reduced long-term mortality risk (aHR: 0.52; 95% CI, 0.37–0.72; P < 0.001). Thus, after deemphasizing the impact of low MELD controls, the negative impact of accepting an SDL in the first month posttransplant was attenuated and actually became a survival benefit.
DISCUSSION
In this national study, we have shown that 23.8% of candidates who declined an SDL died while waiting for transplant, and 19.4% were removed from the waitlist before transplant. In light of this, we showed that accepting an SDL was associated with a 62% reduced mortality risk (aHR: 0.38; 95% CI, 0.31–0.46; P < 0.001) beyond the first month postdecision, which remained consistent after excluding MELD < 15 candidates, using 1:1 matching without replacement, and excluding candidates who declined because the candidate was too sick to transplant or for donor/recipient size mismatch. Although acceptance was associated with a brief perioperative risk period in the first month postdecision (aHR: 3.49; 95% CI, 2.49–4.89; P < 0.001), this varied across candidate MELD. Candidates with MELD 6–21 had a 7.88-fold increased mortality risk (aHR: 7.88; 95% CI, 4.80–12.93; P < 0.001) in the first month postacceptance, whereas candidates with MELD 35–40 had a significantly reduced mortality risk (aHR: 0.32; 95% CI, 0.11–0.90; P = 0.03). Additionally, survival for candidates who accepted an SDL at the beginning of the match run was equivalent to that of candidates who accepted a previously declined SDL. Our results suggest that, on average, candidates would derive a survival benefit from accepting an SDL, rather than declining it and waiting for another “more acceptable” offer.
Although the use of SDLs remains controversial, several single-center studies have reported excellent outcomes using these grafts, particularly in carefully selected recipients.14-17 We have extended this work by quantifying the trade-offs associated with the real-world decision: should a candidate accept an SDL offered to them, or decline and wait for another offer? Our results suggest that acceptance of SDLs is associated with a long-term survival benefit, despite an increased perioperative risk period. In light of high discard rates of livers from donors with high BMI and projections suggesting that as many as 58% of all donors will be overweight by 2030, surgeons should carefully consider whether each SDL could be used safely in a particular candidate and not decline solely based on the degree of macrosteatosis.18,19
Although we have shown a long-term survival benefit for those accepting an SDL, there was a substantial increase in mortality within the first month postdecision for those who accepted. While this comparison at least partially reflects a transient perioperative risk period (compared to candidates not undergoing any operation), which might be expected, it may also represent instances of primary nonfunction. Recipients of SDLs are at increased risk of primary nonfunction, which has devastating clinical consequences and can result in early recipient mortality.20-22 Nevertheless, the overall absolute risk of 1-month mortality was 5.7%, which is consistent with 1 national study that reported an absolute inhospital mortality risk of 6.3% after liver transplant from all types of donors, including standard liver allografts.23
Notably, MELD 6–21 candidates had a nearly 8-fold increased risk of mortality in the first month posttransplant (absolute mortality 8.1%). Although these candidates still derived a long-term survival benefit from accepting an SDL despite this early risk, the decision to accept or decline an SDL should consider the likelihood of early posttransplant mortality for each candidate individually. In contrast to MELD 6–21 candidates, MELD 22–34 candidates did not experience this increased perioperative risk period, and MELD 35–40 candidates actually experienced a survival benefit, likely reflecting their higher risk of waitlist mortality while waiting for another offer. Thus, the decision to accept or decline an SDL should balance each recipient’s risk of early postoperative mortality with the substantial long-term survival benefit provided by these SDLs.
Several limitations of our study are worth considering. First, to control for differences in organ quality between organs ultimately used for transplant and those discarded, we limited our analysis to only those organs that were declined by at least 1 person but ultimately used for transplant. As a result, our study could be confounded by indication if there were differences between candidates who accepted versus candidates who declined. However, our use of candidates who declined that same liver as our control group most closely approximates the counterfactual candidate for someone who accepted, since they had the opportunity to accept but did not. Moreover, we matched each candidate who accepted to those who declined the same liver and were within ±2 MELD points, accounting for broad differences in severity of underlying liver disease. In other words, the major difference between comparison groups was that 1 candidate accepted, whereas the others did not (since the donor was the same for candidates who accepted and declined, and the recipients were matched within ±2 MELD points). Additionally, the only biopsy information available in SRTR is the percent macrosteatosis (or percent microsteatosis), so we are unable to account for additional pathologic characteristics that might affect the decision to accept or decline (eg, presence of steatohepatitis or fibrosis). Furthermore, we do not know whether the percent macrosteatosis recorded was based on an assessment by a liver-trained pathologist, and significant between-pathologist variability in steatosis estimation has been described.24 Nevertheless, there is face validity to these estimates, as outcomes have been shown to be worse with SDLs versus non-SDLs when using this variable.5 The major strength of our study is the use of national registry data, which allows for a well-powered, nationally representative study of a rare exposure (the use of SDLs) with results that are broadly generalizable.
In conclusion, we have shown that accepting an SDL was associated with a substantial survival benefit compared to declining and waiting for the next best offer, although there was a brief perioperative period of increased mortality risk in the first month posttransplantation. This early increased mortality risk varied across candidate MELD and was only significant for candidates with MELD 6–21, and candidates with MELD 35–40 actually had reduced mortality risk during this period. Given the ongoing organ shortage and the increasing prevalence of obesity in the donor pool, our results support continued use of SDLs in carefully selected recipients.
The analyses described here are the responsibility of the authors alone and do not necessarily reflect the views or policies of the Department of Health and Human Services, nor does mention of trade names, commercial products, or organizations imply endorsement by the US Government. The data reported here have been supplied by the Hennepin Healthcare Research Institute as the contractor for the SRTR. The interpretation and reporting of these data are the responsibility of the author(s) and in no way should be seen as an official policy of or interpretation by the SRTR or the US Government.
REFERENCES
1. Hashimoto K, Miller C. The use of marginal grafts in liver transplantation. J Hepatobiliary Pancreat Surg. 2008;15(2):92–101.
2. Linares I, Hamar M, Selzner N, et al. Steatosis in liver transplantation: current limitations and future strategies. Transplantation. 2019;103(1):78–90.
3. Hałoń A, Patrzałek D, Rabczyński J. Hepatic steatosis in liver transplant donors: rare phenomenon or common feature of donor population? Transplant Proc. 2006;38(1):193–195.
4. Flegal KM, Kruszon-Moran D, Carroll MD, et al. Trends in obesity among adults in the United States, 2005 to 2014. JAMA. 2016;315(21):2284–2291.
5. Spitzer AL, Lao OB, Dick AA, et al. The biopsied donor liver: incorporating macrosteatosis into high-risk donor assessment. Liver Transpl. 2010;16(7):874–884.
6. de Graaf EL, Kench J, Dilworth P, et al. Grade of deceased donor liver macrovesicular steatosis impacts graft and recipient outcomes more than the donor risk index. J Gastroenterol Hepatol. 2012;27(3):540–546.
7. Deroose JP, Kazemier G, Zondervan P, et al. Hepatic steatosis is not always a contraindication for cadaveric liver transplantation. HPB (Oxford). 2011;13(6):417–425.
8. Gabrielli M, Moisan F, Vidal M, et al. Steatotic livers. Can we use them in OLTX? Outcome data from a prospective baseline liver biopsy study. Ann Hepatol. 2012;11(6):891–898.
9. Dutkowski P, Schlegel A, Slankamenac K, et al. The use of fatty liver grafts in modern allocation systems: risk assessment by the balance of risk (BAR) score. Ann Surg. 2012;256(5):861–868; discussion 868.
10. Noujaim HM, de Ville de Goyet J, Montero EF, et al. Expanding postmortem donor pool using steatotic liver grafts: a new look. Transplantation. 2009;87(6):919–925.
11. Massie AB, Kucirka LM, et al. Big data in organ transplantation: registries and administrative claims. Am J Transplant. 2014;14(8):1723–1730.
12. Fine JP, Gray RJ. A proportional hazards model for the subdistribution of a competing risk. J Am Stat Assoc. 1999;94(446):496–509.
13. Louis TA, Zeger SL. Effective communication of standard errors and confidence intervals. Biostatistics. 2009;10(1):1–2.
14. Angele MK, Rentsch M, Hartl WH, et al. Effect of graft steatosis on liver function and organ survival after liver transplantation. Am J Surg. 2008;195(2):214–220.
15. Chavin KD, Taber DJ, Norcross M, et al. Safe use of highly steatotic livers by utilizing a donor/recipient clinical algorithm. Clin Transplant. 2013;27(5):732–741.
16. Doyle MB, Vachharajani N, Wellen JR, et al. Short- and long-term outcomes after steatotic liver transplantation. Arch Surg. 2010;145(7):653–660.
17. McCormack L, Petrowsky H, Jochum W, et al. Use of severely steatotic grafts in liver transplantation: a matched case-control study. Ann Surg. 2007;246(6):940–946; discussion 946.
18. Orman ES, Barritt AS 4th, Wheeler SB, et al. Declining liver utilization for transplantation in the United States and the impact of donation after cardiac death. Liver Transpl. 2013;19(1):59–68.
19. Orman ES, Mayorga ME, Wheeler SB, et al. Declining liver graft quality threatens the future of liver transplantation in the United States. Liver Transpl. 2015;21(8):1040–1050.
20. Imber CJ, St Peter SD, Handa A, et al. Hepatic steatosis and its relationship to transplantation. Liver Transpl. 2002;8(5):415–423.
21. Koehler E, Watt K, Charlton M. Fatty liver and liver transplantation. Clin Liver Dis. 2009;13(4):621–630.
22. Chen XB, Xu MQ. Primary graft dysfunction after liver transplantation. Hepatobiliary Pancreat Dis Int. 2014;13(2):125–137.
23. Gil E, Kim JM, Jeon K, et al. Recipient age and mortality after liver transplantation: a population-based cohort study. Transplantation. 2018;102(12):2025–2032.
24. El-Badry AM, Breitenstein S, Jochum W, et al. Assessment of hepatic steatosis by expert pathologists: the end of a gold standard. Ann Surg. 2009;250(5):691–697.