In 1998, the Department of Health and Human Services issued its “Final Rule” stating that national liver allocation policies must prioritize distribution of liver grafts “in order of medical urgency while avoiding futile transplantation” (1). As a result, in 2002 the United Network for Organ Sharing (UNOS) adopted and implemented the model for end-stage liver disease (MELD), an objective allocation system that includes no component of waiting time and is free from problems of intercenter reproducibility. Besides the more intuitive variables of serum total bilirubin and international normalized ratio (INR), serum creatinine is heavily weighted in the MELD formula, and its inclusion reflects the influence of renal dysfunction on mortality in patients with liver failure. Various analyses of the impact of the switch to the MELD allocation system have documented several improvements, including a reduction in waiting list mortality even as the listed population presumably became more critically ill. However, subsequent critiques of the MELD allocation system have demonstrated that patients with high MELD scores may have inferior outcomes after transplantation (2–5). Furthermore, because of the weight of the creatinine value in the MELD formula, liver grafts are being preferentially allocated to patients with renal insufficiency, who have high MELD scores. Interestingly, the introduction of the MELD allocation system has coincided with a threefold increase in the number of simultaneous liver-kidney transplants (SLK) performed annually (6). With this increase in the number of SLK transplants has come controversy over whether the addition of the kidney transplant (KT) in all liver candidates with renal failure is associated with long-term benefit or is perhaps wasteful of a limited supply of renal allografts. Thus far, studies such as that of Gonwa et al. (7) have demonstrated no adverse impact of the MELD allocation system on outcomes after SLK transplantation. However, studies aimed at identifying factors that might predict nonrecovery of renal function have produced conflicting results, creating equipoise in the transplant community over the role of SLK transplantation and a need for ongoing reexamination of outcomes (7–11).
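For reference, the widely used UNOS formulation of the MELD score combines these three laboratory values (with each value bounded below at 1.0 and serum creatinine capped at 4.0 mg/dL) as

\[
\mathrm{MELD} = 3.78\,\ln(\text{bilirubin, mg/dL}) + 11.2\,\ln(\text{INR}) + 9.57\,\ln(\text{creatinine, mg/dL}) + 6.43,
\]

rounded to the nearest integer and capped at 40; the large creatinine coefficient makes plain how strongly renal function drives the score.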
We present here an analysis of the UNOS/OPTN national registry that critically evaluates outcomes after SLK transplant in the MELD era and seeks to optimize the use of limited resources by identifying the ideal recipients for an SLK transplant. More specifically, the analysis seeks to examine trends in patient survival among SLK recipients since the introduction of MELD and to identify the liver failure patients with renal dysfunction who are most likely to benefit from SLK, as defined by improvements in either patient or liver allograft survival.
MATERIALS AND METHODS
Study Design and Population
We retrospectively analyzed a prospective cohort study of deceased donor liver and renal transplant recipients included in the UNOS Standard Transplant Analysis and Research (STAR) files. Our study population initially included 235,248 recipients who underwent liver or renal transplantation between January 1987 and June 2006. We then excluded (1) pediatric recipients (<18 years, n=15,334); (2) adult recipients who underwent simultaneous liver–heart, liver–lung, liver–pancreas, or liver–intestine transplantation, or simultaneous kidney–heart, kidney–lung, kidney–pancreas, or kidney–intestine transplantation (n=14,313); (3) recipients transplanted after 2005, to ensure adequate follow-up (n=2,447); and (4) recipients transplanted before the implementation of the MELD allocation system in February 2002 (n=43,180), after overall trends by transplant year had been analyzed.
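For illustration only, this exclusion cascade can be expressed as sequential drops from the pooled registry extract; the sketch below uses hypothetical variable names (age, multiorgan, tx_year, tx_date), not actual STAR-file field names:

```stata
* Illustrative sketch of the cohort exclusions; all variable names are
* hypothetical placeholders, not actual STAR-file field names.
use star_liver_kidney, clear           // pooled liver/kidney registry extract
drop if age < 18                       // (1) pediatric recipients
drop if multiorgan == 1                // (2) other simultaneous multiorgan combinations
drop if tx_year > 2005                 // (3) guarantee complete 1-year follow-up
drop if tx_date < mdy(2, 27, 2002)     // (4) pre-MELD era (MELD implemented February 2002)
```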
We stratified the recipients according to the subgroup of donor organ received (liver transplant [LT] alone [n=19,137], KT alone [n=33,712], and SLK [n=1,032]), transplant year, dialysis status (i.e., whether the recipient was on or off dialysis at the time of transplant), and length of time on dialysis. One-year patient survival (PS), liver graft survival (LiGS), and kidney graft survival (KiGS) were the primary outcomes. Information on the time of graft loss was complete for all recipients.
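Expressed in code, the central dialysis stratification reduces to a single indicator; dialysis_months and ondialysis below are hypothetical placeholders:

```stata
* Hypothetical sketch of the dialysis strata: <3 months = "recent",
* >=3 months = "long-term"; the < . guard excludes missing durations.
gen byte longdial = (dialysis_months >= 3) if ondialysis == 1 & dialysis_months < .
label define ld 0 "recent dialysis (<3 mo)" 1 "long-term dialysis (>=3 mo)"
label values longdial ld
```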
Statistical Analyses
Unadjusted PS, LiGS, and KiGS were estimated using Kaplan–Meier methodology and compared across donor subgroups and strata with the log-rank test.
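In Stata, these unadjusted comparisons correspond to a few commands; the sketch below assumes hypothetical variables months (follow-up time), died (event indicator), and slk (1 = SLK, 0 = comparison group):

```stata
* Minimal sketch of the unadjusted survival comparisons; variable names
* are hypothetical.
stset months, failure(died == 1) exit(time 12)   // censor follow-up at 1 year
sts graph, by(slk)                               // Kaplan-Meier survival curves
sts test slk, logrank                            // log-rank test across groups
```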
With regard to disease severity and complexity, patients who ultimately undergo LT or KT versus SLK represent different groups of patients. These differences make retrospective comparisons between groups difficult, particularly in the context of standard multivariate regression modeling. In other words, standard modeling may not adequately control for differences, making results overly susceptible to selection bias and confounding. To avoid most of these biases and to account for the inherent differences between LT, KT, and SLK recipients to the extent possible in a retrospective analysis, matched-control analyses were performed. The matched-control analysis for LT versus SLK involved a ratio of three controls (LT) for every one case (SLK), matched for donor age, race, cause of death, and type (split/partial), as well as recipient final MELD score before transplantation and dialysis status at the time of transplant. In addition, the matched-control model further controlled for share type (local, regional, or national) and cold ischemic time. The matched-control analysis for KT versus SLK involved a ratio of five controls (KT) for every one case (SLK), matched for expanded criteria donor status, donor ethnicity, and recipient age, ethnicity, and dialysis status.
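Although the matching macros themselves are not reproduced here, a matched cohort of this kind is conventionally analyzed with a Cox model that respects the matched sets; one common specification, sketched below with a hypothetical matched-set identifier matchid and hypothetical share-type indicator variables, stratifies the baseline hazard on the matched set:

```stata
* One conventional analysis of a matched cohort: a Cox model stratified
* on the matched-set identifier. regional and national (share-type
* indicators), cit, and matchid are hypothetical variable names.
stset months, failure(died == 1) exit(time 12)
stcox slk regional national cit, strata(matchid)
```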
To account for potential latent matched-group effects, such as intragroup correlation, a sensitivity analysis and model diagnostics were performed. Specifically, we used a cluster variance estimator, which is robust to misspecification of the within-cluster correlation structure, and found a reduction in the standard errors, suggesting that the intragroup correlations are negative. In addition, a Cox model with shared frailty was fitted within the matched dataset. Results from this sensitivity analysis were robust and consistent with the findings from our initial model. Furthermore, the assumption of proportional hazards was met, as determined by examination of Schoenfeld residuals. Finally, the goodness-of-fit of the models was evaluated using graphic representations of the sums of weighted martingale-transform residuals. The predicted probabilities of graft loss and mortality generated by the adjusted models were compared with the observed graft loss and mortality using the Hosmer–Lemeshow chi-squared test, and the goodness-of-fit of each Cox model was adequate (χ2, P>0.05).
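These sensitivity analyses and diagnostics map onto standard Stata post-estimation steps; the sketch below is again illustrative, with hypothetical variable names:

```stata
* Sensitivity analyses and diagnostics for the matched Cox models;
* variable names are hypothetical placeholders.
stcox slk, vce(cluster matchid)   // cluster-robust variance estimator
stcox slk, shared(matchid)        // shared-frailty Cox model by matched set
quietly stcox slk
estat phtest, detail              // Schoenfeld-residual test of proportional hazards
predict mgres, mgale              // martingale residuals for goodness-of-fit plots
```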
All tests were two-sided with statistical significance set at the α=0.05 level. All analyses were performed using Stata 10.0 for Linux (StataCorp, College Station, TX).
RESULTS
Trends in SLK Transplantation in the MELD Era
Since the introduction of the MELD allocation system in February 2002, the annual number of SLK transplants has increased from a stable baseline of approximately 120 to 325 (Fig. 1A). One-year PS, which had been increasing through 2002, has declined from 87% to 76.1%, and in parallel, the risk for patient death has increased (Fig. 1B). In comparison with the pre-MELD era, the risk for patient death among SLK recipients was 1.41-fold higher in 2005 (hazard ratio [HR] 1.41 [1.0, 1.98], P=0.05) (Table 1). In contrast, overall PS after LT has remained constant in the MELD era (Fig. 1B). The recipient population for SLK in the MELD era appears to be older, more often hospitalized, and more often suffering from the hepatorenal syndrome (HRS) when compared with pre-MELD SLK recipients (Table 2). Comparison of average MELD scores between pre- and post-MELD SLK recipients was not possible because these data are incomplete for the pre-MELD era, but post-MELD SLK recipients had an average MELD score of 27. Interestingly, the observed decline in PS after SLK in the MELD era is occurring despite the fact that, on average, allografts transplanted into SLK recipients come from higher quality donors than allografts used for LT recipients. Specifically, in the MELD era, SLK allografts were less likely to be regionally or nationally shared and came from younger donors who were less likely to have succumbed to a cerebrovascular accident. Furthermore, the donor risk index, as described by Feng et al. (12), was 1.66 for LT recipients but only 1.55 for SLK recipients (Table 3).
FIGURE 1.
Trends in SLK transplantation during the MELD era. (A) Number of SLK transplants performed annually; (B) 1-year patient survival among SLK and LT transplant recipients. SLK, simultaneous liver-kidney; LT, liver transplant alone.
TABLE 1: Risk for death among SLK and LT recipients pre-MELD versus post-MELD
TABLE 2: Demographics of SLK recipients pre-MELD versus post-MELD
TABLE 3: Comparison of deceased donor quality between LT and SLK in the MELD era
Patient Survival After SLK Versus LT Alone in the MELD Era
Overall, after matching SLK and LT recipients for multiple factors, including donor age, race, cause of death, and type (split/partial), as well as recipient final MELD score before transplantation and dialysis status at the time of transplant, SLK transplantation did not confer a PS advantage compared with LT alone (82% vs. 81.8%, P=0.8; HR 1.00 [95% CI 0.80, 1.26], P=1.0). Further, among all recipients on dialysis, SLK transplantation offered no statistically significant PS advantage over LT alone (78.1% vs. 74.7%, P=0.62; HR 0.93 [95% CI 0.69, 1.25], P=0.61). However, further subgroup analysis demonstrated that length of time on dialysis was predictive of long-term outcomes. In fact, after stratifying by length of time on dialysis, a significant decrease in the risk for patient death could be demonstrated only among those SLK recipients who had been on dialysis for more than 3 months (long-term dialysis) (HR 0.46 [95% CI 0.21, 1.0], P=0.05) (Table 4). Matched-control analysis among recipients on long-term dialysis demonstrated superior survival among SLK recipients compared with LT alone (87.2% vs. 74.5%, P=0.02). However, after further controlling the matched-control analysis for share type and cold ischemic time (CIT), the finding was no longer statistically significant (HR 0.6 [95% CI 0.34, 1.07], P=0.08) (Fig. 2A). In contrast, among recipients on dialysis for less than 3 months (recent dialysis), SLK transplantation conferred no survival advantage compared with LT alone (73.5% vs. 77.3%, P=0.1; HR 1.22 [95% CI 0.86, 1.73], P=0.26) (Fig. 2B).
TABLE 4: The impact of time spent on dialysis on patient survival among SLK recipients
FIGURE 2.
Comparison of Kaplan–Meier survival estimates during the MELD era. (A) Comparison of patient survival between SLK and LT alone among recipients on long-term dialysis; (B) comparison of patient survival between SLK and LT alone among recipients on recent dialysis; (C) comparison of liver graft survival between SLK and LT alone among recipients on long-term dialysis; (D) comparison of kidney graft survival between SLK and KT alone among recipients on long-term dialysis. The reported graft survival is non-death censored. SLK, simultaneous liver-kidney; LT, liver transplant alone; KT, deceased donor kidney transplant alone; recent dialysis, length of dialysis <3 months; long-term dialysis, length of dialysis ≥3 months.
Liver Graft Survival After SLK Versus LT Alone in the MELD Era
Again using matched-control analysis, SLK transplantation in the MELD era did not confer an overall LiGS advantage compared with LT alone (79.6% vs. 77.7%, P=0.4; HR 0.91 [95% CI 0.74, 1.13], P=0.40). This lack of LiGS benefit persisted independent of dialysis status (Table 5). However, on further subgroup analysis, among those patients on long-term dialysis, SLK transplantation conferred a 43% reduction in the risk of liver graft loss compared with LT alone (84.5% vs. 70.8%, P=0.008; HR 0.57 [95% CI 0.34, 0.95], P=0.03) (Fig. 2C, Table 5). In contrast, an LiGS benefit was not seen among patients on recent dialysis receiving SLK transplantation compared with LT alone (70.7% vs. 72.4%, P=0.34; HR 1.15 [95% CI 0.83, 1.58], P=0.41) (Fig. 2C, Table 5).
TABLE 5: Matched-control analysis of liver graft survival and the risk for graft loss among SLK and LT alone recipients during the MELD era
Kidney Graft Survival After SLK Versus KT Alone in the MELD Era
Overall 1-year KiGS after SLK in the MELD era is 77.2%. This compares unfavorably with KiGS after KT alone in the MELD era, which is 89.3% (P<0.001). Even among patients on long-term dialysis, the KiGS deficit among SLK recipients compared with KT alone recipients persists (76.1% vs. 88.7%, P<0.001; HR 2.57 [95% CI 1.65, 4.02], P<0.001). Among patients on recent dialysis, SLK recipients have even lower KiGS compared with KT alone recipients (73.7% vs. 90%, P<0.001; HR 4.22 [95% CI 1.41, 12.6], P=0.01) (Table 6). Interestingly, however, detailed analysis of the subgroup of SLK recipients on long-term dialysis demonstrated a significant increase in the risk for kidney graft loss among those recipients with MELD scores of 23 or greater (HR 3.15 [95% CI 1.17, 8.48], P=0.02) (Table 6). Examination of KiGS outcomes after SLK by recipient MELD score shows that, to achieve results equivalent to those seen among KT alone recipients on long-term dialysis, one would have to restrict SLK to those liver failure patients on long-term dialysis with MELD scores less than 23 (Fig. 2D, Table 6).
TABLE 6: Matched-control analysis of kidney graft survival and the risk for graft loss among SLK and KT alone recipients during the MELD era
DISCUSSION
It is well established that long-term results are inferior for patients with hepatic and renal failure who undergo LT alone and remain on dialysis after transplant (13, 14), as mortality rates are 20% greater per year for such patients (15). SLK transplantation thus emerged as an option for avoiding the deleterious effects of dialysis in the posttransplant period. However, early results demonstrated that SLK transplantation was associated with higher rates of operative mortality and morbidity compared with LT alone, making it imperative to clearly distinguish patients with reversible renal failure from those with advanced, irreversible renal failure (16). Multiple studies performed in the pre-MELD era concluded that the severity of pretransplant renal failure, rather than its causative factors, was more predictive of long-term renal dysfunction (8, 17–20). Before the introduction of the MELD allocation system, the general consensus in the transplant community was that only those liver failure patients with fixed renal disease requiring dialysis for more than 4 weeks should undergo SLK transplantation (8); as a result, appropriately selected SLK recipients were shown to have outcomes similar to those of patients with intact renal function who underwent LT alone (16). Therefore, before 2002, the use of renal allografts in the context of SLK had been justified by excellent outcomes, and the frequency of SLK had remained low at approximately 100 procedures per year (21). Since the introduction of the MELD allocation system, the recipient population for SLK has evolved in unintended ways, and our analyses suggest that these favorable trends are reversing.
The cause of the decline in outcomes, especially 1-year PS, may reflect the fact that renal transplantation has historically been performed as an elective operation on stable patients, whereas candidates too ill to withstand or benefit from transplantation, such as patients suffering from HRS, have remained on dialysis. With the evolving post-MELD trends in SLK, kidney grafts are now allocated preferentially to older, more critically ill recipients, often with HRS, some of whom may already have multisystem disease processes too far advanced to benefit from either graft. Furthermore, the kidney grafts transplanted into SLK recipients come from higher quality donors, suggesting that the best deceased donor kidneys may be going to a group of recipients too sick to maximize the potential benefit of this limited resource. In fact, had any of the kidney grafts that failed after SLK transplantation during the MELD era been allocated instead to a patient on the kidney-only waiting list, that patient would, on average, have gained an extra 7.2 years of life (22). An additional concern is that some of the expanding group of MELD-era SLK recipients who are being transplanted for HRS would have recovered native renal function after LT alone and thus received a KT needlessly. Our inability to determine exactly who needs an SLK in the MELD era represents another route by which the highest quality deceased donor renal allografts are being wasted as part of a dual transplant.
Review of the UNOS transplant registry suggests that SLK may be beneficial only for liver failure patients on dialysis for more than 3 months. More specifically, our analysis demonstrated that, compared with LT alone, SLK was associated with higher 1-year PS and LiGS only among those liver failure patients on dialysis for at least 3 months. Prioritization of kidneys to these liver failure patients as part of an SLK represents a “liver-centric” approach: the kidney is being used to maximize the outcome obtained from the LT. Interestingly, since February 2002, more than 1,000 SLK transplants have been performed, yet of those 1,032, only 318 went to patients on long-term dialysis, suggesting that more than two-thirds of kidneys given to SLK recipients are wasted. If we wished to alter organ allocation priorities to achieve kidney graft outcomes in SLK transplantation similar to those seen in deceased donor kidney alone transplantation, we would have to restrict SLK transplantation to those liver failure patients on long-term dialysis with MELD scores of 22 or lower. In the current MELD era, only 110 SLK transplants have been performed in liver failure patients meeting these strict criteria. Such a restrictive policy would represent a “kidney-centric” approach that seeks to preserve the outcomes obtained from this resource when compared with the gold standard, deceased donor kidney alone transplantation. Such restrictive allocation decisions would negatively impact PS and LiGS in those SLK candidates with higher MELD scores who had been on dialysis for more than 3 months, highlighting the need for a cost-benefit analysis and discussion within the transplant community as a whole. It is important to note, however, that independent of either a liver-centric or kidney-centric approach, two-thirds of all SLK transplants performed during the MELD era have gone to recipients in whom no benefit can be documented, yet approximately 4,000 end-stage renal disease patients die each year awaiting KT, any one of whom could have benefited from a deceased donor kidney (21).
We stress that the results of this study are preliminary and that the available data are limited. It is clear that SLK and LT recipients are not identical patient populations. These differences make retrospective comparisons between groups difficult, as results are more susceptible to selection bias and confounding, such as the surgeon intuition and clinical judgment used to determine whether a liver failure patient receives SLK or LT. Because standard modeling may not adequately control for these differences and biases, we performed a matched-control analysis in which SLK and LT recipients were matched on multiple donor, recipient, and allograft characteristics, including donor age, race, cause of death, graft type (split/partial), and donor risk index; recipient final MELD score before transplant and dialysis status; and cold ischemic time and share type. In addition, our analyses were restricted by the limited follow-up data in the post-MELD era; as a result, to ensure complete 1-year follow-up, recipients transplanted after 2005 were excluded from the analyses. Furthermore, our analysis did not control for the cause of kidney failure, as this variable is incomplete within the dataset. It is plausible, however, that length of time on dialysis served as a surrogate. Clearly, liver failure patients on recent dialysis had worse outcomes, and we could not demonstrate any benefit of SLK over LT among this group of patients. It is likely that this group of SLK recipients more often had kidney failure secondary to HRS. This assumption and the demonstrated lack of benefit among recent dialysis recipients of SLK transplants are consistent with the recent findings of Ruiz et al. (23), which suggest that SLK transplantation offers no survival benefit to HRS patients. Finally, we acknowledge that SLK and KT recipients are also not identical patient populations. The intent of that analysis was merely to provide perspective on the severity of the restrictions required to obtain kidney allograft survival rates among SLK recipients comparable with those of KT recipients. Nevertheless, it is clear that the relatively small number of SLK transplants performed annually requires a national analysis of the data in order to completely understand the role of SLK transplantation.
Although further studies and longer follow-up intervals are needed to assess the degree and reproducibility of these observations, the post-MELD SLK experience does suggest that SLK among liver failure patients on long-term dialysis is associated with an overall net benefit in patient and graft years. Modifying practice is often a cumbersome process, but can be a worthwhile endeavor. In an era of progressive organ shortage, we must strive to optimally balance individual urgency with utilitarian benefit as new evidence regarding outcomes, whether positive or negative, becomes available. Our findings call into question the benefit of SLK in the MELD era as it is currently practiced and suggest that current prioritization of kidney allografts to liver failure patients often results in wasting of limited resources. SLK should be reserved for those liver failure patients on dialysis for at least 3 months, because only these recipients can be shown to gain an overall patient survival and liver allograft survival advantage over LT alone.
REFERENCES
1. Department of Health and Human Services. Organ Procurement and Transplantation Network; Final Rule (FR Doc. 98–8191), 42 CFR Part 121. Fed Regist 1998; 63: 16296.
2. Habib S, Berk B, Chang CC, et al. MELD and prediction of post-liver transplantation survival. Liver Transpl 2006; 12: 440.
3. Saab S, Wang V, Ibrahim AB, et al. MELD score predicts 1-year patient survival post-orthotopic liver transplantation. Liver Transpl 2003; 9: 473.
4. Shiffman ML, Saab S, Feng S, et al. Liver and intestine transplantation in the United States, 1995–2004. Am J Transplant 2006; 6 (5 Pt 2): 1170.
5. Yoo HY, Thuluvath PJ. Short-term postliver transplant survival after the introduction of MELD scores for organ allocation in the United States. Liver Int 2005; 25: 536.
6. Davis CL, Feng S, Sung R, et al. Simultaneous liver-kidney transplantation: Evaluation to decision making. Am J Transplant 2007; 7: 1702.
7. Gonwa TA, McBride MA, Anderson K, et al. Continued influence of preoperative renal function on outcome of orthotopic liver transplant (OLTX) in the US: Where will MELD lead us? Am J Transplant 2006; 6: 2651.
8. Campbell MS, Kotlyar DS, Brensinger CM, et al. Renal function after orthotopic liver transplantation is predicted by duration of pretransplantation creatinine elevation. Liver Transpl 2005; 11: 1048.
9. Cohen AJ, Stegall MD, Rosen CB, et al. Chronic renal dysfunction late after liver transplantation. Liver Transpl 2002; 8: 916.
10. Marik PE, Wood K, Starzl TE. The course of type 1 hepato-renal syndrome post liver transplantation. Nephrol Dial Transplant 2006; 21: 478.
11. Pawarode A, Fine DM, Thuluvath PJ. Independent risk factors and natural history of renal dysfunction in liver transplant recipients. Liver Transpl 2003; 9: 741.
12. Feng S, Goodrich NP, Bragg-Gresham JL, et al. Characteristics associated with liver graft failure: The concept of a donor risk index. Am J Transplant 2006; 6: 783.
13. Davis CL. Impact of pretransplant renal failure: When is listing for kidney-liver indicated? Liver Transpl 2005; 11 (suppl 2): S35.
14. Ojo AO, Held PJ, Port FK, et al. Chronic renal failure after transplantation of a nonrenal organ. N Engl J Med 2003; 349: 931.
15. Sanchez EQ, Klintmalm GB. Combined liver-kidney transplantation. In: Busuttil RW, Klintmalm GB, eds. Transplantation of the liver, 2nd ed. Philadelphia: Saunders, 2005.
16. Moreno-Gonzalez E, Meneu-Diaz JC, Garcia I, et al. Simultaneous liver-kidney transplantation for adult recipients with irreversible end-stage renal disease. Arch Surg 2004; 139: 1189.
17. Davis CL, Gonwa TA, Wilkinson AH. Identification of patients best suited for combined liver-kidney transplantation: Part II. Liver Transpl 2002; 8: 193.
18. Gonwa TA, Morris CA, Goldstein RM, et al. Long-term survival and renal function following liver transplantation in patients with and without hepatorenal syndrome—experience in 300 patients. Transplantation 1991; 51: 428.
19. Jeyarajah DR, Gonwa TA, McBride M, et al. Hepatorenal syndrome: Combined liver kidney transplants versus isolated liver transplant. Transplantation 1997; 64: 1760.
20. Jeyarajah DR, McBride M, Klintmalm GB, et al. Combined liver-kidney transplantation: What are the indications? Transplantation 1997; 64: 1091.
21. Gonwa TA. Combined kidney liver transplant in the MELD era: Where are we going? Liver Transpl 2005; 11: 1022.
22. Schnitzler MA, Whiting JF, Brennan DC, et al. The life-years saved by a deceased organ donor. Am J Transplant 2005; 5: 2289.
23. Ruiz R, Barri YM, Jennings LW, et al. Hepatorenal syndrome: A proposal for kidney after liver transplantation (KALT). Liver Transpl 2007; 13: 838.