
Clinical Transplantation

Donor characteristics associated with reduced graft survival: an approach to expanding the pool of kidney donors

Port, Friedrich K.; Bragg-Gresham, Jennifer L.; Metzger, Robert A.; Dykstra, Dawn M.; Gillespie, Brenda W.; Young, Eric W.; Delmonico, Francis L.; Wynn, James J.; Merion, Robert M.; Wolfe, Robert A.; Held, Philip J.



The need for cadaveric kidney donors in the United States is growing, as documented by the large annual increase in patients on the waiting list for cadaveric organs relative to the very modest growth in their availability (1). Numerous efforts have been made to increase organ donation. In an effort to learn from successful regions, geographic differences in organ donation have been assessed by either donors per million population (2) or by evaluable deaths occurring in hospitals within the most relevant age groups (3). The large differences observed suggest that opportunities for increasing cadaveric organ retrieval exist in certain regions of the United States.

Lessons also can be learned from international experiences. In recent years, cadaveric kidney donations have increased dramatically in Spain, particularly among older age groups (1,4). Although Spain’s rate of kidney donation is similar to that of the United States on a per million, age-specific population basis for donors less than 40 years old, it has exceeded U.S. rates for older age groups by an almost twofold margin in recent years.

Kidneys procured from donors older than 55 to 60 years of age are known to be associated with significantly worse graft survival (2,5,6). Therefore, some have labeled older donors, and those with a history of hypertension and diabetes, as “marginal” or “expanded” donors (5). Organs procured from such donors have an increased discard rate because they are frequently turned down by transplant centers. However, a recent study by Ojo et al. (5) compared the mortality risk for waitlisted dialysis patients with that of recipients of kidneys from expanded donors. The authors showed that even transplantation of expanded kidneys yields a substantial survival advantage over maintenance on dialysis. This gain occurred with a delay, reaching a cumulative benefit 1.5 years after transplantation.

Based on these observations, more expanded organs could be used, which would help reduce the current organ shortage. Another benefit would be the reduced risk of graft loss and death that is associated with shorter time on dialysis therapy (7). In Spain, waiting times have decreased as the waiting list has grown shorter. This has been attributed to the inclusion of older organ donors (4,8).

The Spanish practices may be considered for implementation in the United States. In early 2001, participants at a consensus meeting in Crystal City, VA, discussed these new findings with the goal of enhancing cadaveric kidney transplantation in the United States. It was suggested that patients be given the option of being considered for all organs, including those with an increased risk of graft failure, a choice that could substantially shorten a patient’s waiting time. Patients who agree to accept the added risk would be offered the opportunity of undergoing transplantation with standard or expanded donor kidneys, whereas those who do not would be eligible to receive standard kidneys only. Before implementing this approach, it would be useful to have the following: (1) a clearer definition of donor factors associated with increased risk of graft loss, (2) an evaluation of the characteristics of current recipients of expanded kidneys under the existing nonstratified waiting list system, (3) an assessment of the relative importance of HLA matching and cold ischemia time (CIT) for recipients of such organs, and (4) an evaluation of the magnitude of current organ discard rates.

The current study was undertaken to provide information from relevant national data to refine the definition of expanded donor kidneys, based on recent practice in the United States. It also sought to provide a factual basis for possible modifications of national allocation policy, with the broader goal of achieving greater utilization of retrieved cadaveric kidneys.

METHODS

Source of Data

This study was based on information concerning graft and patient survival from the Scientific Registry of Transplant Recipients, as collected by the Organ Procurement and Transplantation Network. All recipients of first cadaveric kidney transplants in the United States between March 6, 1995 and November 30, 2000 were eligible for inclusion in the study. Data about donor and recipient characteristics were collected at the time of transplantation. Follow-up information was collected at 6 months, then yearly after transplantation. Patients were excluded if they had had a multiorgan transplant. Patients were also excluded if they were missing donor information on predictor variables of interest for graft outcomes (20.9%). After exclusions, the study sample numbered 29,068.

Analytical Methods

All statistical analyses were performed using SAS 8.0 (9). Predictors of time to graft failure were investigated using Cox regression models. Time to graft loss was assessed as time from date of transplantation to the date of graft failure or death, whichever came first. The following donor variables were included in these models: age at death (by decade), gender, race, year of donation, history of diabetes, history of hypertension, impaired renal function (terminal serum creatinine >1.5 mg/dL), and cause of death. Adjustments for recipients were also made to assess more clearly the graft outcomes by donor factors, independent of recipient characteristics. The following recipient factors were included in the models: age, gender, race, ethnicity, body mass index (BMI), cause of end-stage renal disease (ESRD) (diabetes, hypertension, glomerulonephritis, or other), time on ESRD therapy, pretransplantation blood transfusions (yes/no), number of HLA mismatches (MM), panel-reactive antibody (PRA) level (0–9, 10–79, and >80), and CIT.
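The authors report using SAS 8.0; purely as an illustration, a roughly equivalent Cox model could be fitted in Python with lifelines. This is a minimal sketch, not the authors' code: the data file and every column name below are assumptions, not the registry's actual fields.

```python
# Minimal sketch (not the authors' SAS code) of the Cox model of time to graft
# failure or death; all file and column names are assumed, not registry fields.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("first_cadaveric_kidney_tx.csv")  # hypothetical registry extract

donor_vars = ["donor_age_decade", "donor_sex", "donor_race", "donor_year",
              "donor_diabetes", "donor_hypertension",
              "donor_creat_gt_1_5", "donor_cause_of_death"]
recipient_vars = ["recip_age", "recip_sex", "recip_race", "recip_ethnicity",
                  "recip_bmi", "esrd_cause", "esrd_years", "pretx_transfusion",
                  "hla_mismatch", "pra_category", "cold_ischemia_hr"]

# One-hot encode the categorical predictors so each level gets its own
# coefficient; the 10-39 year donor age group serves as the reference category.
model_df = pd.get_dummies(
    df[donor_vars + recipient_vars + ["years_to_event", "graft_failed_or_died"]],
    drop_first=True,
)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="years_to_event", event_col="graft_failed_or_died")
cph.print_summary()  # exponentiated coefficients approximate the adjusted RRs reported here
```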

Expanded donors were defined as those donors with characteristics that yielded a relative risk (RR) of graft loss greater than 1.7 compared with the risk associated with donors aged 10 to 39 years with a terminal serum creatinine less than or equal to 1.5 mg/dL and with neither hypertension nor cerebrovascular accident (CVA) as cause of death, with other factors held constant.
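Continuing that illustrative sketch, the RR-based definition can be expressed as a per-donor hazard ratio relative to the reference donor profile; the column names remain assumptions.

```python
# Sketch of the expanded donor definition: the donor-factor hazard ratio relative
# to the reference profile (age 10-39, creatinine <= 1.5 mg/dL, no hypertension,
# non-CVA death), with recipient factors held constant by the adjusted model.
import numpy as np

donor_coef_cols = [c for c in cph.params_.index if c.startswith("donor_")]
donor_log_hr = (model_df[donor_coef_cols] * cph.params_[donor_coef_cols]).sum(axis=1)

# The reference donor has all donor dummies equal to zero, so its log hazard
# contribution is 0 and the exponentiated value below is the RR used in the text.
donor_rr = np.exp(donor_log_hr)
expanded = donor_rr > 1.7  # the cutoff proposed in this study
```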

An additional Cox model was developed to evaluate the net effects that would be expected from projected shorter CIT and allocation without regard to HLA matching beyond zero mismatch, as proposed at the Crystal City conference. We used a logistic regression model to evaluate recipient characteristics associated with receipt of expanded organs under the current allocation system. The proportion of procured kidneys that were never transplanted (i.e., discards) was also calculated.
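Likewise, a minimal sketch of the logistic model of receiving an expanded kidney, and of the discard proportion, might look as follows (statsmodels rather than the authors' SAS; file and column names are assumptions).

```python
# Sketch of the logistic regression for receipt of an expanded kidney under the
# current allocation system; formula terms mirror the recipient factors above.
import pandas as pd
import statsmodels.formula.api as smf

logit_fit = smf.logit(
    "expanded_kidney ~ recip_age + C(recip_sex) + C(recip_race) + C(recip_ethnicity)"
    " + recip_bmi + C(esrd_cause) + esrd_years + C(pra_category)"
    " + hla_mismatch + cold_ischemia_hr",
    data=df,           # the same hypothetical DataFrame sketched above
).fit()
print(logit_fit.summary())  # odds ratios are exp() of the reported coefficients

# Discard proportion among all procured kidneys (procured but never transplanted)
procured = pd.read_csv("procured_kidneys.csv")  # hypothetical file of all procurements
discard_rate = 1.0 - procured["was_transplanted"].mean()
```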

RESULTS

The distribution of donor and recipient factors of interest is provided in Table 1. All factors listed in this table are included in all subsequent statistical models. A Cox model (not shown) of time to graft failure or death indicated that donor age groups 10 to 19 years, 20 to 29 years, and 30 to 39 years had very similar risks of failure; therefore, these groups were combined into one reference donor age group. Four donor characteristics were independently associated with a significantly increased risk: older age (40 years or older) or age younger than 10 years, impaired renal function (terminal serum creatinine greater than 1.5 mg/dL), history of hypertension, and CVA as the cause of donor death. A history of diabetes mellitus was present in 2.9% of donors but was not significantly associated with graft failure in the adjusted analyses. Among the causes of death, CVA accounted for 39% of deaths and was significantly associated with poorer graft survival (adjusted RR=1.14). Head trauma and anoxic brain death accounted for 48.9% and 9.4% of donor deaths, respectively, but were not significantly associated with graft outcomes. Black donor race (versus all other races) and female donor gender (versus male) were associated with increased graft loss (RR=1.25, P<0.0001 and RR=1.08, P=0.0035, respectively); these factors were not considered as relevant variables in this study based on practical considerations.

Table 1: Description of donor and recipient factors

The distribution of transplant recipients across the four relevant donor characteristics is shown in Table 2. Correspondingly, Table 3 gives the observed RR of graft failure for recipients according to the same strata of donor characteristics. The magnitude of the RRs displayed in Table 3 permits identification of different potential cutoff points for defining organs at relatively high risk of graft failure based on donor characteristics. RR values above 1.7 are highlighted in bold, but the table could be used to choose alternative cutoff points. For example, a lower RR cutoff of greater than 1.5 categorizes 27.4% of recipients as having received high-risk donor kidneys, whereas a higher cutoff of greater than 2.0 categorizes only 7.5% of recipients in this way. An intermediate RR level of greater than 1.7 yields 14.8% of recipients as having received kidneys at an elevated risk of graft loss. The greater than 1.7 cutoff point almost doubles the size of the expanded donor pool compared with using donor age older than 60 years as the sole criterion, which would categorize only 8.5% of recipients as having received high-risk kidneys.

Table 2: Number of recipients (and percent) by four donor characteristics

Table 3: Relative risk of graft loss by four donor characteristics
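The cutoff comparison above reduces to tabulating the share of transplanted kidneys whose donor RR exceeds each candidate threshold. Continuing the illustrative sketch (donor_rr is the assumed per-donor relative risk computed earlier, not a registry field), this might read:

```python
# Share of recipients above each candidate RR cutoff, mirroring the 27.4%, 14.8%,
# and 7.5% figures reported in the text (illustrative data, so values will differ).
for cutoff in (1.5, 1.7, 2.0):
    share = (donor_rr > cutoff).mean()
    print(f"RR > {cutoff}: {share:.1%} of transplanted kidneys")
```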

Serum creatinine greater than 1.5 mg/dL was associated with a 10% higher risk of graft failure (RR=1.10, P =0.04). Other levels of serum creatinine were also explored and found less useful for determining risk of graft failure (serum creatinine >1.3 mg/dL [RR=1.06, P =0.15] and serum creatinine >2.0 mg/dL [RR=1.01, P =0.95]). Creatinine clearance (<60 mL/min), as estimated by the Cockcroft-Gault formula, yielded essentially the same RR estimates (RR=1.10, P =0.02) when entered into the model in place of the serum creatinine.
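The Cockcroft-Gault estimate referenced for the less-than-60 mL/min threshold is the standard published formula; the following small helper (with illustrative parameter names) shows the calculation.

```python
def cockcroft_gault_crcl(age_yr: float, weight_kg: float,
                         serum_creatinine_mg_dl: float, female: bool) -> float:
    """Estimated creatinine clearance (mL/min) by the Cockcroft-Gault formula."""
    crcl = (140.0 - age_yr) * weight_kg / (72.0 * serum_creatinine_mg_dl)
    return 0.85 * crcl if female else crcl

# Example: a 55-year-old, 80-kg male donor with a terminal creatinine of 1.6 mg/dL
# has an estimated clearance of about 59 mL/min, just below the 60 mL/min threshold.
impaired_clearance = cockcroft_gault_crcl(55, 80, 1.6, female=False) < 60.0
```

As noted in the Discussion, such estimates assume a steady-state creatinine, which cannot be guaranteed in the premortal period.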

The fields that contain bolded numbers in Table 3 provide an example of a decision matrix based on RR greater than 1.7 to define an expanded donor pool. All donors older than 60 years and those aged 50 to 59 years who had two or more additional donor risk factors fell into this category. The elevated risk estimates for some young pediatric donors (age <10 years) are less precise, being based on small numbers (Table 2).

To allow better interpretation of the impact of the relative risk of graft loss on short-term and longer-term graft survival, Table 4 presents survival percentages at 3 months and at 1 and 3 years, based on the same Cox models. The results are shown for all patients with RR greater than 1.7 and also for subgroups with RR of 1.7 to 2.0, RR of 2.0 to 2.5, and for the small group with RR greater than 2.5. Evaluation of the graft loss risk pattern for CVA as cause of death versus all other causes showed that the adjusted risk of graft loss did not differ for the first 2.5 months but was subsequently increased for the CVA group.
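The fixed-time survival estimates in Table 4 come from the same adjusted Cox models; in the illustrative lifelines sketch above, comparable figures could be extracted as follows (times in years; the RR bands mirror the subgroups named in the text).

```python
# Adjusted survival at 3 months, 1 year, and 3 years for each transplant in the
# illustrative model; averaging within RR bands approximates the Table 4 layout.
covariates = model_df[cph.params_.index]
surv = cph.predict_survival_function(covariates, times=[0.25, 1.0, 3.0])  # rows: times, columns: subjects

rr_band = pd.cut(donor_rr, bins=[0.0, 1.7, 2.0, 2.5, float("inf")],
                 labels=["<=1.7", "1.7-2.0", "2.0-2.5", ">2.5"])
table4_like = surv.T.groupby(rr_band).mean()  # rows: RR band; columns: time points
print(table4_like)
```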

Table 4: Graft survival (death censored) at 3 months, 1 year, and 3 years

Organs procured for transplantation were substantially more likely to be discarded when their donor characteristics suggested an increased risk of graft loss. Of organs with an RR greater than 1.5, 28.7% were discarded, whereas 38.2% of those with RR greater than 1.7 and 46.0% of those with RR greater than 2.0 were discarded. These percentages were substantially higher than the 9% discard rate for all donor organs with RR less than 1.7 and the 4.5% discard rate for the lowest risk group (RR=1.0, the reference group for the risk analyses).

To test whether other factors may have contributed to the discard of expanded donor organs, we evaluated outcomes among pairs of expanded donor kidneys in which one mate was discarded versus pairs in which both were transplanted. Of 3680 expanded organs that were transplanted, 1058 had the mate discarded. The risk of failure for these transplanted single mates was significantly greater than for expanded donor kidneys whose mates were also transplanted (RR=1.22, P=0.014). Because this model was adjusted for all relevant donor and recipient characteristics, including CIT, these factors did not explain the increased risk. It therefore seems that additional factors were recognized clinically that led to the discard of the mate kidney.

Recipients of expanded donor organs, here defined by RR greater than 1.7, differed significantly from all other recipients in several characteristics by multivariate analysis. As shown in Table 5, such recipients were older (by 6.3 years on average) and more likely to be white, non-Hispanic, female, and diabetic, with a PRA less than 10. When CIT was added to the model, these recipients were also significantly more likely to have a longer CIT, averaging 21.2 hr versus 20.2 hr; the other odds ratios were essentially unchanged when CIT was added.

Table 5: Predictors of receiving an expanded versus nonexpanded kidney

Additionally, we addressed the question of whether expanded donor organs carried a greater risk for older versus younger recipients and found a significant interaction between expanded kidneys and recipient age (P=0.003). When the RR of graft failure greater than 1.7 was used to define an expanded kidney, the RR for recipients older than 55 years was 1.77, compared with an RR of 1.58 for recipients younger than age 50. Correspondingly, the effect of recipient age (≥50 years versus <50 years) on graft failure was greater for recipients of expanded kidneys (RR=1.30) than for recipients of nonexpanded kidneys (RR=1.14). Therefore, expanded kidneys had a greater negative impact on older recipients. We also evaluated the use of double, en bloc transplants and found that these were used more frequently for expanded donor kidneys than for all others (5.7% versus 2.6%). This low frequency and small difference, however, did not permit detailed analysis; additionally, concerns about confounding by indication may limit interpretation of outcomes for double, en bloc transplants.

We also evaluated the effect of excluding expanded donor organs from HLA matching and sharing in order to reduce CIT, as proposed at the 2001 Crystal City meeting. We first determined that the risk associated with CIT was clearly elevated above 24 hr for expanded organs and, similarly, for all other renal transplants (data not shown). Below 24 hr of CIT, however, the risk did not vary significantly, even when comparing less than 6 hr with 18 to 24 hr. Table 6 provides the distribution of transplants and the number of actual graft losses among 3680 expanded donor transplant recipients by groupings of CIT above and below 24 hr, zero versus 1 to 6 MM, and shared versus local distribution. In addition, the table shows the expected number of graft losses if allocation without regard to HLA matching and with CIT less than 24 hr had been used. (This does not include shared 1–6 MM organs because the proposal assumes no sharing.) As expected, the 1 to 6 MM, local, less than 24 hr CIT group was the largest, and outcomes for this group would not be affected by the new policy proposed at Crystal City. However, distributing all zero MM organs without regard to HLA matching would have led to 16.7 additional graft losses, even if sharing were eliminated and CIT was always less than 24 hr (i.e., 138.7 expected versus 122 observed, lines 1–4 in Table 6). This adverse effect of the proposed policy would be balanced by 25.2 fewer graft losses through shortened CIT (line 6 in Table 6). The net reduction of 8.6 graft failures, relative to 958 observed failures, translates to less than 1% fewer failures, or 90.1% instead of 90% graft survival.

Table 6: Estimated annual effect of proposed policy for expanded donors (RR > 1.7)
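Purely as an arithmetic restatement of the projection above (all values taken from the text; the small difference from the quoted 8.6 reflects rounding of the published figures):

```python
# Net effect of allocating expanded donor kidneys without HLA matching beyond
# zero MM, assuming CIT is always kept below 24 hr (figures from Table 6 as quoted).
extra_losses_without_hla = 138.7 - 122       # ~16.7 additional graft losses
fewer_losses_short_cit = 25.2                # projected savings from CIT < 24 hr
net_fewer_failures = fewer_losses_short_cit - extra_losses_without_hla  # ~8.5
relative_change = net_fewer_failures / 958   # <1% of the 958 observed graft failures
```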

DISCUSSION

This study identifies four simple donor characteristics that are significantly and independently associated with a substantially increased risk of failure of cadaveric kidney transplants. Based on these risk estimates, a decision matrix can be developed to identify organs from expanded donors. The RR level used for this assessment may be selected from the matrix in Table 3. We suggest here an RR cutoff level of greater than 1.7, that is, a 70% higher risk of graft failure. This means that approximately 17 graft failures per 100 transplants would be observed in this group compared with 10 per 100 in the lowest risk group, corresponding to 1-year graft survival rates of approximately 83% versus 90%, respectively. Using this RR level, 14.8% of cadaveric kidneys actually transplanted during 1995 to 2000 belonged to this category. Choosing a lower RR cutoff, for example, greater than 1.5, would yield a higher percentage of transplant recipients with expanded organs (27.4% versus 14.8%) but a smaller increment in graft loss risk for the group so defined. By contrast, choosing RR greater than 2.0 would identify organs with a greater impact on graft outcome but would affect a smaller percentage (7.5%) of recipients.
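The translation from an RR cutoff to these survival figures follows from the proportional hazards relation; as a sketch, assuming a reference 1-year graft survival of about 90%:

```latex
S_{\text{expanded}}(t) = \bigl[S_{\text{reference}}(t)\bigr]^{\mathrm{RR}},
\qquad
S_{\text{expanded}}(1\,\mathrm{yr}) \approx 0.90^{1.7} \approx 0.836
```

that is, roughly 16 to 17 graft failures per 100 transplants versus 10 per 100 in the reference group.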

Other researchers have chosen donor age older than 60 years or other algorithms to simplify identification of organs with elevated risk of transplant failure (6). The algorithm proposed here is based on four readily available variables: donor age, CVA as cause of donor death, serum creatinine level, and history of hypertension, each being significantly and independently associated with substantially increased risk of graft loss. Other factors that were considered in this analysis were not found to be significant, such as diabetes in the donor, duration of diabetes, or duration of prior hypertension. Serum creatinine level greater than 1.5 mg/dL has been proposed here as a criterion, despite its known limitations, because of its simplicity for quick assessment during organ retrieval. Additionally, formulae for estimation of renal function from serum creatinine require a stable steady-state creatinine level. In the premortal state, however, stability of serum creatinine level cannot be assumed.

Outcomes for these organs are, of course, less positive than for lower risk organs, as shown by Alfrey et al. (6) and Ojo et al. (5). Ojo et al. (5), however, have shown that the survival of recipients of expanded donor kidneys, defined somewhat differently, is still superior compared with the survival of the relevant reference group of similar dialysis patients on the transplant waiting list. Although it takes longer to achieve cumulatively better outcomes with expanded kidneys than with transplants from more ideal donors, better outcomes are observed within 1.5 years of transplantation. Therefore, one may conclude that transplant candidates with a life expectancy of greater than 1.5 years on dialysis would benefit from transplants involving expanded kidneys.

An upper level of risk at which transplantation should be avoided was not explored here. The analyses of Ojo et al. (5) are encouraging in this context because they document a benefit from transplantation even though the RR levels of patients older than 60 years tend to be higher than for the high risk groups of patients aged 50 to 59 years included here (Table 3). However, in the current study, the highest risk group of donors older than 60 years with all three additional risk factors (RR=2.69) deserves caution and further study. Of note, the sample size for this group was very small, comprising only 0.5% of all transplanted kidneys. A potential limitation of the present study is the exclusion of 20.9% of patients because of missing information on some of the donor characteristics. The 79% of patients included here, however, provide correlational evidence based on the large majority of U.S. transplant recipients.

The Crystal City proposal considered the relative impacts of HLA matching and CIT on outcomes for expanded donor kidneys. Our adjusted analyses show that the benefits of a shorter CIT slightly outweigh the benefits of HLA matching, if CIT can be kept below 24 hr for all expanded donor kidneys. This small difference (∼1% overall change in graft survival at 1 year) may be considered to be essentially neutral. The previously observed correlation of CIT with graft failure (10) was only observed for CIT longer than 24 hr in the current study. This may be related to the use of better organ preservation methods than those used a decade earlier.

One major benefit of the Crystal City proposal is the potential to reduce the discard rate of retrieved organs, given that a patient's willingness to accept an elevated risk of graft failure would be known in advance. An upper bound on the magnitude of this reduction can be estimated: our analysis suggests that the discard rate for expanded donors could be reduced from the current 39% toward the 9% observed for all other donors. Another benefit of the proposal would be a cost reduction from less shipping of expanded donor organs.

The outcomes for expanded kidneys with discarded mates were significantly worse than the outcomes for expanded kidneys with transplanted mates. This finding suggests that clinicians were able to use additional as-yet-unquantified parameters to estimate the prognoses and reject (discard) organs with relatively poor predicted outcomes. It is also possible that information of poor intraoperative performance of the first kidney was shared with the team planning the transplant of the mate, which may have led to discarding of that organ. Nevertheless, the benefit of a greater utilization of these procured organs is shorter waiting times and a corresponding reduction in the number of deaths on the waiting list. Another benefit shown in recent studies is a lower risk of graft failure with shorter time on dialysis (7,11). Thus, a reduction in waiting time will partly balance the increased risk associated with expanded donor organs.

The present study also evaluated whether expanded donor kidneys should be given preferentially to older patients. The evidence provided here does not support this notion. It does, however, underscore the importance of informed patient choice regarding placement in the expanded donor kidney pool.

The experience in Spain over the last decade shows that cadaveric donor kidney transplantation can be increased sufficiently for waiting times to actually decrease (4,8). Acceptance of kidneys from older donors in Spain is nearly twofold higher than in the United States per million population, whereas procurement and transplantation rates using younger donors are virtually the same in both countries. The successes in Spain are certainly encouraging and warrant further consideration.

Economic issues were not addressed in the current study. It is clear, however, that substantial resources are wasted procuring organs that are subsequently discarded. The cost to procure an organ is commonly estimated at $25,000 to $50,000. A discard rate of nearly 40% for expanded donor kidneys is therefore an important and potentially modifiable cost consideration. In addition, there are the costs of alternative treatment for patients who remain on dialysis without a transplant. For example, during the period of this study, a reduction in the discard rate from 40% to 20% would have had a major cost benefit and enabled approximately 2000 additional patients to receive a transplant. The differential in direct medical cost to Medicare between dialysis and a functioning transplant is approximately $26,000 per year (12). The implied cost savings for 2000 functioning transplants would be about $52 million after the first year (the first-year cost, however, is much higher). Even without assigning a monetary value to the improved quality of life and added life years with a functioning graft compared with dialysis, the social benefit of additional transplants is substantial, both because of the procurement costs spent on organs that go unused and because of the lower cumulative direct medical cost of shifting patients from dialysis to transplantation.
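As a back-of-the-envelope check of the figures in this paragraph (values from the text; first-year costs excluded, as noted):

```python
# Illustrative arithmetic for the quoted Medicare cost differential
annual_cost_differential = 26_000  # dialysis minus functioning transplant, per patient-year (12)
additional_transplants = 2_000     # projected from reducing the expanded donor discard rate
annual_savings = annual_cost_differential * additional_transplants  # $52,000,000 after the first year
```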

CONCLUSION

Cadaveric kidney transplantation may be enhanced by reducing the discard rate of procured organs. Although the magnitude of this increase in expanded donor kidney transplants is difficult to predict, we speculate that about 10% more kidneys could be transplanted than at present. Knowing in advance which patients are willing to receive kidneys known to have an elevated risk of failure is expected to lead to faster placement and transplantation of such kidneys. The impact of this approach will need to be reassessed after a substantial increase in transplantation of expanded donor kidneys has occurred. Perhaps additional donor factors can be considered in the future such as the premortal trend in serum creatinine level. Future studies also will need to evaluate donor factors that identify kidneys with predicted outcomes so poor that they should not be procured.

REFERENCES

1. United States Renal Data System. USRDS 1999 annual data report. Am J Kidney Dis 1999; 34 (suppl 1): S1.
2. UNOS. 2000 OPTN/SRTR annual report 1990–1999: HHS/HRSA/OSP/DOT; UNOS, 2000.
3. Ojo AO, Wolfe RA, Leichtman AB, et al. A practical approach to evaluate the potential donor pool and trends in cadaveric kidney donation. Transplantation 1999; 67: 548.
4. Miranda B, Fernandez Lucas M, de Felipe C, et al. Organ donation in Spain. Nephrol Dial Transplant 1999; 14 (suppl 3): 15.
5. Ojo AO, Hanson JA, Meier-Kriesche H, et al. Survival in recipients of marginal cadaveric donor kidneys compared with other recipients and wait-listed transplant candidates. J Am Soc Nephrol 2001; 12: 589.
6. Alfrey EJ, Lu AD, Carter JT, et al. Matching does not improve outcome from aged marginal kidney donors. Transplant Proc 2001; 33: 1162.
7. Meier-Kriesche H, Port FK, Ojo AO, et al. Deleterious effect of waiting time on renal transplant outcome. Transplant Proc 2001; 33: 1204.
8. Miranda B, Gonzalez Alvarez I, Cuende N, et al. Update on organ donation and retrieval in Spain. Nephrol Dial Transplant 1999; 14: 842.
9. SAS Institute Inc. SAS/STAT User’s Guide, Version 8. Cary, NC: SAS Institute Inc., 1999.
10. Held PJ, Kahan BD, Hunsicker LG, et al. The impact of HLA mismatches on the survival of first cadaveric kidney transplants. N Engl J Med 1994; 331: 765.
11. Mange KC, Joffe MM, Feldman HI. Effect of the use or nonuse of long-term dialysis on the subsequent survival of renal transplants from living donors. N Engl J Med 2001; 344: 726.
12. U. S. Renal Data System. USRDS 1996 Annual Data Report. National Institutes of Health, National Institute of Diabetes and Digestive and Kidney Diseases, Bethesda, MD, April 1996:129.
© 2002 Lippincott Williams & Wilkins, Inc.