With increased interest in tolerance and the use of lower immunosuppression (IS) (1–4), detecting the breakdown of tolerance has become important: an indication of inadequate IS before actual failure of the graft is vital. Equally vital, when patients are being weaned from IS, is early warning for those who require more drugs. We suggest here that monitoring the formation of donor-specific antibodies (DSA) against human leukocyte antigens (HLA) is a simple telltale of escape from tolerance. The appearance of HLA-specific antibody, especially DSA, before chronic organ failure has been reported in many studies (5, 6). Once antibodies are detected, their reduction has been associated with long-term renal allograft survival (7). Therefore, early detection of HLA antibodies is a simple and effective way to prevent graft failure resulting from inadequate IS.
Seventy-two patients (93% men; mean age, 29.2±8.3 years) who did not have preexisting DSA, and who were receiving stable medication, were enrolled in this study. Relationship to donor, number of HLA-A, -B, or -DR mismatches (ABDR mismatch), mean follow-up, and outcome are summarized in Table 1.
As shown toward the bottom of Table 1, 35 patients (49%) developed DSA at some point during the observation period. Of these, 43% developed class I DSA, 46% class II, and 11% both classes simultaneously. The mean duration from transplantation to appearance of DSA (D0) was 9.4±6.3 months. The proportion and type of DSA differed across the three maintenance IS dose groups: the proportions of patients developing DSA in the low-, middle-, and high-dose groups were 69%, 58%, and 13%, respectively (P<0.001).
The cumulative incidence of DSA development is shown in Figure 1. Overall, DSA developed in 17% of patients at 6 months after transplantation (95% confidence limits [CL]: 10%, 28%), 41% at 1 year (30%, 55%), and 57% at 2 years (44%, 71%). Patients treated with total lymphoid irradiation had a DSA production rate of 58% (95% CL: 41%, 76%) at 2 years, compared with 56% (37%, 77%) for those who received bortezomib.
DSA appearance after recent IS weaning is shown in Figure 2, which gives cumulative risks of positive DSA for the three drug-dose levels. The 1-year rates of DSA production were 80% (95% CL: 60%, 94%), 71% (49%, 90%), and 55% (20%, 74%) in the low-, middle-, and high-dose groups, respectively. The development of DSA, after adjusting for patient’s age and gender, donor’s age and gender, time after transplantation, and ABDR mismatch, is given for the different doses. Note that, again, the lower the drug dose, the higher the rate of DSA production. In addition, the probability of biopsy-proven humoral rejection at 1 year after weaning IS was higher in patients who developed DSA than in those who did not (20% vs. 4%, P=0.047, log-rank test).
Time factors affecting the appearance of DSA are shown in Table 2. Three log-linear regression models were fitted to determine which time factor is most closely associated with the appearance of DSA after adjusting for recipient gender, age, ABDR mismatch, prednisone dose, and use of other IS. As the time variable, model 1 included D0 (years between transplantation and DSA appearance), model 2 included D1 (years from the most recent IS weaning to DSA appearance), and model 3 included D2 (years from the previous IS weaning to DSA appearance). The risk ratio for D0 was 0.70 (95% CL: 0.49, 1.00), for D1 0.82 (0.48, 1.39), and for D2 0.76 (0.49, 1.25). Of the three, D0 showed the strongest association with the appearance of DSA, suggesting that time after transplantation matters more than time since weaning. The reasonable inference is that time after transplantation should determine how much and how often HLA monitoring is needed to detect breakdown of tolerance, or the need for increased IS during weaning, in time to prevent graft loss.
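To make the risk-ratio scale concrete: in a log-linear model, each additional unit of the covariate multiplies the risk by the fitted risk ratio, so the D0 point estimate of 0.70 per year implies a relative risk of 0.70^k at k years. The following is illustrative arithmetic only, not the study's analysis code:

```python
RR_D0_PER_YEAR = 0.70  # point estimate for D0 from Table 2

def relative_risk(years):
    """Risk relative to baseline after `years`, under the log-linear model.
    Illustrative helper; the name and interface are ours."""
    return RR_D0_PER_YEAR ** years
```

For example, `relative_risk(2)` is about 0.49, i.e., under this model the per-patient risk of new DSA appearance declines the longer the graft has been in place.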
There have been many attempts to reduce the need for IS, for example by clonal deletion, and to reduce the level of IS by weaning immediately after transplantation (8, 9) or after some period of stabilization (10–13). The benefits are fewer drug-related adverse events, decreased risk of infection, and lower medical costs; the risks are acute rejection and allograft loss, which can occur without warning (9).
There are safeguards: methods to detect the onset of an immune response in time to prevent graft loss. But most of these methods are in some measure flawed. Biopsies are commonly used but are time-consuming, relatively expensive, and hard on the patient. Tests for cellular immunity can also be considered, although most are too cumbersome for regular use. Most standard measures of function, such as serum creatinine, are not sensitive enough to detect damage because the kidney compensates and can maintain normal creatinine levels despite losing a considerable portion of its nephrons. There are some candidate predictors of early acute kidney injury, such as neutrophil gelatinase-associated lipocalin (14, 15), kidney injury molecule-1 (16, 17), liver fatty acid-binding protein (18, 19), and cystatin C (20), but they have not yet been validated for routine clinical use.
Testing for DSA has none of these disadvantages. It is simple, rapid, cost-effective, and easy on the patient, and it provides the early warning needed. Many studies have shown that once DSA develops, grafts are eventually lost to rejection even while creatinine values remain normal (5, 6, 21–25). The appearance of DSA is therefore the key adverse sign. The British Society for Histocompatibility & Immunogenetics and the British Transplantation Society recommend sending patient serum samples to the histocompatibility laboratory at least every 3 months for routine antibody monitoring after kidney and pancreas transplantation (26). Nonetheless, this study suggests that more frequent routine antibody monitoring should be considered for low IS regimens and during weaning of IS. Cooper et al. (27) reported that screening at 1, 6, 12, and 24 months after transplantation under a standard maintenance IS protocol failed to predict acute rejection, and that 62% and 91% of DSA were detected within 1 and 6 months after transplantation, respectively. Similar results were reported by Gill et al. (55% and 73%, respectively) (28). Therefore, monthly screening for DSA for at least the first 6 months after transplantation may be adequate, and even more frequent testing may be needed within the first month after transplantation or after weaning IS. The current data suggest a monthly schedule, which we have followed for the past 4 years. However, this schedule may not apply to patients under higher IS protocols: the current patients were on extremely low doses of drugs compared with usual patients.
We observed an association between the appearance of DSA and IS level: the risk of DSA development was greater, and DSA appeared earlier, at low dosage. Thus, the risk of DSA development is inversely related to drug dosage, implying that greater care must be exercised with low doses of drugs. Although this is not unexpected, this study shows that DSA can be readily detected and that its detection serves as an early warning that the patient has escaped from a state of tolerance. This warning is extremely valuable in preventing ultimate failure of the graft, because several studies now show that prompt reduction of the antibodies results in better graft survival (7, 29). With DSA providing advance warning, patients can more safely be moved toward weaning procedures.
To our knowledge, this is the first study showing the rate of DSA after reduced levels of IS. For patients undergoing standard treatment (e.g., a triple-therapy regimen), the rate of HLA antibody formation was variously reported as 12% to 60% before the year 2000 (30). Similar rates have been reported in this decade: 27.2% (31) and 16% (28, 32). The rates of DSA formation specifically have been reported as 2.5% (32), 11% (28), and 24% (33), with most DSAs observed less than 3 months after transplantation (28, 33). Berga et al. (32) reported a very low incidence of DSA (2.5%) after kidney transplantation. The incidence of DSA varies with IS level and time after transplantation, as well as with other factors such as histoincompatibility and presensitization; the rates of DSA development can therefore be expected to vary between reports. It is reasonable to assume that the rate of DSA is higher after weaning of IS than under standard triple-drug treatment. In the tolerance studies of Kawai et al. (34), the rate of positive DSA was 3 of 5 (60%) at 4 to 7 years, a rate similar to that of our protocol, confirming that DSA appears even under tolerance induction protocols. As cited above, elimination of the antibodies is associated with longer graft life; thus, tolerance induction protocols would benefit from DSA monitoring.
In conclusion, monitoring for DSA seems to be a simple, effective means to detect deterioration of tolerance. Because DSA may be present for some time without permanent damage, it may be safe to monitor on a monthly basis. Reduction in IS therapy also requires a means to determine whether reduction has gone too far. Again, DSA monitoring can provide a simple warning of individual patients’ need for more medication.
MATERIALS AND METHODS
The clonal deletion protocol and patients’ informed consent forms were approved by the institutional review board. All patients provided fully informed consent and received a transplant from a living donor at the Institute of Kidney Disease and Research Centre-Institute of Transplantation Science in Ahmedabad, India. All included patients underwent clonal deletion; those who later stabilized on some level of IS were then weaned at various stages of their recovery. Patients whose maintenance IS doses changed frequently (within <1 month) were excluded from the study. The clonal deletion protocol was of two types: total lymphoid irradiation (35) and treatment with bortezomib (36, 37).
Appearance of DSA After Transplantation
In addition to the usual monthly physical and laboratory examination, including urine volume, body temperature, serum creatinine, hemoglobin, and medication status, all patients’ serum samples were tested for the presence of DSA.
For the weaning studies, recipients’ daily IS doses at the last observation were categorized as low dose (no IS, or <10 mg of prednisone), middle dose (10 mg of prednisone or more, but <20 mg), and high dose (20 mg of prednisone and/or a combination of prednisone and other IS [mycophenolate mofetil, tacrolimus, cyclosporine, or sirolimus]). The appearance of DSA after weaning was monitored to determine differences in the rate of appearance by IS dose category. Note that these categories differ slightly from those cited above because of the different purposes served.
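The grouping rule above can be sketched as a small helper. This is an illustrative sketch; the function and argument names are ours, and only the thresholds come from the definitions above:

```python
def categorize_dose(prednisone_mg, other_is):
    """Assign a recipient's daily maintenance IS at last observation to a
    dose group, per the definitions in the text (illustrative helper).

    high:   prednisone >= 20 mg/day, and/or prednisone combined with
            another IS drug (MMF, tacrolimus, cyclosporine, sirolimus)
    middle: 10 mg <= prednisone < 20 mg/day
    low:    no IS, or prednisone < 10 mg/day alone
    """
    if other_is or prednisone_mg >= 20:
        return "high"
    if prednisone_mg >= 10:
        return "middle"
    return "low"
```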
Factors Affecting Appearance of DSA
Examining every longitudinal serum sample of every patient, we determined the time of DSA appearance (TDSA). Every patient was followed until TDSA or April 30, 2011, whichever came first. We then calculated the duration from the date of transplantation to TDSA or end of observation (D0), the duration from the date of current drug weaning to TDSA (D1), and the duration from the date of previous drug weaning to TDSA (D2) (Fig. 3), and compared these three durations using multiple log-linear regression models that included recipient gender, age (per 10 years), number of HLA mismatches (ABDR), prednisone dose (per 2.5 mg/day), and use of other IS.
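The three durations reduce to ordinary date arithmetic; a minimal sketch (names are ours, not the study's code):

```python
from datetime import date

DAYS_PER_YEAR = 365.25

def durations_years(transplant, prev_wean, curr_wean, end):
    """Return (D0, D1, D2) in years, where `end` is TDSA or, for censored
    patients, the end of observation (April 30, 2011). Illustrative only."""
    d0 = (end - transplant).days / DAYS_PER_YEAR  # transplantation -> end
    d1 = (end - curr_wean).days / DAYS_PER_YEAR   # current weaning -> end
    d2 = (end - prev_wean).days / DAYS_PER_YEAR   # previous weaning -> end
    return d0, d1, d2
```

For a hypothetical patient transplanted January 1, 2008, weaned July 1, 2008 and again January 1, 2009, with DSA appearing January 1, 2010, this gives roughly D0 = 2.0, D2 = 1.5, and D1 = 1.0 years.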
Measurement of DSA
Detection of HLA antibodies was performed using LABScreen Mixed and/or Single Antigen beads (One Lambda Inc., Canoga Park, CA) according to the manufacturer’s protocol. Mixed antigen beads were used as the first screening test, followed by the single antigen test to identify DSA. Any normalized median fluorescence intensity (MFI) over 1000 was considered positive for HLA antibodies. Patients were tested biweekly for the first 3 months and monthly for another 3 months; after those 6 months, patients were examined monthly or every 2 months.
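The two-stage decision rule can be sketched as follows. The helper and the mapping of specificities to MFI values are hypothetical; only the 1000-MFI cutoff and the mixed-screen-then-single-antigen order come from the protocol above:

```python
MFI_CUTOFF = 1000  # normalized MFI above which a bead is called positive

def screen_dsa(mixed_screen_positive, single_antigen_mfi, donor_hla):
    """Return the donor-specific HLA specificities called positive.

    mixed_screen_positive: result of the first-pass mixed-bead screen
    single_antigen_mfi: dict mapping HLA specificity -> normalized MFI
    donor_hla: set of the donor's (mismatched) HLA antigens
    """
    if not mixed_screen_positive:
        return []  # single-antigen testing only follows a positive screen
    return sorted(spec for spec, mfi in single_antigen_mfi.items()
                  if mfi > MFI_CUTOFF and spec in donor_hla)
```

Here `screen_dsa(True, {"A2": 2500, "B7": 800, "DR4": 1500}, {"A2", "DR4"})` would call A2 and DR4 as DSA, while B7 falls below the cutoff.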
Data were summarized using proportions and means ± standard deviation as appropriate. Categorical variables were analyzed with the chi-square test or Fisher’s exact test as appropriate, and continuous variables were compared using analysis of variance. Cumulative event risks were estimated with Kaplan-Meier survival curves and compared using the log-rank test. The Cox proportional hazards model was used to obtain hazard ratios and 95% CL for appearance of DSA. Because the frequency of the outcome is too high to interpret logistic regression coefficients as log relative risks, log-linear risk models (38) were used instead to estimate the risk ratio and 95% CL for appearance of DSA based on recipient’s sex, age, ABDR mismatch, and maintenance drug dose (prednisone dose and use of other IS). All analyses were carried out using Stata version 10.1 (39).
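The cumulative risks reported above come from Kaplan-Meier estimates; as a minimal stdlib sketch of the product-limit calculation (not the Stata code actually used):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate S(t).

    times:  follow-up times (e.g., months to DSA or censoring)
    events: 1 if DSA appeared at that time, 0 if censored
    Returns a list of (time, S(t)) steps; cumulative DSA risk is 1 - S(t).
    """
    data = sorted(zip(times, events))
    at_risk, surv, steps = len(data), 1.0, []
    i = 0
    while i < len(data):
        t, d, n = data[i][0], 0, 0
        # group all subjects sharing this event/censoring time
        while i < len(data) and data[i][0] == t:
            d += data[i][1]
            n += 1
            i += 1
        if d:  # survival drops only at event times, not censoring times
            surv *= (at_risk - d) / at_risk
            steps.append((t, surv))
        at_risk -= n
    return steps
```

For example, `kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0])` yields `[(1, 0.75), (3, 0.375)]`: the censored subjects at times 2 and 4 leave the risk set without lowering the curve.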
The authors thank Dr. H.L. Trivedi for providing data on his patients for this analysis.
1. Orlando G, Hematti P, Stratta RJ, et al. Clinical operational tolerance after renal transplantation: Current status and future challenges. Ann Surg 2010; 252: 915.
2. Sayegh MH, Remuzzi G. Clinical update: Immunosuppression minimisation. Lancet 2007; 369: 1676.
3. Vincenti F. Immunosuppression minimization: Current and future trends in transplant immunosuppression. J Am Soc Nephrol 2003; 14: 1940.
4. Hricik DE. Steroid-free immunosuppression in kidney transplantation: An editorial review. Am J Transplant 2002; 2: 19.
5. Mizutani K, Shibata L, Ozawa M, et al. Detection of HLA and MICA antibodies before kidney graft failure. Clin Transpl 2006; 255.
6. Ozawa M, Rebellato LM, Terasaki PI, et al. Longitudinal testing of 266 renal allograft patients for HLA and MICA antibodies: Greenville experience. Clin Transpl 2006; 265.
7. Everly MJ, Rebellato LM, Ozawa M, et al. Beyond histology: Lowering human leukocyte antigen antibody to improve renal allograft survival in acute rejection. Transplantation 2010; 89: 962.
8. Ekberg H, Bernasconi C, Tedesco-Silva H, et al. Calcineurin inhibitor minimization in the Symphony study: Observational results 3 years after transplantation. Am J Transplant 2009; 9: 1876.
9. Vincenti F, Ramos E, Brattstrom C, et al. Multicenter trial exploring calcineurin inhibitors avoidance in renal transplantation. Transplantation 2001; 71: 1282.
10. Ekberg H, Grinyo J, Nashan B, et al. Cyclosporine sparing with mycophenolate mofetil, daclizumab and corticosteroids in renal allograft recipients: The CAESAR Study. Am J Transplant 2007; 7: 560.
11. Vincenti F, Schena FP, Paraskevas S, et al. A randomized, multicenter study of steroid avoidance, early steroid withdrawal or standard steroid therapy in kidney transplant recipients. Am J Transplant 2008; 8: 307.
12. Woodle ES, First MR, Pirsch J, et al. A prospective, randomized, double-blind, placebo-controlled multicenter trial comparing early (7 day) corticosteroid cessation versus long-term, low-dose corticosteroid therapy. Ann Surg 2008; 248: 564.
13. van de Wetering J, Gerrits JH, van Besouw NM, et al. Successful tapering of immunosuppression to low-dose monotherapy steroids after living-related human leukocyte antigen-identical renal transplantation. Transplantation 2009; 87: 740.
14. Mishra J, Dent C, Tarabishi R, et al. Neutrophil gelatinase-associated lipocalin (NGAL) as a biomarker for acute renal injury after cardiac surgery. Lancet 2005; 365: 1231.
15. Haase M, Bellomo R, Devarajan P, et al. Accuracy of neutrophil gelatinase-associated lipocalin (NGAL) in diagnosis and prognosis in acute kidney injury: A systematic review and meta-analysis. Am J Kidney Dis 2009; 54: 1012.
16. Han WK, Waikar SS, Johnson A, et al. Urinary biomarkers in the early diagnosis of acute kidney injury. Kidney Int 2008; 73: 863.
17. Vaidya VS, Ozer JS, Dieterle F, et al. Kidney injury molecule-1 outperforms traditional biomarkers of kidney injury in preclinical biomarker qualification studies. Nat Biotechnol 2010; 28: 478.
18. Portilla D, Dent C, Sugaya T, et al. Liver fatty acid-binding protein as a biomarker of acute kidney injury after cardiac surgery. Kidney Int 2008; 73: 465.
19. Negishi K, Noiri E, Doi K, et al. Monitoring of urinary L-type fatty acid-binding protein predicts histological severity of acute kidney injury. Am J Pathol 2009; 174: 1154.
20. Bonventre JV, Vaidya VS, Schmouder R, et al. Next-generation biomarkers for detecting kidney toxicity. Nat Biotechnol 2010; 28: 436.
21. Singh N, Djamali A, Lorentzen D, et al. Pretransplant donor-specific antibodies detected by single-antigen bead flow cytometry are associated with inferior kidney transplant outcomes. Transplantation 2010; 90: 1079.
22. Sanchez-Fructuoso AI, Santiago JL, Perez-Flores I, et al. De novo anti-HLA antibodies in renal allograft recipients: A cross-section study. Transplant Proc 2010; 42: 2874.
23. Lee PC, Zhu L, Terasaki PI, et al. HLA-specific antibodies developed in the first year posttransplant are predictive of chronic rejection and renal graft loss. Transplantation 2009; 88: 568.
24. Terasaki PI, Ozawa M. Predicting kidney graft failure by HLA antibodies: A prospective trial. Am J Transplant 2004; 4: 438.
25. Worthington JE, Martin S, Al-Husseini DM, et al. Posttransplantation production of donor HLA-specific antibodies as a predictor of renal transplant outcome. Transplantation 2003; 75: 1034.
26. Howell WM, Harmer A, Briggs D, et al. British Society for Histocompatibility & Immunogenetics and British Transplantation Society guidelines for the detection and characterisation of clinically relevant antibodies in allotransplantation. Int J Immunogenet 2010; 37: 435.
27. Cooper JE, Gralla J, Cagle L, et al. Inferior kidney allograft outcomes in patients with de novo donor-specific antibodies are due to acute rejection episodes. Transplantation 2011; 91: 1103.
28. Gill JS, Landsberg D, Johnston O, et al. Screening for de novo anti-human leukocyte antigen antibodies in nonsensitized kidney transplant recipients does not predict acute rejection. Transplantation 2010; 89: 178.
29. Everly MJ, Everly JJ, Arend LJ, et al. Reducing de novo donor-specific antibody levels during acute rejection diminishes renal allograft loss. Am J Transplant 2009; 9: 1063.
30. McKenna RM, Takemoto SK, Terasaki PI. Anti-HLA antibodies after solid organ transplantation. Transplantation 2000; 69: 319.
31. Ozawa M, Terasaki PI, Lee JH, et al. 14th International HLA and Immunogenetics Workshop: Report on the Prospective Chronic Rejection Project. Tissue Antigens 2007; 69 (Suppl 1): 174.
32. Berga JK, Mateu LM, Catalan SB, et al. Donor-specific HLA antibodies: Risk factors and outcomes after kidney transplantation. Transplant Proc 2011; 43: 2154.
33. Piazza A, Poggi E, Borrelli L, et al. Impact of donor-specific antibodies on chronic rejection occurrence and graft loss in renal transplantation: Posttransplant analysis using flow cytometric techniques. Transplantation 2001; 71: 1106.
34. Kawai T, Cosimi AB. Induction of tolerance in clinical kidney transplantation. Clin Transplant 2010; 24 (Suppl 22): 2.
35. Trivedi HL, Kaneku H, Terasaki PI, et al. Clonal deletion using total lymphoid irradiation with no maintenance immunosuppression in renal allograft recipients. Clin Transpl 2009; 265.
36. Trivedi HL, Terasaki PI, Feroz A, et al. Abrogation of anti-HLA antibodies via proteasome inhibition. Transplantation 2009; 87: 1555.
37. Trivedi HL, Terasaki PI, Feroz A, et al. Clonal deletion with bortezomib followed by low or no maintenance immunosuppression in renal allograft recipients. Transplantation 2010; 90: 221.
38. Greenland S. Model-based estimation of relative risks and other epidemiologic measures in studies of common outcomes and in case-control studies. Am J Epidemiol 2004; 160: 301.
39. Stata Version 10.1. College Station, TX: Stata Corporation, 2011.