
Commentaries

Mitigating Risk of Immunosuppression by Immune Monitoring

Are We There?

Bentall, Andrew MBBCh, MD1; Amer, Hatem MD1

doi: 10.1097/TP.0000000000002037

The advent of the calcineurin inhibitors (CNIs), cyclosporine and tacrolimus, heralded a new era in transplantation. The introduction of cyclosporine in the early 1980s reduced rejection rates in kidney allograft recipients from approximately 80% to approximately 40%, with a concomitant improvement in graft survival to more than 80% in the first year after transplant.1 This allowed kidney transplantation to become a mainstream option for patients with end-stage renal failure. It also enabled the development and expansion of other solid organ transplants, including liver, heart, lung, and intestine, and allowed, for the first time, successful transplantation of skin-containing vascularized composite allografts such as the upper extremity and face.

The benefits of reduced rejection rates and improved early allograft survival were not, however, matched by similar gains in long-term graft survival.2 Calcineurin inhibitor toxicity is considered an impediment to long-term survival for both kidney and other organ transplant recipients.3

In our own program, in response to a high prevalence of polyomavirus nephropathy, a modest decrease in tacrolimus target levels of 2 ng/mL resulted in a significant decrease in polyomavirus nephropathy and beneficial effects on metabolic profiles and renal allograft histology (Figure 1).4 This benefit was not without consequences: there was a numerical increase in rejection rates. This highlighted the importance of finding a method to determine which recipients would benefit from a decrease in immunosuppression without an increased risk of rejection. We, like other programs, used pretransplant characteristics in an attempt to reap the benefit of reduced immunosuppression without increasing the risk of acute rejection or the rate of de novo donor-specific antibodies. Currently, this has resulted in the application of 4 different immunosuppressive regimens, deployed at the time of transplantation based on age, presence of preformed donor-specific antibodies, and HLA matching, with annual monitoring for donor-specific alloantibody using a solid-phase assay. The long-term effectiveness of this strategy is being assessed.

FIGURE 1: Severity of interstitial fibrosis, tubular atrophy, and arteriolar hyalinosis in 1-year protocol biopsies from patients in the HiTAC and LoTAC groups. Data are expressed as percent of biopsies with no changes (chronic interstitial fibrosis [ci] score, 0; open bars), mild changes (ci score, 1; light gray bars), moderate changes (ci score, 2; darker gray bars), or severe changes (ci score, 3; black bars). Compared with HiTAC, LoTAC had less severe fibrosis (P = 0.0001, Mann-Whitney), less tubular atrophy (P = 0.001), and less hyalinosis (P = 0.05). Adapted from Cosio et al.4

It is thus evident that there is a need for an assay that can monitor the degree of immunosuppression, assessed by the reactivity of T cells, B cells, or subsets thereof, on a regular basis, before the development of cellular rejection, donor-specific alloantibody, or the adverse effects of the immunosuppressive agents or of immunosuppression itself.

Assays using donor-specific antigen to monitor immune responsiveness to the allograft are limited in their applicability in wider clinical contexts5; however, the use of nonpolymorphic HLA-derived peptides may allow assessment of lymphocyte responses independent of donor HLA peptides.6 Many investigators measure interferon-gamma production by T cells (either the whole population or specific subsets).

Monitoring of microRNA has been used to describe changes in allograft histology and acute rejection events and to predict malignancy; however, significant overlap between microRNA profiles and clinical events limits their applicability.7

Another method of monitoring the degree of immunosuppression is to measure the transcriptional activity of nuclear factor of activated T cells (NFAT)-regulated genes in peripheral blood.8 This method uses real-time polymerase chain reaction to quantify the expression of the NFAT-regulated genes encoding interleukin 2, interferon gamma, and granulocyte-macrophage colony-stimulating factor. Residual gene expression is calculated as expression at the postdose drug peak as a percentage of the predose baseline. Target values for NFAT residual expression (NFAT-RE) are believed to be between 15% and 30%. Lower values indicate overimmunosuppression, with greater risk of opportunistic infections; higher values represent underimmunosuppression and increased risk of rejection.
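The arithmetic behind NFAT-RE and its proposed target window can be illustrated with a minimal sketch. This is not the published assay pipeline; the function names are illustrative, expression values are assumed to be in arbitrary units from quantitative PCR, and the 15% to 30% window is the target range described above.

```python
def nfat_residual_expression(baseline_expr, peak_expr):
    """NFAT-RE: expression at the postdose drug peak as a percentage
    of the predose (trough) baseline expression.

    baseline_expr: gene expression measured at drug trough (arbitrary units)
    peak_expr: gene expression measured at the postdose drug peak
    """
    return 100.0 * peak_expr / baseline_expr

def interpret_nfat_re(re_percent, target=(15.0, 30.0)):
    """Classify an NFAT-RE value against the proposed 15%-30% target window."""
    low, high = target
    if re_percent < low:
        return "overimmunosuppressed (greater risk of opportunistic infection)"
    if re_percent > high:
        return "underimmunosuppressed (increased risk of rejection)"
    return "within target"

# Hypothetical example: expression falls from 1000 units at baseline
# to 120 units at the postdose drug peak.
re_value = nfat_residual_expression(baseline_expr=1000, peak_expr=120)
print(re_value)                    # 12.0
print(interpret_nfat_re(re_value)) # overimmunosuppressed (...)
```

In a guided-dosing scheme such as the trial discussed below, a value under the window would prompt a dose reduction and a value above it a dose increase.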

In this issue, Sommerer et al9 present the results of a randomized trial comparing standard pharmacokinetic dosing of cyclosporine with NFAT-RE–guided dosing in stable prevalent renal transplant recipients. The primary end point was the change in arterial stiffness at 6 months, assessed by pulse wave velocity (PWV) as a surrogate marker of vascular disease.

The baseline data are interesting in that they demonstrate that prevalent kidney transplant recipients may be overimmunosuppressed, with NFAT-RE values of less than 15%. In the intervention arm, NFAT-RE values increased and cyclosporine levels decreased; at 12 months, 6 months after the end of the active intervention phase, cyclosporine levels remained lower in the intervention arm, although the difference was not statistically significant. The trial met its primary efficacy end point with favorable PWV in the intervention arm, indicating a “healthier” arterial system. Acknowledging the multifactorial influences on vascular outcomes in patients with kidney disease, there may be unmeasured confounders contributing to the reduction in PWV; nonetheless, this study suggests that the known cardiovascular adverse effects of CNI therapy can be mitigated.

Important findings reported in this trial are that NFAT-RE–directed dosing of cyclosporine averted the development of opportunistic infections and that, despite lower cyclosporine levels, there were fewer rejections. No opportunistic infections occurred in the intervention arm, and the single rejection occurred in an individual whose cyclosporine dose was not increased when indicated.

Are we at the point of using NFAT-RE in routine clinical practice for dosing cyclosporine? Unfortunately, this is not the case. The target of 15% to 30% NFAT-regulated gene expression still needs to be correlated with age and relative immune senescence, as transplant recipients are getting older and have reduced immune responses. It will be important to assess the sensitivity and specificity of this technique in predicting overimmunosuppression, as assessed by the development of opportunistic infections, and underimmunosuppression, represented by acute rejection, de novo donor-specific antibody development, and subclinical rejection. Thus, future larger multicenter trials are needed to validate the findings and assess these outcomes. It will also be important to see how this technique can be implemented on a wider scale. To determine NFAT-RE, samples must be obtained at trough and 2 hours postdose, which may prove too cumbersome for patients in routine practice; we should remember that C2 cyclosporine measurements are not routinely performed for this very reason. Moreover, cyclosporine, the CNI used in this trial, is no longer the most widely used CNI. Future trials will need to assess the use of this technique in monitoring tacrolimus therapy; there are preliminary data alluding to its benefit in that setting, and it is unclear why tacrolimus was not used in this trial.10 Finally, this assay assesses only 1 component of the combination immunosuppressive regimen and addresses neither the implications of the other components nor the effect on the humoral arm of the immune response.

REFERENCES

1. Zand MS. Immunosuppression and immune monitoring after renal transplantation. Semin Dial. 2005;18:511–519.
2. Lamb KE, Lodhi S, Meier-Kriesche HU. Long-term renal allograft survival in the United States: a critical reappraisal. Am J Transplant. 2011;11:450–462.
3. Nankivell BJ, Borrows RJ, Fung CL, et al. The natural history of chronic allograft nephropathy. N Engl J Med. 2003;349:2326–2333.
4. Cosio FG, Amer H, Grande JP, et al. Comparison of low versus high tacrolimus levels in kidney transplantation: assessment of efficacy by protocol biopsies. Transplantation. 2007;83:411–416.
5. Hricik DE, Rodriguez V, Riley J, et al. Enzyme linked immunosorbent spot (ELISPOT) assay for interferon-gamma independently predicts renal function in kidney transplant recipients. Am J Transplant. 2003;3:878–884.
6. Smith HJ, Hanvesakul R, Bentall A, et al. T lymphocyte responses to nonpolymorphic HLA-derived peptides are associated with chronic renal allograft dysfunction. Transplantation. 2011;91:279–286.
7. Sarma NJ, Tiriveedhi V, Ramachandran S, et al. Modulation of immune responses following solid organ transplantation by microRNA. Exp Mol Pathol. 2012;93:378–385.
8. Sommerer C, Meuer S, Zeier M, et al. Calcineurin inhibitors and NFAT-regulated gene expression. Clin Chim Acta. 2012;413:1379–1386.
9. Sommerer C, Brocke J, Bruckner T, et al. Improved pulse wave velocity and renal function in individualized calcineurin-inhibitor treatment by immunomonitoring: the randomized controlled Calcineurin Inhibitor-Sparing (CIS) Trial. Transplantation. 2018;102:510–520.
10. Sommerer C, Zeier M, Meuer S, et al. Individualized monitoring of nuclear factor of activated T cells–regulated gene expression in FK506-treated kidney transplant recipients. Transplantation. 2010;89:1417–1423.
Copyright © 2018 Wolters Kluwer Health, Inc. All rights reserved.