ISSN: 2155-9570
Review Article - (2015) Volume 6, Issue 3
Corneal transplantation stands alone as the most common and successful form of solid organ transplantation. Even though HLA matching and systemic antirejection drugs are not routinely used, 90% of first-time corneal allografts will succeed. By contrast, all other major categories of organ transplantation require HLA matching and the use of systemically administered immunosuppressive drugs. The success of corneal transplants under these conditions is an example of “immune privilege,” which is the primary reason for the extraordinary success of corneal transplantation. A number of dogmas have emerged over the past century to explain immune privilege and the immunobiology of corneal transplantation. Many of these dogmas have been based largely on inferences from clinical observations on keratoplasty patients. The past 30 years have witnessed a wealth of rodent studies on corneal transplantation that have tested hypotheses and dogmas that originated from clinical observations on penetrating keratoplasty patients. Rodent models allow the application of highly sophisticated genetic and immunological tools for testing these hypotheses in a controlled environment and with experiments designed prospectively. These studies have validated some of the widely held assumptions based on clinical observations; in other cases, previous dogmas have been replaced with new insights that could only come from prospective studies performed under highly controlled conditions. This review highlights some of the key dogmas and widely held assumptions that have been scrutinized through the use of rodent models of penetrating keratoplasty. It also notes new principles of corneal immunology that have emerged from rodent studies on corneal transplantation and that most likely would not have been revealed in studies on corneal transplantation patients.
Keywords: Animal models; Corneal transplantation; Immune privilege; Keratoplasty
The notion that grafting corneal tissue to blind eyes might restore vision was proposed over 200 years ago by Charles Darwin’s grandfather, Erasmus Darwin. The first reported case of corneal transplantation in a human subject occurred in 1838, when Kissam transplanted a pig cornea onto a human patient without the use of anesthesia [1]. These and subsequent attempts to transplant pig and rabbit corneas to humans failed due to the intense xenogeneic immune responses that are mounted against tissues derived from other species. However, 1905 witnessed the first successful corneal graft transplanted from a human donor to a human recipient [2]. Since this landmark accomplishment, hundreds of thousands of corneal transplants have been successfully performed in humans. Indeed, corneal transplantation is the oldest, most common, and arguably the most successful form of solid tissue transplantation [3]. The immunological significance of the first successful corneal transplant in humans would not be fully appreciated until the laws of transplantation were defined nearly a half-century later. The landmark studies by Billingham and Medawar demonstrated that the eye, and the anterior chamber in particular, possessed remarkable properties that allowed the prolonged and sometimes permanent survival of corneal allografts placed onto the ocular surface or skin allografts placed into the anterior chamber of the eye [4,5]. Medawar recognized the profound implications of these observations and coined the term “immune privilege” to emphasize the unique properties of the ocular surface and the anterior chamber of the eye [5].
Immune privilege of corneal transplants is widely recognized but frequently misunderstood. A common misconception is that immune privilege provides corneal transplants with complete, unfettered exemption from the laws of transplantation immunology. This has led some clinicians to dismiss the concept of immune privilege based on their own experiences, in which corneal allografts occasionally undergo immune rejection in their patients. Moreover, a cursory inspection of the statistics for corneal allograft survival reveals that the 5- and 10-year survival rates for corneal allografts are 74% and 62%, respectively, and are comparable to the survival rates for cardiac, renal, and liver transplants [6]. Even though the long-term survival rates appear to be similar, there are two fundamental and profound differences between corneal transplantation and renal, liver, and cardiac transplantation. First, topical corticosteroids are the only immunosuppressive agents normally administered to corneal transplant recipients. However, it could be argued that topical application of steroids to the surface of the corneal transplant, unlike systemically administered drugs used for vascularized organ allografts, has the advantage of delivering the immunosuppressive drug directly to the site of potential immune reactivity. Second, HLA typing is not routinely performed in corneal transplantation, while it is a mainstay for other forms of solid organ transplantation. The most legitimate approach for comparing corneal transplants with other categories of organ allografts would be to perform each of the various categories of organ transplants under the same conditions; that is, topical corticosteroids would be the only immunosuppressive agents used and HLA matching would not be performed. This would be a “fool’s errand” for obvious reasons. However, such head-to-head comparisons can be made in animal models.
Prospective studies examining the immunobiology of corneal allografts and immune privilege require inbred laboratory animals in which the histocompatibility genes are well defined and congenic animal strains are available. The only laboratory animals that fit this description are the rat and mouse. This prompted Williams and co-workers to develop the rat model of penetrating keratoplasty, which provided investigators the tools needed to address key questions about the immunobiology and immune privilege of corneal allografts [7]. The rat model of penetrating keratoplasty facilitated investigations comparing allografts transplanted to non-privileged sites with corneal allografts grafted to the immune privileged eye. These questions have been addressed in the rat and mouse models of penetrating keratoplasty in which corneal transplants were performed in the absence of any immunosuppressive drugs. Studies showed that immune rejection occurred in only 50% of corneal allografts that were mismatched with the recipient at the entire MHC and all known minor histocompatibility antigen gene loci [7-9].
By contrast, skin allografts transplanted under similar conditions underwent rejection in 100% of the rats [8]. The disparity between corneal allografts and skin allografts was even more pronounced when the donors and recipients were mismatched only at MHC class I gene loci. Only 35% of the MHC class I-mismatched corneal allografts underwent immune rejection, while 100% of the skin grafts were rejected [8]. Perhaps the most dramatic example of immune privilege was detected when corneal allografts and recipients were matched at all known histocompatibility gene loci except MHC class II alleles. Under these conditions, none of the corneal allografts underwent immune rejection, while 100% of MHC class II-mismatched skin grafts were rejected [8]. Thus, rodent models of penetrating keratoplasty have allowed investigators to analyze the influence of isolated major and minor histocompatibility genes on corneal allograft survival and to compare the fate of corneal allografts with that of other categories of organ transplants, such as skin allografts. Not only did these studies unequivocally demonstrate immune privilege, but they also defined immune privilege with numerical values. It is noteworthy that the incidence of immune rejection in the various categories of mismatched corneal allografts in rats was remarkably similar to that reported in subsequent studies using the mouse model of penetrating keratoplasty [7,9-14].
One of the time-honored tenets of corneal transplantation immunology is that the presence of blood vessels in the graft bed is a harbinger of corneal graft rejection. It was previously believed that the presence of blood vessels provided a conduit for alloantigens to gain access to lymphoid tissues. On the surface this seems appealing; however, it is well known that antigens introduced into blood vessels (i.e., by i.v. injection) induce immune deviation, which favors the development of immune tolerance rather than immunity [15,16]. Interestingly, it was recognized 45 years ago that the same stimuli that induce the ingrowth of blood vessels also stimulate lymphangiogenesis in the cornea [17]. Another 30 years would pass before the significance of this observation was recognized when investigators used the mouse model of penetrating keratoplasty [18]. Yamagami and Dana reported that removal of the draining cervical lymph nodes on the same side of the neck as the orthotopic corneal allograft prevented the immune rejection of corneal allografts placed into vascularized graft beds [18]. By contrast, removal of the cervical lymph nodes on the side contralateral to the corneal transplant had no effect on graft survival. Moreover, splenectomy did not enhance corneal graft survival. This finding is noteworthy because antigens entering the venous circulation are focused in the spleen, the main lymphoid tissue that filters blood-borne antigens. Thus, based on these observations, the notion that the blood vessels in the corneal graft bed promote the induction of alloimmunity is highly unlikely. Even more compelling evidence emerged from subsequent studies by Dietrich and co-workers, who showed that selective blockade of corneal lymphangiogenesis with anti-VEGFR-3 antibody or a small-molecule antagonist of α5β1 integrin produced a dramatic reduction in corneal allograft rejection, even though these treatments did not affect the development of blood vessels in the graft beds [19].
That is, corneal allografts remained clear in the presence of a luxuriant ingrowth of blood vessels. Rodent models of corneal transplantation have also revealed that corneal epithelial and stromal cells secrete soluble forms of VEGFR-2, which block VEGF-C and inhibit lymphangiogenesis but do not inhibit hemangiogenesis [20]. Studies in a mouse model of penetrating keratoplasty found that administration of soluble VEGFR-2 selectively blocked lymphangiogenesis but had no effect on hemangiogenesis [20]. The selective inhibition of lymphangiogenesis with soluble VEGFR-2 produced a doubling of allograft survival for corneal transplants placed into graft beds that were heavily vascularized with blood vessels but did not have patent lymph vessels. The use of animal models has thus introduced new strategies for application to the high-risk keratoplasty patient. At the present time the selective inhibition of lymphangiogenesis in clinical practice is not feasible; however, blocking both lymphangiogenesis and hemangiogenesis is within reach and might eventually be an effective strategy for application to high-risk patients.
The prudent use of murine models of penetrating keratoplasty has also uncovered yet another antiangiogenic molecule that impacts corneal allograft survival. Both corneal allografts and syngeneic grafts produce endostatin, a proteolytic fragment of collagen XVIII that inhibits both hemangiogenesis and lymphangiogenesis [21]. Importantly, syngeneic corneal grafts, which do not elicit an immune response, continue to produce endostatin following orthotopic transplantation in mice. By contrast, corneal allografts, which can provoke alloimmune responses, stop producing endostatin shortly after orthotopic transplantation and thus reside in a graft bed that is no longer sheltered from lymph vessels [22]. However, subconjunctival injection of exogenous endostatin into eyes bearing corneal allografts significantly enhances graft survival. Thus, the weight of evidence from mouse studies on corneal transplantation has firmly established that the inhibition of new lymph vessel formation in the cornea is crucial for maintaining the immune privilege of corneal allografts.
The notion that the eye was endowed with remarkable properties that blunted inflammation was recognized over 140 years ago when the Dutch ophthalmologist van Dooremaal noted the prolonged survival of mouse skin transplants placed into the anterior chamber (AC) of dogs [23]. These experiments were conducted a century before the birth of transplantation immunology, and it would take another 75 years before these findings were “rediscovered” by Medawar who noted the prolonged survival of skin allografts placed into the eyes and brains of rabbits and coined the term “immune privilege” to emphasize the remarkable properties shared by the eye and the brain [5]. The conspicuous absence of patent lymph vessels draining the interior of the eye was believed to sequester antigens in the eye and was viewed by many as the primary mechanism for immune privilege in the anterior chamber [3,24]. However, subsequent investigations revealed that the immune privilege of the AC was a constellation of anatomical, physiological, and regulatory properties that conspired to block the induction and expression of immunity within the eye [24-26]. A major contributor to immune privilege in the AC is the unique spectrum of systemic immune responses that is evoked when antigens are introduced into the AC. Antigens, including foreign histocompatibility antigens (i.e., alloantigens), introduced into the AC elicit a form of immune deviation termed “anterior chamber-associated immune deviation” (ACAID), which culminates in the generation of antigen-specific T regulatory cells (Tregs) that suppress immune responses [24,27]. The juxtaposition of the orthotopic corneal allograft to the AC led many to suspect that alloantigens sloughed from orthotopic corneal allografts entered the AC of the graft recipient and induced ACAID. Indeed, injection of donor alloantigens into the AC of mice and rats prior to the application of orthotopic corneal allografts produces a significant enhancement of corneal allograft survival [28-30].
Emerging evidence supports the hypothesis that corneal allografts induce the generation of donor-specific Tregs that down regulate the immune response and enhance corneal allograft survival [31-35]. A recent study using a mouse model of penetrating keratoplasty revealed that Tregs are generated within the corneal allograft through a glucocorticoid-induced tumor necrosis factor receptor family-related protein ligand (GITRL)-dependent process [36]. Moreover, in vivo blockade of GITR-GITRL interactions through the administration of anti-GITRL antibody abolished corneal allograft immune privilege and resulted in 100% graft rejection [36].
Animal models of orthotopic corneal transplantation have also demonstrated that T cell-derived cytokines can exert a profound effect on the development and function of Tregs depending on the array of alien histocompatibility antigens that confront the host. Corneal allografts that are mismatched with the recipient at all known MHC and minor histocompatibility (H) gene loci survive in 50% of the hosts, even in the absence of immunosuppressive drugs [7,9-11]. Survival of these corneal allografts is closely associated with the generation of Tregs and the production of IFN-γ, as depletion of IFN-γ through the administration of anti-IFN-γ antibody or deletion of the IFN-γ gene results in the loss of Treg activity and culminates in graft rejection [34]. By contrast, depletion of IFN-γ promotes, rather than abrogates, immune privilege of corneal allografts that are mismatched with the host only at minor H gene loci [34]. A similar condition occurs with the T cell cytokine IL-17. Blockade of IL-17 abolished immune privilege of corneal allografts mismatched with the recipients at the entire MHC plus all known minor H gene loci [32,33]. By contrast, in vivo neutralization of IL-17A has the opposite effect on MHC-matched, minor H-mismatched corneal allografts and enhances their survival (Niederkorn-unpublished data). Thus, animal models of penetrating keratoplasty have revealed that T cell cytokines exert profoundly different effects on the fate of corneal allografts depending on the array of histocompatibility antigens that confront the host. Such nuances would not be revealed by retrospective or even prospective studies in human subjects.
For almost two decades, the prevailing dogma proposed that CD4+ Th1 immune responses were the major, if not the sole, mediators of allograft rejection [37-39]. This led some to propose that tilting the alloimmune response toward a Th2 phenotype would promote allograft survival [39]. However, this proposition was at odds with clinical observations suggesting that patients with allergic diseases have a higher risk for corneal allograft rejection [40-44]. Prospective studies in mice shed light on this apparent paradox and clearly demonstrated that disabling Th1 immune responses by in vivo neutralization of IFN-γ, or through the use of IFN-γ-/- hosts, resulted in a strong Th2-based alloimmune response and an exacerbation, not mitigation, of immune rejection of corneal allografts [34,45-50]. Moreover, mouse studies incorporating well-defined allergens revealed that Th2-based allergic diseases did indeed increase the incidence and tempo of corneal allograft rejection [45,46,48-50]. By employing a murine model of allergic asthma, it was possible to determine that allergic diseases, even those occurring in organs distant from the eye, have a profound adverse effect on corneal allograft survival [48]. Another attribute of rodent models is the ability to produce disease in only one eye, leaving the opposite eye untouched. Allergic conjunctivitis patients have both eyes uniformly exposed to allergens. However, with mouse models of allergic conjunctivitis it is possible to challenge one eye with an allergen, such as ragweed pollen, while leaving the opposite eye unaffected. Using this approach, it was shown that inducing allergic conjunctivitis in only one eye still abolished immune privilege and led to the immune rejection of 100% of the corneal allografts placed into the opposite, allergy-free eye [45]. This is a further testament to the utility of animal models of corneal transplantation and the latitude that such models offer for modifying experimental conditions in a prospective setting.
The old adage “the best defense is a good offense” applies to corneal allografts. Studies in murine models have uncovered novel immunological defense mechanisms employed by corneal allografts to stave off immunological attack. The cornea is decorated with cell membrane-bound molecules that disable immunological effector responses. FasL (CD95L) is expressed on the cell membranes of various cells in the eye, including the cornea [51]. The receptor for FasL is expressed on multiple cell types, including activated T cells. Fas+ T cells interacting with FasL on corneal endothelial cells undergo apoptosis and are deleted in the eye [51]. Mice bearing the gld/gld mutation do not express functional FasL and cannot induce apoptosis of activated Fas+ T cells. The importance of the FasL/Fas deletion mechanism was demonstrated using corneal allografts prepared from the gld/gld mutant strain of C57BL/6 mice. Between 90% and 100% of gld/gld corneal grafts underwent immune rejection in BALB/c hosts, compared to the 50% rejection that occurred with wild-type C57BL/6 corneal allografts that expressed functional FasL [52,53]. Corneal cells also express other inhibitory cell surface molecules, such as programmed death ligand-1 (PD-L1), which exert similar effects in defending the cornea from immunological attack. PD-L1 is expressed on corneal cells, and when it engages its receptor on T cells it transmits a signal that inhibits T cell proliferation, induces T cell apoptosis, and blocks secretion of IFN-γ by T cells [54,55]. Corneal allografts from donors lacking the PD-L1 gene, or graft recipients treated with anti-PD-1 antibody, display a steep increase in the incidence of corneal allograft rejection [54,55].
Although immunohistochemistry and molecular probes can detect FasL and PD-L1 expression on human corneal grafts, the importance of these molecules in promoting corneal allograft survival and in maintaining immune privilege in human subjects can only be inferred. By contrast, murine corneal transplantation studies have unequivocally demonstrated the importance of the death receptor pathway in providing immune privilege to corneal allografts and in repelling immunological attack.
The role of antibody in corneal allograft rejection has been a matter of debate for over three decades. Investigations in mice have shown that corneal allografts can induce the generation of alloantibodies that are capable of producing complement-dependent cytolysis of corneal cells in vitro [56]. However, the evidence for antibody-mediated rejection of corneal allografts is somewhat ambiguous. Passive transfer of alloantibodies induced corneal allograft rejection in one study [57], whereas in another study it induced only a transient inflammatory response that produced graft opacity but never culminated in frank graft rejection [56]. This is in sharp contrast to experiments involving adoptive transfer of immune CD4+ T cells, which consistently produces swift immune rejection [58-60]. The explanation for these variable results may lie in the complement regulatory proteins (CRPs) that are expressed on the cell membranes of corneal cells and that also occur in soluble forms in the aqueous humor bathing the inner lining of the corneal allograft [61-63]. CRPs are highly effective in disarming the complement cascade and preventing the generation of the membrane attack complex that leads to osmotic lysis of mammalian cells. CRPs not only block the effector function of the complement cascade but may also disable antigen presentation and the induction of immunity [64,65]. One of the CRPs, decay accelerating factor (DAF), has been shown to disturb interactions between antigen-presenting cells (APCs) and T cells [64,65]. Corneal graft rejection is significantly elevated when either corneal graft donors or graft recipients are deficient in DAF [66]. Although DAF is most noted for its disarming of activated complement, studies on DAF-deficient corneal allografts did not show any evidence of complement deposition or activation. This, in turn, suggests that DAF’s contribution to the immune privilege of corneal allografts is not in preventing complement-fixing antibody-mediated injury, but instead appears to lie in restraining the expansion of IFN-γ-producing CD4+ and CD8+ lymphocytes and the accompanying down-regulation of the anti-inflammatory cytokines IL-10 and TGF-β [66].
One of the remarkable features of corneal transplantation is the low incidence of immune rejection for first-time, uncomplicated grafts, even though HLA matching is not routinely performed, especially in the United States [67]. Over the years the issue of whether to perform HLA matching has stirred debate, as some studies have suggested that HLA matching does provide benefit [68]. Investigations in animal models have shed light on the role of individual categories of histocompatibility antigens in eliciting immune rejection of corneal allografts. In the case of fully allogeneic corneal allografts, that is, those mismatched at the MHC plus all known minor H gene loci, approximately 50% of rat or mouse corneal allografts survive long term (Table 1). By contrast, corneal allografts mismatched only at MHC class I gene loci undergo rejection in only 18-30% of the hosts [8]. The immune privilege of corneal allografts expressing only MHC class II alloantigens is most impressive, with rejection occurring in 0% to 17% of the hosts (Table 1).
Mismatch | Incidence of rejection (mouse) | Incidence of rejection (rat) | References
---|---|---|---
MHC + minor H | 50% | 50-55% | [7,9,10,73]
MHC class I only | 30% | 18% | [9,13]
Minor H only | 45%-53% | 26% | [9,14]
MHC class II only | 17% | 0% | [9,12]
Table 1: Effect of MHC and minor histocompatibility gene mismatches on the incidence of corneal allograft rejection in rodents.
Investigations using animal models of penetrating keratoplasty have demonstrated that the immune response to corneal allografts is shaped by the category of histocompatibility antigens perceived by the host. As mentioned above, corneal allografts mismatched with the host at the entire MHC plus all known minor H gene loci (i.e., fully allogeneic) undergo rejection in approximately 50% of the hosts. However, rejection of fully allogeneic corneal allografts soars to 90-100% in hosts lacking the IFN-γ gene or in hosts treated with anti-IFN-γ antibody [34,35,47]. By contrast, blocking or depleting IFN-γ enhances the survival of MHC-matched corneal allografts that confront the host only with minor H alloantigens [34]. If these findings can be extrapolated to humans, they suggest that MHC matching combined with modalities that block IFN-γ might have a significant salutary effect in patients with pre-existing conditions that create a high risk for rejection.
The proinflammatory cytokine IL-17 can exert profoundly different effects on the fate of corneal allografts depending on the category of histocompatibility antigens that are perceived by the host. Like IFN-γ, IL-17 is required for the long-term survival of fully allogeneic corneal allografts, as in vivo treatment with anti-IL-17A antibody prevents the generation of Tregs and culminates in the rejection of >90% of the fully allogeneic corneal allografts [33]. By contrast, the same anti-IL-17A antibody treatment enhances the survival of MHC-matched, minor H-mismatched corneal allografts (Niederkorn-unpublished findings).
One of the conundrums in corneal transplantation immunology is the sharp increase in rejection that occurs in patients receiving a second or third corneal transplant [69]. The incidence of rejection rises over three-fold for second and third corneal allografts [69]. This increased incidence of rejection can occur even in patients whose first corneal transplant is clear at the time the second corneal transplant is grafted to the other eye [70]. The most widely offered explanation for this paradox is that the patient has been sensitized to alloantigens expressed on the first graft and that the application of a second transplant evokes a recall or memory immune response. However, this explanation is extremely unlikely, as most corneal buttons used for transplantation are selected without the benefit of HLA typing, and the possibility of selecting a second corneal transplant bearing the same array of histocompatibility antigens expressed on the first transplant seems remote. A second explanation for this paradox is more generic and posits that rejection of a first corneal transplant alters the eye and renders it more prone to inflammation. In other words, immune privilege has been abolished. The mouse model of penetrating keratoplasty has been effectively employed to analyze the basis for the increased rejection of second corneal transplants. BALB/c mice that had rejected a corneal transplant from the C3H mouse strain rejected 90% of C57BL/6 corneal grafts transplanted to the opposite eye 60 days later, even though first-time C57BL/6 corneal allografts are accepted permanently in 50% of BALB/c mice [71]. It is important to note that the C3H, C57BL/6, and BALB/c mouse strains do not share any MHC or minor H alloantigens. Additional experiments revealed that placing a syngeneic BALB/c corneal graft onto BALB/c mice also led to the abolition of immune privilege. Even though the BALB/c syngeneic corneal grafts displayed the same histocompatibility antigens as the BALB/c recipients, these mice rejected 100% of the C57BL/6 corneal allografts placed into the opposite eye, suggesting that the surgery, not the immune response to the first corneal transplant, abolished immune privilege in both eyes! That is, perturbation of one eye evokes a sympathizing response in the opposite eye, which was not previously subjected to surgery or trauma. This finding is somewhat analogous to sympathetic ophthalmia, an inflammatory condition in which penetrating injury to one eye is followed by a sympathizing inflammation in the opposite eye [72]. Accordingly, this phenomenon was termed “sympathetic loss of immune privilege” (SLIP) based on its similarity to sympathetic ophthalmia [71]. Further analysis revealed that the severing of corneal nerves that occurs during the orthotopic transplantation procedure elicits the release of the neuropeptide substance P (SP), which disables corneal allograft-induced Tregs. Indeed, simply making shallow circular incisions in the cornea, without even performing penetrating keratoplasty, abolishes immune privilege in both eyes and results in rejection of subsequent corneal allografts in 90-100% of the hosts.
The prudent use of mouse models facilitates independent analysis of individual parameters. In the case of SLIP, the effect of severing corneal nerves on subsequent corneal allograft survival was analyzed in isolation from the corneal transplantation procedure itself. The mouse model also made it possible to examine the sympathizing effect of severing corneal nerves in one eye on the immune privilege of the opposite eye.
Rodent models have been indispensable for these and other studies and will continue to provide crucial insights into the immunobiology and immune privilege of corneal transplantation in the years ahead.
This study was supported by NIH grants EY007641 and EY020799 and by Research to Prevent Blindness.