We examined the effect of matching for HLA-A and B antigens on the success of corneal transplantation in a single-center, prospective, masked study that began in March 1979. The study involved 97 consecutive recipients at high risk because of prior corneal graft rejection or serious vascularization of the native cornea. Donor corneas were selected on the basis of ABO-blood-group compatibility, a negative lymphocyte crossmatch, and optimal HLA-A and B matching; all clinical personnel were masked to the degree of HLA matching during the study. Among the 38 patients who received corneas with a good HLA match (two or more antigens), only 8 (21 percent) had graft rejection, as compared with 29 of the 59 (49 percent) who received corneas with a poor match (one or no antigens) (P<0.010). The difference in mean (±SE) rejection-free graft survival increased with time: 88.4 ± 5.5 percent versus 73.5 ± 5.9 percent at six months, and 80.1 ± 7.5 percent versus 38.5 ± 7.9 percent at two years. Cox multiple regression analysis, which included the HLA-A and B match and nine other potential confounding variables and risk factors, also identified a significantly elevated risk of rejection reactions (relative risk, 4.6; P<0.009) and of irreversible graft rejection (relative risk, 11.2; P<0.016) with poor HLA-A and B matching. Our findings indicate that good HLA-A and B matching yields a significant long-term benefit in reducing episodes of graft rejection and subsequent graft failure in high-risk recipients of corneal transplants.
Original language: English (US)
Number of pages: 7
Journal: New England Journal of Medicine
State: Published - 1986