FormalPara Key Summary Points

Age-related macular degeneration (AMD) is a leading cause of severe irreversible vision loss, with a global prevalence that is predicted to substantially increase.

Identifying early biomarkers indicative of risk for progression from intermediate AMD to vision-threatening late stages is key to ensuring individualized management and timely intervention.

There are already several well-established structural biomarkers, including drusen volume and pigmentary abnormalities, as well as a range of multimodal biomarkers under investigation that may provide additional information about future risk.

Further work is needed to establish the most promising functional biomarkers indicative of risk for disease progression, as well as other risk factors, and their utility in clinical care and future trials.

Introduction

Age-related macular degeneration (AMD) is a leading cause of severe irreversible vision loss in high-income countries [1,2,3,4]. The global prevalence of AMD is estimated at 170 million people, and this is expected to increase to 288 million people by 2040; in Asia, prevalence is expected to double over this time [5, 6]. Risk factors for AMD include age, genotype, smoking, and other lifestyle factors [7].

AMD is a progressive disease, and the Beckman classification can be used clinically to define disease stage as early, intermediate, or late depending on the worse-affected eye [8, 9]. Early AMD is typically defined as the presence of medium drusen (63 to < 125 µm) without pigmentary abnormalities [8, 9]. Intermediate AMD (iAMD) may be defined as the presence of at least one large druse (≥ 125 µm) and/or pigmentary abnormalities, or as the presence of medium drusen with pigmentary abnormalities [8, 9]. Finally, late AMD is classified as neovascular AMD (nAMD), geographic atrophy (GA), or both [8, 9].
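To make these staging rules concrete, the sketch below applies them to a single eye. This is a minimal illustration in Python, assuming simplified per-eye inputs (largest druse diameter, presence of pigmentary abnormalities, nAMD, and GA); it is not a validated grading tool, and the published classification also considers the person-level (worse-eye) stage.

```python
from dataclasses import dataclass

@dataclass
class Eye:
    largest_druse_um: float          # diameter of the largest druse, in micrometers
    pigmentary_abnormalities: bool   # AMD-related hyper-/hypopigmentation present
    neovascular_amd: bool            # nAMD present
    geographic_atrophy: bool         # GA present

def beckman_stage(eye: Eye) -> str:
    """Return an approximate Beckman stage for one eye (illustrative only)."""
    if eye.neovascular_amd or eye.geographic_atrophy:
        return "late AMD"
    if eye.largest_druse_um >= 125 or eye.pigmentary_abnormalities:
        return "intermediate AMD"
    if 63 <= eye.largest_druse_um < 125:
        return "early AMD"
    return "no AMD / normal aging changes"

print(beckman_stage(Eye(90, False, False, False)))   # early AMD
print(beckman_stage(Eye(130, False, False, False)))  # intermediate AMD
print(beckman_stage(Eye(80, True, False, False)))    # intermediate AMD
```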

A systematic review in 2020 estimated the prevalence of early AMD or iAMD in Europe at 25% in people aged ≥ 60 years [10]. The rate of progression from iAMD to late AMD is estimated at approximately 27% over 5 years [11]; however, data are highly variable and depend on the definitions used for the earlier stages of disease [12]. Among eyes with iAMD, progression to late-stage disease is to either nAMD or GA, or both [13, 14]. Although lifestyle modifications and supplements can be beneficial in reducing the risk of progression from earlier stages of AMD to late AMD, no specifically targeted intervention has been approved to prevent development of iAMD or to slow its progression to late AMD once developed [11, 15]. To help develop interventions that aim to prevent or delay progression, or even cause regression of disease, it is essential to better understand who is at greatest risk of progression from iAMD to late AMD and to improve our ability to quantifiably track disease progression over time. This would allow clinicians not only to identify patients at greater risk of progression to late-stage, vision-threatening disease, so that their monitoring and counselling can be individualized, but also to prioritize inclusion or stratification of high-risk groups in future interventional clinical trials. Furthermore, it would provide parameters to monitor over time for the assessment of disease progression and help us to better understand potential mechanisms that contribute to different risk profiles and progression rates.

However, the structural biomarkers used as endpoints in current clinical trials of AMD (presence and/or growth of exudation or atrophy [16, 17]) are associated with late stages of the disease, when photoreceptor death and loss of best-corrected visual acuity (BCVA) have already occurred. Identification of biomarkers that could act as early-stage surrogate endpoints in clinical trials, prior to substantial visual decline, would be of great clinical utility and would increase the feasibility of earlier interventional studies. This will be critical in reducing the growing burden of AMD in the coming decades [5, 6]. In this review, we evaluate potential early structural, functional, genetic, and systemic biomarkers that could be used to assess the risk of progression of iAMD to late, vision-threatening stages of disease. To this end, we reviewed the available literature for iAMD biomarkers and their association with progression to late AMD. Structural biomarkers, both conventional and emerging, are reviewed first, followed by functional and blood-based parameters as well as genetic biomarkers; only those biomarkers for which a minimum body of evidence is available were selected (Fig. 1). This article is based on previously conducted studies and does not contain any new studies with human participants or animals performed by any of the authors.

Fig. 1

Biomarkers that may be used to assess the progression of intermediate age-related macular degeneration (AMD). AMD age-related macular degeneration, BCVA best-corrected visual acuity, cRORA complete RPE and outer retinal atrophy, EZ ellipsoid zone, GA geographic atrophy, iRORA incomplete RPE and outer retinal atrophy, LLVA low-luminance visual acuity, nAMD neovascular AMD, nGA nascent GA, OCT optical coherence tomography, PRO patient-reported outcome

Structural Biomarkers

Conventional Drusen

Conventional drusen (hereafter referred to as “drusen”) are sub-retinal pigment epithelial (RPE) deposits of lipids and proteins that vary in size; small drusen are classified as < 63 μm, intermediate as 63–124 μm, and large as ≥ 125 μm [18]. Drusen are the hallmark of AMD, and disease progression is strongly associated with an increasing load of drusen [9, 19,20,21,22,23]. However, drusen may also occur at younger ages, often forming recognizable patterns suggestive of an inherited disease rather than typical age-related deposits, such as the dominantly inherited drusen phenotypes of Malattia leventinese and Doyne’s honeycomb dystrophy [24]. Cuticular drusen are another drusen subtype, comprising round yellow lesions 50–75 µm in diameter that can appear at a relatively young age [25, 26]. In time, genetic analysis may be able to differentiate individuals with these drusen phenotypes from those with typical AMD drusen and may help to uncover their unique etiology; these subtypes may require a different treatment strategy from the more typical conventional drusen phenotypes.

Increasing drusen volume is a risk factor for AMD progression; patients with a drusen volume of ≥ 0.03 mm³ are four times as likely to develop any late AMD as those with a volume of < 0.03 mm³ [27]. Baseline RPE/drusen complex (RPE/DC) thickness (a metric for assessing drusen volume) is significantly correlated with progression to central and non-central GA (both p < 0.001), with the odds of developing central GA increasing by 32% for every 0.001 mm³ increase in baseline RPE/DC thickness [21, 28]. The location of drusen may also affect their impact on AMD progression: current evidence suggests that extramacular drusen are not directly associated with progression from iAMD to late AMD but may lead to greater macular drusen size, while increasing central drusen volume does increase the risk of disease progression [27, 29, 30]. Thus, drusen volume remains a major risk factor for AMD progression [21, 31].

Other aspects of drusen may increase their utility as a biomarker, such as internal drusen reflectivity. The internal reflectivity of drusen can vary in appearance, from low-reflective cores to conical debris [20]. Veerappan et al. proposed four subtypes for reflective drusen sub-structures: high-reflective cores (H-type), low-reflective cores (L-type), conical debris (C-type), and split drusen (S-type) [20]. Hyporeflective cores have been associated with increased likelihood of progression to late AMD, although some data indicate that they are not an independent risk factor [32,33,34].

Drusen regression is strongly correlated with progression to late AMD and subsequent atrophy [35]. However, it is recognized that drusen can fluctuate, with some regressing completely without subsequent atrophy [36, 37]. Overall, drusen are a well-established and well-studied biomarker; tracking the lifecycle of drusen shows them increasing in volume with increasing disease severity, followed by regression and atrophy [38]. It is therefore difficult to use the tracking of drusen volume alone as a biomarker for disease progression, as drusen increase or decrease depending on their lifecycle stage. Thus, change in drusen volume alone is insufficient for use as a biomarker of disease progression or regression, or as an endpoint in early disease intervention studies, but it could be complementary in conjunction with other functional or structural biomarkers if it is considered to be associated with the mechanism of action of the intervention.

Reticular Pseudodrusen

Reticular pseudodrusen (RPD), also known as subretinal drusenoid deposits, are distinguished from conventional drusen by their location above the RPE [39,40,41]. With the advent of multimodal imaging, RPD are now recognized as being more frequent than previously appreciated from clinical examination or color fundus photography (CFP) alone. They are an area of research focus, as they appear to be a critical sub-phenotype in AMD. RPD can occur in eyes with or without other retinal diseases but appear highly prevalent in AMD [41].

RPD are particularly prevalent in eyes with late AMD and are considered to be a risk factor for progression in the fellow eye of those with late AMD [41,42,43,44]. Although not all studies report RPD as an independent risk factor in individuals with non-late AMD, a post hoc analysis of data from the Age-Related Eye Disease Study (AREDS) 2 reported that the presence of RPD carried a significant risk for the development of late AMD, especially GA (p < 0.0001) [45]. Eyes with RPD have also been reported to have more rapid growth of atrophic lesions, with lesions growing towards the RPD [46, 47]. The presence of RPD has been reported to modify the effect of treatment in people with iAMD, suggesting that the RPD phenotype may require a different intervention compared with eyes with only drusen [48]. Therefore, determining the presence of RPD as assessed by multimodal imaging in patients with AMD is important if we are to fully understand the factors that drive disease progression. Further research that assesses the extent of RPD (e.g., by volume and/or area) could assist in establishing a more quantitative relationship between RPD and progression from iAMD to late AMD, and algorithms are being developed to help with this task [49].

Pigmentary Abnormalities

In addition to drusen volume, pigmentary abnormalities assessed by CFP are the other major traditional risk factor used when assessing AMD severity [23] and the risk for progression to late AMD [50]. Such changes are seen either as regions of hypopigmentation or hyperpigmentation [23]. Pigmentary abnormalities, together with drusen size, make up the factors used in the AREDS classification system for risk prediction [23, 51]. In the AREDS study, the presence of hyperpigmentation and hypopigmentation preceded the onset of GA in 96% and 82% of eyes, respectively, with a mean time to onset of 5 and 2.5 years, respectively [52].

Hyperreflective Foci

Hyperreflective foci (HRF) are roundish, hyperreflective lesions seen on optical coherence tomography (OCT) in any of the retinal layers. They vary in appearance and location and were originally associated with hyperpigmentation as seen on CFP, calcified drusen, and RPE elevation in AMD [53]. HRF have generally been believed to be migrating RPE cells, but recent work has suggested that many HRF may reflect macrophages that have engulfed RPE pigment [54, 55]. Not all HRF have detectable hyperpigmentation seen on CFP, suggesting origins other than the RPE [53, 54].

There is some spatial correlation between macular hyperpigmentation visualized using CFP and hyperreflective foci imaged with OCT, although it is not absolute [56]. The area of HRF in eyes with iAMD correlates with the likelihood of progression to late AMD and development of GA over periods of 1 to 2 years [57, 58]. The topography of HRF may be of specific importance in predicting progression to GA; one retrospective analysis suggested that the concentration of HRF in a given area has a significant impact (p < 0.0001) on local atrophy progression [59]. Intraretinal HRF have also been associated with the development of type 3 macular neovascularization in patients with iAMD and have in some cases been posited to represent early stages of intraretinal neovascularization [60,61,62]. A better understanding of the origins of and differences between HRF may well be important when considering predictors of risk.

OCT Signs of Early Atrophy

OCT imaging has allowed us to visualize the very first signs of atrophy, long before they are noted on CFP or on clinical examination. Incomplete RPE and outer retinal atrophy (iRORA) and complete RPE and outer retinal atrophy (cRORA) are relatively new terms proposed to describe changes seen on OCT as atrophy develops in eyes with AMD [63]. In 2018, the international consensus of the Classification of Atrophy Meeting (CAM) group proposed nomenclature to describe these OCT changes [63]. The key structural OCT criteria for cRORA are: a region of signal hypertransmission into the choroid of ≥ 250 μm, colocalized with a zone of attenuation or disruption of the RPE band of ≥ 250 μm in length, and evidence of overlying photoreceptor degeneration; these changes can only be assessed in the absence of an RPE tear [63]. A CAM report in 2020 focused on the defining features of iRORA, which is considered to precede cRORA [64]. iRORA is defined on OCT by the same criteria as cRORA, except that the signal hypertransmission into the choroid and the zone of RPE attenuation are < 250 μm in length [64]. This terminology was proposed as unifying nomenclature so that researchers could study these changes longitudinally and determine the risk associated with their presence [64]. Several studies have now examined the risk of progression from iRORA to cRORA or GA [65]. In the sham arm of the LEAD study, which enrolled people with bilateral large drusen without nascent GA (nGA; see below [66]), 21% of the cohort had iRORA at baseline, and an additional 31% developed iRORA over a 3-year period [67]. Prevalent or incident iRORA was associated with an increased rate of GA development (adjusted hazard ratio [HR] 12.1) [48, 67]. Another study reported the prevalence of iRORA in AMD to be around 16%, similar to that reported from LEAD [68]. No reports have, as yet, been published on the prevalence of cRORA in non-late AMD, although the LEAD cohort, which only included eyes with iAMD without nGA, had 3% with cRORA at baseline (personal communication).
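As a schematic illustration of how the ≥ 250-μm criteria distinguish cRORA from iRORA, the sketch below applies them to per-lesion OCT measurements. The field names and thresholds are a simplified rendering of the consensus definitions summarized above; real grading requires expert review of the B-scans, so this is an assumption-laden sketch rather than a grading algorithm.

```python
from typing import Optional

def classify_rora(
    hypertransmission_um: float,        # length of choroidal signal hypertransmission
    rpe_attenuation_um: float,          # length of RPE attenuation or disruption
    photoreceptor_degeneration: bool,   # evidence of overlying photoreceptor degeneration
    rpe_tear_present: bool,             # atrophy cannot be assessed if an RPE tear is present
) -> Optional[str]:
    """Classify a lesion as cRORA, iRORA, or neither (illustrative sketch only)."""
    if rpe_tear_present:
        return None  # criteria cannot be applied
    if not photoreceptor_degeneration:
        return "neither"
    if hypertransmission_um >= 250 and rpe_attenuation_um >= 250:
        return "cRORA"
    if hypertransmission_um > 0 and rpe_attenuation_um > 0:
        return "iRORA"
    return "neither"

print(classify_rora(300, 280, True, False))  # cRORA
print(classify_rora(120, 100, True, False))  # iRORA
```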

“Nascent GA” (nGA) is a term that was proposed in 2014 and suggested as an early surrogate endpoint for studies of iAMD progression; it was defined by the presence of subsidence of the outer plexiform layer (OPL) and inner nuclear layer, and/or a hyporeflective wedge-shaped band within Henle’s fiber layer of the OPL [66]. In one longitudinal study of eyes with iAMD, nGA was shown to be highly predictive of progression to GA (probability of progression at 24 months was 38%; adjusted HR 78.1; p < 0.001), and the development of nGA explained 91% of the variance in time to development of GA [65].

Currently, these early signs of atrophy are not regulatory authority-approved endpoints that can be used in registration studies of potential interventions that aim to slow progression of non-late AMD. However, they can be used to enrich a study population with high-risk iAMD eyes that are more likely to progress and could potentially provide an earlier endpoint for use in early-phase clinical trials to determine the efficacy of interventions. Continued efforts are required to better define the changes that occur as cell death commences and eyes progress towards vision-threatening lesions. Precise anatomical definitions may require refinement as researchers and reading centers try to implement the current definitions into pragmatic clinical trial risk factors and endpoints [69]. The required OCT signs may also vary depending on the design and aims of a study. For example, a study might enrich a high-risk cohort with iRORA or nGA, or could use nGA or cRORA as a surrogate endpoint for GA. Ultimately, linking these anatomical changes to functional correlates will be important to understand the functional implications of these anatomical signs.

OCT Signs of Neovascular Conversion

Nonexudative macular neovascularization (neMNV) has only recently been appreciated as an entity and, when present, signifies a high risk of conversion to exudative MNV [70,71,72]. As such, recognizing OCT signs that suggest its presence is invaluable for appropriate counselling of patients. One such sign is a specific subset of the double-layer sign, characterized by the presence of a shallow, irregular RPE elevation (SIRE) [73]. In a cohort of 233 eyes, all those with neMNV had the SIRE sign, while 92% of eyes without neMNV lacked SIRE. SIRE was found to have a positive predictive value of 25% and a negative predictive value of 100% for neMNV [73].
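These predictive values follow directly from the 2 × 2 arithmetic linking sensitivity, specificity, and prevalence. The sketch below reproduces them under the assumption (ours, for illustration; not a figure reported in the cited study) that roughly 3% of the cohort had neMNV, together with the ~100% sensitivity and ~92% specificity implied by the figures quoted above.

```python
def predictive_values(sensitivity: float, specificity: float, prevalence: float):
    """Return (PPV, NPV) from standard 2x2-table arithmetic."""
    tp = sensitivity * prevalence              # true positives (per unit cohort)
    fn = (1 - sensitivity) * prevalence        # false negatives
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    tn = specificity * (1 - prevalence)        # true negatives
    return tp / (tp + fp), tn / (tn + fn)

# Sensitivity and specificity follow the cohort described above; the prevalence of
# neMNV is an illustrative assumption chosen to reproduce the reported PPV and NPV.
ppv, npv = predictive_values(sensitivity=1.00, specificity=0.92, prevalence=0.026)
print(f"PPV ~ {ppv:.0%}, NPV ~ {npv:.0%}")  # PPV ~ 25%, NPV ~ 100%
```

Under these assumptions, the low PPV despite near-perfect sensitivity simply reflects how uncommon neMNV is in the cohort, which is consistent with SIRE being better suited to ruling out neMNV than to confirming it.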

One ongoing research study (EYE-NEON) is investigating the prevalence and incidence of neMNV and the role of several biomarkers as predictors for conversion to nAMD [72]. Thickening of the subretinal layers and presence of subretinal hyperreflective material (SHRM) are also early signs that may suggest the development of neovascularization and potential exudation [31, 74, 75].

Emerging Structural Biomarkers

Several emerging structural biomarkers, currently less established than those previously discussed, may play an important future role in determining the risk of progression of iAMD. Although they require further investigation and validation, their inclusion in studies may enrich the information available for triaging patients at high risk of progression to late AMD.

The term “ellipsoid zone” was first used in 2014 to describe a hyperreflective band thought to represent the photoreceptor inner segment ellipsoid [76,77,78,79], which is dense with mitochondria and thus central to the health and function of photoreceptors [78]. Changes in the intensity of the ellipsoid zone, as measured indirectly by OCT, may therefore be a biomarker for retinal disease severity and progression [76]. However, ellipsoid zone reflectivity is not directly measurable by OCT but is instead calculated through complex post-processing; its use may therefore be limited in clinical settings, at least until artificial intelligence (AI) is able to assist with the analysis. Decreased ellipsoid zone reflectivity has been reported in one study of eyes with iAMD and in another of eyes with non-neovascular AMD, and may indicate early photoreceptor damage [76, 80, 81].

Persistent hypertransmission defects (assessed by human graders using en face swept-source OCT as bright lesions with a greatest linear dimension > 250 µm [82]) have also been proposed as an early predictor of GA formation [82, 83]. In one study of iAMD, 96% of eyes developed persistent hypertransmission defects of ≥ 250 μm before GA formation [82]. Furthermore, the development and growth of hypertransmission defects can be tracked in individuals with iAMD [83, 84]. The grading of persistent hypertransmission defects was found to be repeatable in one study, which reported 98.2% accuracy and 97% agreement between graders over 1177 defects [83].

Choroidal thickness maps can be constructed with enhanced-depth imaging of wide-field swept-source OCT and typically include the layers between Bruch’s membrane and the chorioscleral interface [85]. The choroid is thinner in patients with AMD than in those without retinal pathology [86]. In one study, a reduction in choriocapillaris complex thickness was significantly associated with AMD progression over 5 years of follow-up (p < 0.001): a thickness of < 10.5 µm was associated with a high probability of progression, whereas a thickness of > 10.5 but < 20.5 µm was associated with a moderate probability [87]. However, the relationship between choroidal thickness and AMD status is affected by several other factors, such as axial length, age, and possibly the presence of RPD, which complicates the reported associations [86, 88].

With recent advances in imaging, it is possible to examine the choroid, including the choriocapillaris, without using invasive dye-based tests. OCT angiography (OCTA) has propelled increased interest in studying the blood supply to the outer retina, its characteristics, and their association with AMD. Interpreting the images and dealing with artifacts (such as the shadows cast by drusen) both require considerable expertise to ensure that correct associations are drawn between potential blood flow deficits and AMD [89,90,91]. With these caveats in mind, data from a number of small studies suggest that flow deficits increase as AMD progresses and may be linked to the development of iRORA and cRORA [92,93,94,95,96]. However, the relationship between choriocapillaris alterations and iAMD progression has not been consistently observed across studies; one study found no significant difference in nonperfused areas between patients with unilateral, bilateral, or no iAMD [97]. It is therefore not yet clear what the temporal association is between flow deficits and the onset of atrophy, or what role, if any, flow deficits may play as risk factors for AMD disease progression.

The precise relationship between these emerging structural risk factors and disease progression requires further research. Additional published data, including the ability to grade consistently across reading centers, will help draw firm conclusions about the relationship between these biomarkers and disease progression.

Functional Markers and Patient-Reported Outcomes

Alongside patient-reported outcomes (PROs), functional vision tests are an important aspect of disease monitoring. Functional tests can be conducted under high-luminance and high-contrast conditions or under reduced-luminance and/or varying contrast conditions. Functional deficits within a cohort, all with the same stage of disease (e.g., iAMD), can vary along a broad spectrum, with many patients being functionally indistinguishable from those without disease. In addition, given the subjective nature of functional tests, their reproducibility both between and within patients is likely to be more variable than grading structural changes. This provides challenges when considering the longitudinal tracking of these parameters and their potential as surrogate endpoints.

Best-corrected visual acuity (BCVA), a high-luminance, high-contrast test commonly used to assess visual function [98], is often normal in people with iAMD. Data from the large-scale MACUSTAR study indicate that the majority of patients with iAMD can be functionally indistinguishable from control populations [99, 100]. However, a few studies have shown that, compared with controls, patients with iAMD can have significant deficits in BCVA (p < 0.05) [101] and that BCVA declines in patients with iAMD over 12 months [99, 102]. Unlike in late-stage disease, in which vision is imminently threatened, BCVA is not considered a useful parameter for tracking progression of iAMD. As such, other functional measures need to be considered.

Low-luminance visual acuity (LLVA) can be measured by placing a neutral density filter in front of either the study eye or an illuminated letter chart and asking the participant to read the chart [103, 104]. Compared with controls, patients with iAMD have significant deficits in LLVA (p < 0.05) [101]. Nevertheless, research examining LLVA along with microperimetry in the same cohort has suggested that LLVA has limited sensitivity for detecting progression to late AMD [105, 106].

Microperimetry is a form of visual field testing that examines retinal sensitivity to light at multiple locations across the macula, in conjunction with direct fundus examination, permitting correlation between pathology and function [107]. Furthermore, microperimetry can track change in function over time [108]. However, microperimetry remains a subjective test, requiring engagement and concentration from the patient as well as a reasonable level of vision to ensure good fixation. Its accuracy is therefore reduced at lower levels of visual acuity, such as when disease advances [108]. Both mesopic and scotopic microperimetry show significantly reduced sensitivity in eyes with iAMD, although, as with BCVA, there is considerable variation among non-late AMD cohorts [109,110,111]. Mesopic microperimetry demonstrates a structure-function relationship in eyes with drusen [112], and reduction in mesopic sensitivity has been demonstrated in early AMD and iAMD [113,114,115]. One strategy to improve the ability to show change over time could be to individualize the test grid and so gain information in specific regions identified as being at risk of progression, such as areas showing early OCT signs of atrophy. Although time intensive to perform, scotopic microperimetry may be of particular use in iAMD cohorts [116]. In eyes with earlier stages of AMD, scotopic retinal sensitivity is more decreased than photopic or mesopic sensitivity compared with healthy controls [117, 118]. With reasonable test-retest reliability [111], retinal sensitivity as assessed by microperimetry may be a useful early functional biomarker capable of tracking progression in patients with iAMD.
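As a simple illustration of how pointwise microperimetry data can be summarized and tracked over time, and how a test grid might be interrogated locus by locus in regions at risk, the sketch below uses entirely hypothetical sensitivities in decibels; it is not based on any particular device or dataset.

```python
import numpy as np

# Hypothetical mesopic sensitivities (dB) at the same eight test loci on two visits.
baseline = np.array([26.0, 25.0, 27.0, 24.0, 23.0, 26.0, 25.0, 24.0])
month_12 = np.array([25.0, 24.0, 26.0, 21.0, 20.0, 25.0, 24.0, 22.0])

change = month_12 - baseline
print(f"Mean sensitivity change: {change.mean():.1f} dB")
print(f"Loci with the greatest loss: {np.argsort(change)[:2]}")  # candidate at-risk regions
```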

Patients with non-late AMD will often volunteer that they have challenges with dark adaptation; they have difficulty going from a bright area into one that is dimly lit, often needing to wait some time before they can see again [119, 120]. This can happen when driving into a dark tunnel or underground car park, or when going indoors after being outside on a bright day. Delayed dark adaptation [119] has been shown to be one of the earliest functional deficits in AMD [121,122,123]. There is much interest in early deficits in dark adaptation in AMD, with prolongation seen in patients with iAMD and further decline shown over 24 months as disease progresses [121, 122]. Measures of the dynamic processes of rod recovery, such as the rod intercept time (RIT), seem most promising as biomarkers [124]; absolute threshold testing is impractical, as it can take well beyond a reasonable time to complete. Even in normal elderly populations without signs of AMD, those with an abnormal dark adaptation time (mean RIT of 15.1 versus 9.1) were almost twice as likely to develop AMD 3 years later [124]. Dark adaptation deficits also appear to be considerably worse in those with AMD and RPD [121, 123, 125] and may therefore help us to understand the underlying pathology associated with RPD. Dark adaptation may be one of the most promising functional tests in terms of identifying early deficits, but it is not simple to conduct, can be lengthy, and is not enjoyed by participants, making it difficult to scale up and implement in large trials or in clinical practice.

Contrast sensitivity is impaired in patients with iAMD [126]. Examining mesopic (compared with photopic) contrast sensitivity with the Pelli-Robson chart has been shown to identify functional deficits in iAMD that can be differentiated from normal aging [127]. However, inter-session variability for Pelli-Robson scores is high, which could limit their utility as a biomarker for tracking progression in clinical trials [128]. The cone-specific contrast test (CCT), which was recently employed in a natural history study [121], is an alternative varying-contrast test that presents randomized colored letters visible only to L, M, or S cones in decreasing steps of contrast [129]. Compared with controls, patients with iAMD have significant deficits in red cone-specific contrast (p < 0.05) [101, 121]. The CCT is yet to be widely adopted; therefore, limited data are available. Automated assessment of contrast sensitivity has also been examined under low-luminance conditions, with deteriorating performance correlating with disease progression [130]. Although contrast sensitivity is a simple test to perform in the clinic, further work is needed to determine whether it can be used as a biomarker for iAMD progression.

At present, there are limited published data on PROs as a means to track and/or measure iAMD progression, and further formal validation will be needed to assess their utility [131]. Common PRO measures, including the 25-item National Eye Institute Visual Function Questionnaire (NEI-VFQ-25) [132], Functional Reading Independence (FRI) Index [133], and Impact of Vision Impairment Questionnaire (IVI) [134], do not appear sensitive enough to capture changes at early disease stages [135]. The Night Vision Questionnaire (NVQ-10) attempts to capture difficulties experienced by patients in low-light conditions and has been used to assess function in patients with iAMD [135]; nonetheless, its utility in iAMD remains unclear. The MACUSTAR study has developed a new PRO for patients with AMD (the 37-item Vision Impairment in Low Luminance [VILL-37]) [131, 136], which will potentially provide improved metrics.

Although functional testing and identifying the best parameters to measure are challenging, functional measures better reflect the impact of the disease on patients. As such, they are extremely important to include in any interventional trial and are of great interest to regulatory authorities. Using structural and functional biomarkers together, as combined endpoints, may improve our ability to track and/or predict progression of iAMD. Such an approach has been used in glaucoma, in which optic disc biomarkers are combined with visual field testing [137, 138].

Blood and Plasma Biomarkers

Over the past decade, there has been discussion on whether AMD should be categorized as a systemic inflammatory disease [139, 140]. In support of this contention, alterations in peripheral blood flow have been reported in those with AMD compared with normal control populations, and a number of studies suggest potential associations between changes in protein or lipid levels detectable in the blood and plasma and the risk of AMD and its progression [141, 142]. C-reactive protein (CRP) level has been suggested as a possible independent risk factor for progression from iAMD to late AMD, as elevated CRP levels have been associated with late AMD (odds ratio [OR] 3.12; 95% confidence interval [CI] 1.38–7.07) and AMD progression (OR 1.90; 95% CI 0.88–4.10) [143,144,145]. Interestingly, the combination of elevated CRP and the CC (Y402H) genotype in the CFH gene resulted in superadditivity of risks, with an OR of 19.3 (95% CI 2.8–134) for late AMD and 6.8 (95% CI 1.2–38.8) for AMD progression [146]. In another study, CRP was significantly associated with choroidal thinning in patients with iAMD (p = 0.01) [88].
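For readers less familiar with the odds ratios quoted above, the calculation is simple 2 × 2 arithmetic. The counts in the sketch below are hypothetical and chosen only to produce an OR of a similar magnitude to the CRP association; they are not data from the cited studies.

```python
def odds_ratio(exposed_cases: int, exposed_controls: int,
               unexposed_cases: int, unexposed_controls: int) -> float:
    """Odds ratio from a 2x2 table: (a/b) / (c/d)."""
    return (exposed_cases / exposed_controls) / (unexposed_cases / unexposed_controls)

# Hypothetical counts: 30 of 100 participants with elevated CRP have late AMD,
# versus 12 of 100 participants with normal CRP.
print(f"OR = {odds_ratio(30, 70, 12, 88):.1f}")  # ~3.1
```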

High plasma levels of high-density lipoprotein cholesterol (HDL-C) have been reported to be a risk factor for progression to late AMD [147, 148]. HDL-C levels one standard deviation higher than the mean have been associated with ORs for AMD of 1.17 in European populations (p < 0.001) and 1.58 in Asian populations (p < 0.001) [147]. Regarding newer findings, one study found that patients who progressed to late AMD (median time to conversion: 25.2 months) could be differentiated from those who did not progress according to levels of lysozyme C, trefoil factor 3, ribonuclease KS6, and SAP3 [149]. Although these biomarkers could indicate the risk of progression for an individual patient, more work is needed to validate the findings and to determine how best to use them to assess the risk of progression.

Other markers of chronic inflammation, most notably interleukin (IL)-6, have been associated with late-stage AMD; one systematic review found that although early AMD was not strongly associated with elevated IL-6 levels, late AMD (both GA and nAMD) was associated with significantly elevated IL-6 levels (p = 0.003) [150]. In one small prospective study, elevated plasma levels of both IL-6 and IL-8 were apparent in patients with GA compared with healthy controls; furthermore, plasma levels of IL-6 correlated with GA enlargement rate (R² = 0.23; p = 0.0035) [151]. However, it is not yet clear what happens to IL-6 levels in earlier stages of disease, and as such we do not know whether plasma IL-6 levels could be used as a biomarker for risk of progression to late AMD.

Plasma metabolomics may also provide insight into the underlying pathophysiology of AMD and its risk of progression. One study identified eight baseline metabolites significantly (p < 0.01) associated with AMD progression at 3 years: N6,N6,N6-trimethyl-L-lysine, phenylalanine, methylsuccinate, N-methyl-hydroxyproline, ribitol, N-palmitoyl-sphingosine, pregnenolone disulfate, and 1-linoleoyl-2-linolenoyl-GPC [152]. The most significant associations with progression were for ribitol (p < 0.0002) and pregnenolone disulfate (p < 0.0014) [152]. These are very early results, and further research is needed to validate them.

Genetic Markers

A vast array of genes is associated with the risk of developing AMD; the largest genome-wide association study published to date identified 52 common and rare single nucleotide polymorphisms (SNPs) across 34 loci [153], although these numbers have since increased, with additional studies now reporting at least 103 loci and 69 SNPs [154, 155]. Genetic associations have strongly implicated the complement system as playing a central role in the pathophysiology of AMD, with the first identified genetic association with AMD being CFH. Since then, SNPs at other loci in the complement system have been linked to AMD, including CFB, C3, and C2 [156,157,158]. In addition, another strong genetic association has been identified with variants in the ARMS2/HTRA1 locus [158,159,160]. The majority of risk genes identified to date are associated with the complement, lipid metabolism, and extracellular matrix pathways [46, 160]. Adjusting for phenotype and demographic variables, and depending on methodology, approximately 40–80% of incident AMD can be attributed to genetic factors [161]. How exactly these genes interact with lifestyle risk and aging to contribute to AMD risk, or its progression, remains unclear.

Identifying genetic associations that are specific for AMD disease progression has, to date, not been particularly fruitful. However, one study reported that the TT ARMS2/HTRA1 genotype for rs10490924 increases the risk of late AMD tenfold [159], and another reported that people who carry a risk haplotype at ARMS2/HTRA1 progressed to late AMD an average of 9.6 years earlier than those without the risk haplotype [160]. Furthermore, patients with risk variants in both CFH and ARMS2/HTRA1 appear to progress earlier than those with risk variants in CFH alone [158]. Current work is focusing on compiling genetic data into polygenic risk scores for AMD, which may help identify the risk of progression as distinct from the risk of the disease per se [162, 163]. As our understanding of the genetics underlying AMD deepens, more key pathways will be highlighted that could be targeted with interventions, exemplified by current activity around the complement pathway [164].
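A polygenic risk score is, in essence, a weighted sum of an individual's risk-allele counts, with weights usually taken from GWAS effect sizes (log odds ratios). The sketch below shows only that arithmetic; the SNP identifiers are given for orientation and the weights are placeholders, not values from any validated AMD risk score.

```python
import math

# Placeholder per-allele weights (log odds ratios); a real AMD polygenic risk score
# would use published GWAS effect sizes across many more loci.
weights = {
    "CFH_rs1061170": 0.8,
    "ARMS2_rs10490924": 0.9,
    "C3_rs2230199": 0.3,
}

def polygenic_risk_score(risk_allele_counts: dict) -> float:
    """Weighted sum of risk-allele counts (0, 1, or 2 copies per SNP)."""
    return sum(weights[snp] * risk_allele_counts.get(snp, 0) for snp in weights)

genotype = {"CFH_rs1061170": 2, "ARMS2_rs10490924": 1, "C3_rs2230199": 0}
score = polygenic_risk_score(genotype)
print(f"PRS = {score:.2f} (odds scale ~ {math.exp(score):.1f}x vs. zero risk alleles)")
```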

Use of Biomarkers for iAMD in Clinical Trials

Some of the biomarkers described above have been included in interventional iAMD clinical trials, although in many cases as secondary or exploratory rather than primary endpoints (Table 1). The primary endpoint of the completed interventional LEAD study (NCT01790802) in patients with high-risk iAMD was disease progression, defined using a multimodal imaging endpoint comprising nGA as well as traditional late-stage disease, with OCT, fundus autofluorescence (FAF), and CFP imaging used to define these endpoints. This was the first study to use combined structural endpoints to facilitate the feasibility of early intervention studies in AMD [48]. In the LEAD study, change in drusen volume [48, 99], LLVA, and microperimetry mean sensitivity were also included as secondary endpoints [99]. A few other interventional studies have included assessment of drusen (area or volume) as an endpoint for patients with iAMD: a completed Phase I trial examining elamipretide (NCT02848313) [165], a completed pilot study of supplement T7082 (NCT04778436), the ongoing Phase II DELPHI trial examining atorvastatin (NCT04735263), and the ongoing Phase II trial of QA108 granules (NCT05562219); results are not yet available. There is also one ongoing study that includes conversion from iRORA to cRORA as an endpoint to evaluate the progression of iAMD (the REVERS trial, NCT05056025) and one assessing development of new iRORA: a Phase II trial of iptacopan (NCT05230537). A further ongoing Phase Ib trial examining ASP7317 includes changes in the ellipsoid zone as an endpoint for patients with either iAMD or GA (NCT03178149).

Table 1 Summary of clinical trial endpoints

Studies examining the natural history of iAMD will provide a wealth of information on biomarkers that could be used to assess progression from iAMD to late AMD, and a number of the biomarkers discussed here are included as endpoints in these observational studies. The Duke FEATURE study (NCT01822873; N = 101) examined longitudinal changes in visual function metrics in patients with iAMD over 24 months, including dark adaptation, red cone-specific contrast, and microperimetry sensitivity [121]. A slow, non-linear functional decline was reported, with a potential structure-function relationship noted among RPD, hyperreflective foci, and dark adaptation [121]. Endpoints in the MACUSTAR study (NCT03349801; N = 718) included structural, functional, and patient-reported outcomes, such as LLVA and microperimetry, measured at 6-month intervals over ≥ 36 months, with results expected to be reported in the coming years [136]. HONU (NCT05300724; planned N = 400) is an ongoing natural history study that will examine the rate of conversion from iAMD to iAMD with nGA and iRORA, and then to GA and cRORA. In addition, the BIRC-01 and BIRC-02 observational studies (NCT04469140; N = 450, and NCT03688243; N = 225) include assessment of drusen volume as a secondary endpoint, along with changes in choroidal perfusion parameters. The PINNACLE study (NCT04269304; N = 428) is examining patients with iAMD over the course of 1–3 years using a range of OCT, OCTA, and autofluorescence endpoints and will use machine learning to predict disease progression. The longitudinal ALSTAR2 study (NCT04112667; N = 480) is examining patients with either no disease or early AMD over a 3-year period to assess structure-function relationships in aging and early AMD [166]. Finally, the Immuno AMD Study will examine markers of immunosenescence in blood samples, using proteomics to assess biomarker expression in patients at different stages of AMD [167], although further details are yet to be released.

Role of Artificial Intelligence in the Detection and Quantification of iAMD Biomarkers

Emerging AI methods may be used to automatically detect and quantify iAMD biomarkers. A potential advantage of automated detection and quantification is the ability to identify a large number of biomarkers in a cost-effective, high-throughput manner. This will allow generation of additional evidence, expedite screening for certain predefined characteristics of AMD, contribute to building a consensus set of biomarkers for iAMD, and assist in the selection of appropriate patients for clinical trials. A number of methodology papers have been published in the rapidly developing field of AI and iAMD biomarkers, as well as papers applying an increasing number of algorithms to detect specific biomarkers.

One group has recently developed a deep-learning model to classify the presence of iRORA and cRORA on an OCT B-scan [168]. The model predicted the presence of iRORA and cRORA within the entire OCT volume with high area under the receiver-operating characteristic curve (AUROC) performance (N = 1117 volumes; iRORA, 0.61; cRORA, 0.83) [168]. The OptinNet deep-learning model has been trained to identify “points of interest” in spectral-domain OCT (SD-OCT) scans of patients with AMD; it classified the drusen, RPE, retinal nerve fiber, and choroidal layers as of interest in 337 scans of 62 eyes with AMD [169]. Deep learning has also been employed to detect RPD on OCT scans; agreement with human graders was 0.6, versus 0.68 agreement between two human graders [49].

In addition, a deep-learning system has been developed by the AREDS2 Deep Learning Research Group to classify AMD according to the presence of RPD, GA, and pigmentary abnormalities [170]. Similarly, CFP images from the AREDS study (N = 4139 participants) have been used to build an automated AMD stage classification model (“iPredict-AMD”), which achieved 99.2% accuracy [171]. Multimodal deep learning approaches have also been used to combine OCTA and structural OCT data to predict AMD biomarkers with resulting accuracy of up to 90% [172].

Machine learning has also been used to predict progression in patients with AMD, with one machine-learning model used to assess conversion from iAMD to late AMD [31]. In that study, the most critical quantifiable features for progression were outer retinal thickness, hyperreflective foci, and drusen area; for GA specifically, the model had an AUROC of 0.80 when differentiating between converting and non-converting eyes over 24 months [31]. A machine-learning approach has also been used to generate a risk-stratified classification for AMD progression in patients with iAMD based on OCT data and demographic variables [173]. In one study that compared predictive models for progression to late AMD based on manual CFP grading and/or automated OCT imaging analysis, the AUROC for each individual model was similar (0.80 and 0.82, respectively) [174]; the combined model was not notably superior to either imaging model alone (AUROC 0.85), suggesting that the AI method used in this study was not significantly better at predicting progression than traditional manual CFP grading [174]. Finally, one deep-learning model that is still under development had AUROCs of 0.945 and 0.937 for predicting short-term progression from iAMD to GA (in the current or following year, respectively) based on qualitative and quantitative SD-OCT features [175]. AI models are improving rapidly, and the hope is that in time they will provide better prediction of AMD progression to vision-threatening disease than clinician assessment alone can currently provide.
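For context on the AUROC values quoted throughout this section, the metric can be read as the probability that a randomly chosen progressing eye receives a higher predicted risk than a randomly chosen non-progressing eye. The sketch below computes it with scikit-learn on synthetic labels and scores; the data are invented for illustration and are not from any of the cited studies.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic example: 1 = eye progressed from iAMD to late AMD within 24 months,
# 0 = did not progress; scores mimic a model's predicted probability of progression.
y_true = rng.integers(0, 2, size=200)
y_score = np.clip(0.3 * y_true + rng.normal(0.4, 0.2, size=200), 0.0, 1.0)

print(f"AUROC = {roc_auc_score(y_true, y_score):.2f}")
```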

Challenges of Using AI

If AI models are able to rapidly and accurately identify key biomarkers under investigation in AMD, this could speed up evidence generation and assist in validation of these biomarkers. Such classification methods could expedite screening of patients with AMD who meet certain predefined criteria. However, much more needs to be achieved before they can be widely implemented. Foremost among the issues to be addressed are a lack of standardization in imaging protocols, competing methodologies, and the limited datasets available for testing machine-learning methods. This currently leads to low external validity and reproducibility, as well as inconsistency in reported metrics. In addition, large imaging datasets often lack the required metadata, such as demographic data or inclusion criteria, for the development of reliable, generalizable models [176].

In 2021, the American Academy of Ophthalmology in partnership with the National Eye Institute produced a set of recommendations to standardize imaging datasets, recommending that machine-readable, discrete data should be provided and that lossless compression of pixel/voxel data be used to encode raw data [177]. In addition to this, in 2022 the Collaborative Community on Ophthalmic Imaging released guidance on the foundational considerations for use of AI for retinal images in ophthalmology, emphasizing the importance of nonmaleficence, autonomy, and equity in the design, validation, and implementation of AI systems [178]. However, until there is implementation of these consensus guidelines on imaging protocols, as well as regulatory approval to be used outside a research setting, using AI effectively in clinical trials and ophthalmology in general will remain challenging.

Conclusions

The investigation of biomarkers to predict and monitor iAMD progression to vision-threatening late-stage disease is an area of intense activity. There is potential to greatly improve our ability to research earlier stages of AMD and improve patient outcomes in a disease that is ever increasing in prevalence. However, further work is needed to establish the most promising biomarkers and their use in clinical care and future trials. Central to the discussion of biomarkers for AMD is the need for markers that predict, and can be used to follow, disease progression itself, as well as those that might be used as endpoints in interventional clinical trials. Ongoing work to identify and validate specific biomarkers that indicate the risk of disease progression will improve our ability to assess, counsel, and track the patients at greatest risk of progressing from iAMD to late-stage AMD, as well as to identify other biomarkers that might act as early surrogate endpoints in clinical trials.

Among the many biomarkers currently being evaluated or developed for use in detecting iAMD progression, structural biomarkers such as drusen volume, pigmentary abnormalities, and early signs of atrophy have the most supporting evidence and are already proving useful in clinical trials. Some emerging biomarkers merit further investigation and development, with the ultimate goal of being able to predict and measure progression in AMD. The inclusion of functional markers will also be critical as we move towards identifying interventions that appear to slow progression of disease and seek regulatory approval. These need to include parameters other than BCVA, with changes in perimetry, dark adaptation, and contrast sensitivity appearing to be the most promising early parameters at present. Retinal sensitivity assessed through microperimetry is considered one of the most reliable biomarkers of topographic retinal function in iAMD but is not currently used in regular clinical practice and requires expertise to administer; though informative, at present it would not alter disease management. More work is needed to establish the precise time periods over which changes in these biomarkers can be accurately and reproducibly measured and to find the most pragmatic approach to measuring functional changes in large clinical trials.

Genetic biomarkers may be useful for stratifying patients into different risk categories. Further work is required to identify serum biomarkers that could be used to predict the risk of disease or to detect progression. The idea of using the development of risk factors as endpoints in trials is being explored, as is the potential role of AI in supporting these endeavors.

Critically, a continued dialogue with regulatory authorities on the establishment of clinical trial endpoints must be maintained to enable the design of earlier interventional trials so that potential treatments can be tested in a timely and cost-effective manner. In addition to the work on structural and functional biomarkers, the development of better PROs will support future regulatory approval and therefore is pivotal to the future of research in AMD.

Although many unanswered questions remain, new developments in the rapidly progressing fields of imaging, functional testing, AI, genetics, and serum biomarker assessments are likely to yield future opportunities that are currently impossible to predict. A multimodal, multifactorial approach may ultimately yield the most accurate means of monitoring and predicting progression towards vision-threatening, late-stage AMD.