Age-related macular degeneration (AMD) is a sight-threatening disease responsible for 8.7% of blindness globally (1). Neovascular AMD (nAMD) is characterized by abnormal angiogenesis; these atypical vessels leak, resulting in fluid accumulation, haemorrhage and fibrosis, which can lead to rapid central vision loss (2). Although nAMD makes up 15% of total AMD cases (2), it is responsible for the majority of cases of severe visual loss (3).
The advent of vascular endothelial growth factor (VEGF) inhibitors over the past two decades has revolutionized the management of patients with nAMD, significantly reducing rates of severe visual loss (2). However, treatment with VEGF inhibitors involves injections into the eye, which carry a small risk of infection, and the frequent visits and treatments can be burdensome to patients and the healthcare system (2). Up to 15% of patients do not respond well to treatment (4). Identifying prognostic factors early in the patient’s treatment journey could be helpful when counselling patients. Baseline visual acuity (VA), size of the choroidal neovascularization (CNV) lesion and age have been shown to correlate with long-term VA outcomes in other studies (4).
Nguyen et al. explore whether early responses to VEGF inhibitors can help predict 3-year VA outcomes (5). They studied 2051 treatment-naïve eyes in a real-world study from Australia, Switzerland and New Zealand, with data extracted from the Fight Retinal Blindness! registry (5). After patients received an initial loading phase of three monthly VEGF inhibitor injections, treatment schedules were at the clinician’s discretion for the remainder of the 3-year study, although a treat-and-extend approach was used by the majority of centres. They identified that VA measured at the time of the fourth injection was a strong predictor of VA at 3 years (5).
They divided patients into early loss (≤0 letter gain), small early gain (1–5 letter gain) and large early gain (>5 letter gain) groups according to their VA change at the 4th injection. There was significant variability in the mean baseline VA of the different groups: the early loss, small early gain and large early gain groups had mean baseline VAs of 60.1, 63.1 and 49.6 letters, respectively. By 3 years, the early loss group had lost an average of 5.9 letters, the small early gain group had gained 0.7 letters and the large early gain group had gained 12.8 letters. The large early gain group’s low baseline VA but subsequent large VA improvement, compared with the other groups’ higher baseline VA and lesser 3-year gains, corroborates the well-documented “ceiling effect”. This phenomenon describes how eyes with better baseline VA have limited scope to improve and can appear at first glance to do worse than those that start treatment with a much worse VA (2). In reality, maintaining a better VA throughout is associated with a higher quality of life.
Eyes achieving a “good” VA (defined as more than 70 letters) by the 4th injection were ten times more likely to have a good VA at the end of 3 years than eyes with worse vision at that point. This was the only early-response factor that was more predictive of long-term VA than baseline VA.
Patients who achieved a “large gain” by the 4th injection were more likely to sustain this gain at 3 years. In contrast, those in the “early loss” group, with no gain or a loss in VA, were unlikely to achieve a good VA by the end of the 3 years. Nevertheless, 32% of patients in the “early loss” group went on to make VA gains by 3 years. The authors hypothesize that these patients are “late responders”, which is evidence in favour of persisting with injections in eyes without permanent foveal structural damage, even when early outcomes are unfavourable. Further studies are required to identify the appropriate time to consider stopping treatment in such eyes.
They also studied the length of time to CNV lesion inactivity as a predictor of long-term VA outcomes. Eyes with a “short induction” time to lesion inactivity (3 injections or fewer) had better VA at baseline and at 3 years, and were more likely to have a “good” VA at 3 years, than eyes with a “long” induction time (more than 3 injections). However, the change in VA at 3 years was not significantly different between the groups (P=0.09).
The large size of the study and the prospective collection of data lend weight to the findings. However, as with most real-world studies, there was a high non-completion rate: the 37% of patients who did not complete the study were more likely to belong to the “early loss” group. Even so, a small reduction in long-term VA would likely still be better than the natural history of the disease. A further limitation is that VA was recorded in routine clinical practice rather than under the strict protocols of a clinical trial. Although real-world data cannot match the internal validity of a randomized controlled trial, they can be easier to generalize to routine clinical practice.
The VA outcomes of this paper comply with the International Consortium for Health Outcomes Measurement (ICHOM) standard set, a standardized group of patient-centred outcomes recommended to help standardize real-world data collection (6). This was facilitated by collecting data through the Fight Retinal Blindness! platform, which is ICHOM approved. Additional outcomes, including reading index and vision-related quality of life, would be useful to gather in the future to help us further understand how the disease and its treatment affect patients.
In conclusion, this real-world study identified that VA by the 4th injection had the strongest correlation with long-term VA outcomes. However, approximately one third of eyes that did not gain vision by the 4th injection went on to achieve VA gains by 3 years. This is useful information when counselling patients about their long-term visual prognosis.