Background

There has recently been a substantial decline in malaria incidence in Africa [1, 2]. Some of this decline can be explained by the massive deployment of insecticide-treated bed nets (ITNs), the introduction of artemisinin combination therapy (ACT) and, in some places, the use of indoor residual spraying (IRS). However, these intervention effects alone do not explain all of the changes that have been observed. In some places, vector populations have fallen despite the absence of organized vector control programmes [3], while in others there has been no decline at all [4].

Scaling-up of interventions creates the risk that parasites will develop resistance to drugs [5, 6], or that vectors will develop physiological [7, 8] or behavioural [9-11] resistance to insecticides. However, not all selective pressures generated by interventions make malaria control more difficult. For instance, there is some evidence that parasites can be selected for renewed sensitivity to chloroquine [12], albeit potentially only on a short-term basis. Other pressures placed upon mosquito vectors by the rapid scaling-up of ITNs and IRS may have the potential to drive selection for new behaviours or phenotypes that could reduce their capacity to transmit malaria.

Presentation of hypothesis

The major African vectors Anopheles gambiae sensu lato and Anopheles funestus have adapted over many generations to feed on sleeping humans in mud-brick dwellings, and then to rest indoors while digesting their blood meal. This behaviour is under threat from the massive scaling-up of ITNs and IRS [2], which makes host-seeking and resting inside houses far more hazardous for mosquitoes. Concurrently, economic development is improving living conditions in many African countries and increasing the prevalence of houses with corrugated-iron roofs, brick walls and screened windows, which are more difficult for mosquitoes to enter and provide poor resting places for them [13]. These factors combine to increase the fitness costs associated with feeding on humans (anthropophagy), and have the potential to generate selection for a shift towards feeding on non-human animals and/or biting and resting outdoors [14-16]. Such shifts may be especially likely if the fitness costs associated with adopting these novel phenotypes are minor [17].

An additional, previously unconsidered and potentially beneficial evolutionary consequence of control measures targeted at the main anthropophilic vectors of malaria in Africa is selection for a shift in life history that reduces their ability to sustain parasite transmission. Like most organisms, mosquito vectors face trade-offs between investment in reproduction and survival [18], and malaria parasites are intimately dependent on the latter: parasites such as Plasmodium falciparum require at least 12 days of development inside their vectors before they can infect a new host [19]. However, mosquito longevity in the wild is often low, with < 10% of An. gambiae sensu stricto females surviving long enough for parasites to complete their incubation [20]. Consequently, any mosquito life-history shift in favour of short-term reproduction at the expense of longer-term survival would greatly reduce mosquito transmission potential. Evolutionary approaches to reducing malaria transmission by targeting mosquito survival have recently been proposed in the context of late-life-acting insecticides [e.g. [21-23]]. That approach is thematically linked to the hypothesis presented here in its general aim of deploying vector control interventions that do not prompt a detrimental evolutionary response in mosquitoes that would hinder control (e.g. insecticide resistance). However, the approaches differ: the 'evolution-proof' approach requires that the control measure places little or no selection on mosquitoes, whereas the current hypothesis requires that selection is generated, but in the direction of reduced intrinsic mosquito survival.
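As a rough numerical illustration of this point, the minimal sketch below (the daily survival probabilities are illustrative assumptions under a simple geometric survival model, not field estimates from the cited studies) shows how steeply the fraction of a cohort outliving a 12-day incubation period falls with daily survival:

```python
# Fraction of a mosquito cohort surviving the ~12-day extrinsic incubation
# period (EIP) of P. falciparum, assuming a constant daily survival
# probability p (geometric survival model; p values are illustrative).

EIP_DAYS = 12

for p in (0.95, 0.90, 0.85, 0.80):
    surviving = p ** EIP_DAYS
    print(f"daily survival {p:.2f}: {surviving:.1%} of females outlive the EIP")
```

Under these assumptions, a daily survival of 0.80 leaves fewer than 7% of females alive at the end of the incubation period, broadly consistent with the < 10% figure cited above.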

Testing the hypothesis

Evolutionary theory [24, 25] and empirical studies demonstrate that increased exposure to sources of extrinsic mortality can generate selection for increased intrinsic mortality [26, 27]. In brief, if external factors make long life improbable, organisms are pushed to prioritize early reproduction at the expense of longevity. Laboratory studies have demonstrated that this phenomenon can occur in arthropods exposed to life-shortening pathogens [28]. By similarly increasing extrinsic mortality, the dissemination of insecticidal interventions could thus place intense selection on malaria vectors for increased intrinsic mortality. Since malaria transmission is highly sensitive to the survival of the adult female mosquito, even a small increase in intrinsic mortality could have profound epidemiological benefits [29]. Indeed, this sensitivity represents the theoretical rationale for the use of both IRS and ITNs as tools for reducing malaria transmission.
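The source of this sensitivity can be made explicit. In the standard MacDonald/Garrett-Jones formulation (given here for orientation; the notation is conventional rather than taken from the cited sources), the vectorial capacity C of a vector population is

$$C = \frac{m\,a^{2}\,p^{n}}{-\log_e p}$$

where m is the vector-to-human ratio, a is the daily human-biting rate, p is the daily survival probability of the adult female, and n is the duration of sporogony in days. Because p enters both as p^n and through -log_e p, C responds non-linearly, and very steeply, to changes in survival; the ratio formula presented below follows directly by dividing two such expressions.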

Unfortunately, testing this hypothesis is not easy. Comparing the fecundity and intrinsic mortality of different wild-caught mosquito populations in relation to their extrinsic mortality rates would be logistically challenging. Recent progress in the establishment of semi-field systems for the experimental study of mosquito ecology [e.g. [30]] may increase the feasibility of comparing intrinsic mortality in the progeny of wild-caught mosquitoes. However, estimating the relative contribution of intra-specific genetic variation to geographical differences in mosquito mortality, given the ecological and taxonomic complexity of these populations, will remain a challenge. As the methodological approaches for studying these phenomena in wild vector populations develop, some a priori hypotheses can be generated from research previously conducted on wild Drosophila populations [26], dipterans often used as model organisms for Anopheles genetics and physiology.

In a series of long-term selection experiments conducted in an insectary, Stearns and colleagues [26] exposed a variety of strains of Drosophila to different extrinsic mortality regimes over a period of five years (50-90 generations). Selection under high extrinsic mortality resulted in both an increase in fecundity and a small increase in (intrinsic) mortality when assayed in the absence of selection (Figure 1) [26]. The median survival time of the high intrinsic mortality lines (58.4 days) was only 7.7% lower than that of the low intrinsic mortality lines (63.3 days). Yet, because of the non-linear relationship between mosquito survival and vectorial capacity originally described by Garrett-Jones [31], a difference in relative survival of this magnitude would correspond to an approximately 80% reduction in transmission in a typical endemic setting (Figure 2). Within the Ross-MacDonald model, a reduction in the daily survival probability of the vector from p1 to p2 reduces the vectorial capacity from C1 to C2 as:

Figure 1

Kaplan-Meier curves comparing intrinsic mortality of Drosophila populations after selection. Dashed line: low extrinsic mortality selection regime; continuous line: high extrinsic mortality selection regime. Data replotted from [26].

Figure 2

Predicted relationship between reductions in adult malaria vector survival and their transmission potential. The reduction in vectorial capacity was calculated using Garrett-Jones' original formula for the vectorial capacity, assuming a daily survival of 90% in the original vector population, and a 12-day duration of malaria sporogony. The vertical arrow corresponds to a 7.7% reduction in survival.

$$\frac{C_2}{C_1} = \left(\frac{p_2}{p_1}\right)^{n}\,\frac{\log_e p_1}{\log_e p_2}$$

where n is the number of days required for the parasite to complete its extrinsic incubation period [31]. On this basis, it appears possible that the increased extrinsic mortality imposed on mosquitoes by the high coverage of long-lasting insecticidal nets (LLINs) now achieved in many parts of sub-Saharan Africa could select for increased intrinsic mortality, with a substantial impact on malaria transmission over and above the immediate protection that LLINs provide to communities.
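The approximately 80% figure quoted above can be reproduced from this ratio. The short sketch below adopts the assumptions stated in the Figure 2 caption (a baseline daily survival of 90% and a 12-day sporogonic period) and interprets the 7.7% survival difference as a proportional reduction in daily survival probability:

```python
import math

def vc_ratio(p1: float, p2: float, n: int) -> float:
    """Garrett-Jones ratio C2/C1 of vectorial capacities when daily
    survival falls from p1 to p2, for an n-day sporogonic period."""
    return (p2 / p1) ** n * math.log(p1) / math.log(p2)

p1 = 0.90               # baseline daily survival, as assumed in Figure 2
p2 = p1 * (1 - 0.077)   # 7.7% relative reduction in daily survival
n = 12                  # days of sporogony, as assumed in Figure 2

ratio = vc_ratio(p1, p2, n)
print(f"C2/C1 = {ratio:.2f} -> a {1 - ratio:.0%} reduction in vectorial capacity")
```

Under these assumptions the ratio evaluates to roughly 0.22, i.e. a reduction in vectorial capacity of close to 80%, matching the figure quoted in the text.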

Implications of hypothesis

The theoretical possibility that such selection could occur does not demonstrate that it is playing a role in recent reductions in malaria transmission in Africa. Amongst other considerations, the magnitude of the hypothesized survival effect depends on all other epidemiologically relevant aspects of vector ecology remaining the same (e.g. the vector-human ratio, the blood-feeding rate on humans, the duration of sporogony). Any correlated change in these parameters prompted by intervention use could magnify or diminish the transmission effects proposed here. For example, the enhanced investment in short-term reproduction hypothesized here could be manifested as an increase in the number of eggs laid in one clutch. This could intensify competition during larval development, which, through its connection with mosquito population growth [32] and adult survival [33], might prompt an even greater transmission reduction. Proof of the existence and epidemiological importance of such a phenomenon will require a multi-pronged approach including at least: (1) confirmation of the heritability of intrinsic mortality and reproductive schedule in wild vector populations [34]; (2) evidence of standing genetic variation in these traits; (3) demonstration that the fitness costs imposed by these interventions are of sufficient magnitude to alter the fecundity-longevity trade-off; and (4) development of better tools for tracking the age and fecundity of mosquitoes in natural populations, and their response to increases in vector control coverage. Furthermore, as some empirical studies have shown that extrinsic mortality pressure can select for a net increase in lifetime reproductive success when assayed in the absence of that pressure [35], caution would be required to ensure that any epidemiological advantages arising from increased vector mortality are not undermined by an upsurge in the size of vector populations.

The possibility of the phenomenon described here has important consequences. First, it provides a reminder that the evolutionary processes induced by interventions against disease vectors may not always act to neutralize intervention effectiveness. Second, it argues that large-scale distribution of highly effective interventions could have unpredictable effects. The selective pressures on life-history traits depend on local ecology: for example, mosquitoes with high innate fecundity may have the greatest advantage in areas of abundant rainfall where larval habitat is not limiting, whereas in areas with long dry seasons selection may favour mosquito longevity over short-term reproduction [36]. Mosquito evolutionary responses to insecticidal interventions against malaria (e.g. insecticide resistance) have been correlated with a reduction in the ability to transmit other pathogens [37], suggesting that the selection imposed by these measures could have numerous unanticipated effects on disease risk. The complexity of the parasite-vector-human-environment system will continue to present challenges for prediction. In the search for new control strategies, wide consideration should be given to both the potential epidemiological disadvantages and advantages of the evolutionary processes resulting from their implementation.