Background

Domestic insecticide interventions such as pyrethroid-treated bednets can substantially lower morbidity and mortality [1] and remain the most commonly advocated methods for malaria prevention. Bednets have revitalized interest in vector control of malaria in sub-Saharan Africa, where high transmission levels result in extremely stable malaria prevalence, incidence and clinical burden [2–4]. Insecticide-treated nets protect their occupants by diverting host-seeking vectors to look for a blood meal elsewhere and by killing those that attempt to feed [5, 6]. Treated nets can therefore also prevent malaria in unprotected individuals by suppressing vector numbers [7–9], survival [7–9], human blood indices [10, 11] and feeding frequency [11] in local populations. However, the results of individual studies often differ: although some trials with African vectors have demonstrated substantial reductions of vector density, survival and sporozoite prevalence [7–9], others have found little or no effect on the vector population as a whole [12–14]. These instances where bednets appear to have little effect upon vector populations have been attributed to various factors, including behavioural adaptation and dispersal between control and treatment villages [13, 15, 16], but here we explore the possibility that the ability of vectors to avoid interventions [17, 18] may also contribute to such apparent shortcomings.

Presentation of the Hypothesis

Suppression of transmission over large areas depends upon population-level exposure of vectors to the intervention and this, in turn, depends upon the level of coverage within the human community. Adult vectors, however, can avoid many commonly used insecticides [17], so effective coverage may not be equivalent to the absolute coverage of humans and may be considerably less if vectors evade it. By avoiding covered humans, vectors may redistribute their biting activity towards those who are not covered by personal protection measures such as treated bednets. Larval stages of mosquitoes, in contrast, are of relatively low mobility compared with flying adults, so humans must bring the control to them rather than vice versa. We therefore hypothesize that the control of adult, but not immature aquatic-stage, mosquitoes is compromised by the ability of the former to avoid interventions such as excito-repellent insecticides, including bednet impregnation treatments and indoor residual sprays.

Testing the Hypothesis

For the purposes of this analysis, we define effective coverage as the proportion of the vector population that will be exposed to the intervention under given levels of absolute coverage and at a given ability to detect and avoid the intervention. We consider that at any given level of coverage, the vector population equilibrates between covered humans (C) and uncovered humans (U = 1 - C), in accordance with their propensity to avoid (α) the intervention measure, resulting in a steady-state proportion of the vector population that is covered (C*) and uncovered (U* = 1 - C*):

U* / C* = α U / C

Solving for C* in terms of C yields:

C* = (C / (α(1 - C))) / (1 + C / (α(1 - C))) = C / (C + α(1 - C))
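As a numerical sketch (function names are ours), the equilibrium above can be evaluated directly from the equilibrium condition U*/C* = αU/C:

```python
def effective_coverage(C: float, alpha: float) -> float:
    """Steady-state proportion of the vector population exposed to the intervention.

    From the equilibrium condition U*/C* = alpha * U/C, with U = 1 - C and
    U* = 1 - C*, which rearranges to C* = C / (C + alpha * (1 - C)).
    """
    return C / (C + alpha * (1 - C))

# With no avoidance (alpha = 1), effective coverage equals absolute coverage:
print(effective_coverage(0.5, 1.0))   # 0.5
# A ten-fold preference for uncovered hosts collapses effective coverage:
print(effective_coverage(0.5, 10.0))  # ~0.09
```

Note that α = 1 recovers C* = C, while any α > 1 pushes effective coverage below absolute coverage at every intermediate level of C.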

Here we model the effects typically expected from insecticide-impregnated bednets in African settings, using the Kilombero valley region of Tanzania as an example, with a well-studied vectorial system dominated by An. arabiensis Patton. On the basis of detailed experimental hut trials [5, 6], we consider that bednets could approximately halve the baseline values for both the proportion surviving per feeding cycle (Pf) and the proportion of blood meals taken from humans (Qh) for vectors effectively covered by the intervention. Thus, these key determinants of the entomological inoculation rate (EIR) are estimated as weighted averages of those expected for the covered and uncovered populations:

Pf* = Pf (1 - C*) + 0.5 Pf C*

Qh* = Qh (1 - C*) + 0.5 Qh C*
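These weighted averages can be sketched numerically; the baseline values below are illustrative placeholders, not the published Kilombero estimates from [4]:

```python
def effective_coverage(C: float, alpha: float) -> float:
    # Equilibrium exposure: C* = C / (C + alpha * (1 - C)), from U*/C* = alpha * U/C.
    return C / (C + alpha * (1 - C))

def covered_weighted_mean(x: float, C_star: float, reduction: float = 0.5) -> float:
    # Weighted average over uncovered and covered vectors, with the covered
    # fraction experiencing the stated proportional reduction (here, a halving).
    return x * (1 - C_star) + reduction * x * C_star

# Hypothetical baselines for survival per feeding cycle and human blood index:
Pf, Qh = 0.9, 0.95
for alpha in (1.0, 10.0):
    C_star = effective_coverage(0.5, alpha)
    print(f"alpha={alpha}: Pf*={covered_weighted_mean(Pf, C_star):.3f}, "
          f"Qh*={covered_weighted_mean(Qh, C_star):.3f}")
```

At α = 10 and 50% absolute coverage, C* falls to roughly 9%, so Pf* and Qh* barely move from their baselines: the attenuation acts entirely through the collapse of effective coverage.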

Based on these estimates we calculate the expected human biting rate, sporozoite prevalence and EIR for Namawala, a well characterized holoendemic village as previously described [4], at varying levels of coverage with bednets and varying levels of avoidance by vectors.

The predictions of our model indicate that avoidance behaviour by vectors could severely undermine the effective coverage achievable by bednet programmes, particularly at low and intermediate levels of coverage (Figure 1). Given the robustness of clinical malaria burden to reductions of transmission intensity [3], such attenuation is of appreciable epidemiological significance. For example, in the absence of any avoidance behaviour (α = 1), bednets at an absolute coverage of 50% were predicted to reduce annual EIR from 246 to 22 infectious bites per year, whereas the same level of coverage with a ten-fold preference of vectors for uncovered versus covered areas (α = 10) would be expected to yield an EIR of 161, with only minor reductions of biting rate and sporozoite prevalence (Figure 1). In simple terms, this makes the difference between a programme that can significantly lower the risk of clinical malaria in unprotected individuals and one that cannot [3]. This trend is also clear in the major underlying determinants of EIR: avoidance can almost completely negate the effects of bednets upon vector survival (Pf*) and human blood index (Qh*) at absolute coverage levels of up to 50%. Although less attenuation is observed at higher levels of absolute coverage, such levels are rarely achieved in real programmes and, even then, avoidance can still considerably undermine the ability of bednets to lower or destabilize transmission in an endemic area (Figure 1).

Figure 1

The predicted effects of insecticide-treated bednets upon vector bionomics and malaria transmission as a function of the ability of mosquitoes to avoid them. The effects of increasing absolute coverage (C) upon effective coverage (C*), survival per feeding cycle (Pf*), human blood index (Qh*), annual human biting rate (Bh), sporozoite prevalence (S) and annual entomological inoculation rate (EIR) are depicted as a function of increasing ability to avoid the intervention (increasing α).

The impacts we predicted for vector populations with moderate to high levels of avoidance appear more realistic than those without. Indeed, our predictions for mosquitoes which do not avoid bednets are more dramatic than even the most successful field trials [7–9] and remarkably similar to those used to justify the Global Malaria Eradication Campaign based on indoor residual spraying [19, 20]. Large differences in the excito-repellency of pyrethroid formulations have been reported [6] and may help explain the contrast between bednet programmes which do exert community-level effects [7–9] and those which do not [12–14], supporting the view that insecticide formulations should minimize excito-repellency to maximize effects at the community level.

The huge number of lives that bednets could save remains difficult to realize in practice because of difficulties in maintaining high absolute coverage [21, 22]. Furthermore, vector dispersal can often spread the effects of bednets over wide areas, sometimes making their impact difficult to measure [15, 16, 23]. On the basis of the modeling analysis presented here, we conclude that the effectiveness of bednets may also be restricted by the limiting effects of vector avoidance upon effective coverage. The effectiveness of malaria control programmes is crucially dependent not only upon the extent of coverage but also upon the ability to target the most intense foci of transmission [24, 25]. Thus, adulticide-based control may be limited by the constantly shifting distributions of biting vectors [26–28] and by their ability to avoid interventions. A number of field studies have shown that vectors prevented from feeding upon individuals protected by treated nets are not diverted to unprotected humans in the same dwelling or those immediately nearby [5, 29, 30]. However, excito-repellent bednet treatments and indoor residual sprays are known to lower human blood indices in vector populations when applied at the community level [7, 10, 11, 18], so mosquitoes that are deterred from covered homes probably do feed elsewhere upon whatever unprotected humans and alternative hosts are available. Thus it seems that vector biting density may be redistributed to unprotected humans and livestock, but over longer distances than have been tested thus far. Nevertheless, this concentration of bites upon unprotected people may not manifest itself as an increased biting rate, because it could be counterbalanced by the reduction in the total number of bites taken by the shorter-lived vector population at reasonable levels of bednet coverage (Figure 1).
In conclusion, we suggest that vector avoidance of excito-repellent insecticides may considerably limit the impacts of treated bednets and residual sprays on the vector populations and curtail their ability to suppress malaria transmission at the community level.

In this context it may be worthwhile to consider alternative methods of malaria control that can complement intra-domiciliary insecticide interventions and augment transmission suppression by integrated programmes. Transmission-blocking vaccines and genetically modified mosquitoes will not be available for several years and their chances of success have been seriously questioned [31–33]. In contrast, the campaigns that completely eradicated accidentally introduced An. gambiae from the north-east coast of Brazil [34] and the Nile Valley of Egypt [35] six decades ago remain the only ones ever to have eliminated an African malaria vector species from a large area. In both cases, 100% effective coverage was achieved: no specimen of An. gambiae has since been recorded at either site. Both campaigns were executed almost exclusively by ruthless, well-managed larval control [34, 35]. It has been reasoned that these examples are misleading because An. gambiae had colonized areas to which it was not well adapted [17]. Egypt was indeed the northernmost limit of the range of An. gambiae, but the ecological conditions in Brazil seemed ideally suited to it. Descriptions of the flooding valley of the Jaguaribe River are remarkably similar to those of many holoendemic parts of Africa, including the Kilombero valley, upon which we have based our modeling analysis [4]. Furthermore, adult density reached hundreds per house, and such exceptional levels of infection could only have been possible with well-adapted, healthy, long-lived mosquitoes [34].

The kind of exhaustive and complete control applied during these intensive eradication campaigns could not be sustained indefinitely, especially in the poorest parts of sub-Saharan Africa. However, a clearly documented example of sustained and successful malaria prevention through larval control in sub-Saharan Africa has recently come to light and, once again, this successful endeavour pre-dates the advent of dichlorodiphenyltrichloroethane (DDT) [36]. In that programme, An. funestus and An. gambiae were controlled predominantly by environmental management and regular larviciding, based on simple but rational entomological surveys. Malaria mortality, morbidity and incidence were reduced by 70–95% for two decades at quite reasonable expense [36]. There are many other examples of how larval control using standard insecticides and biological control agents [37] has contributed to malaria control in Africa and its associated islands [38–42], including Mauritius, where local transmission has been sustainably eliminated [43, 44]. However, these are largely descriptive evaluations of operational programmes, and larval control has never been evaluated in Africa through rigorous and specific trials similar to those to which bednets have been subjected [1].

Implications of the Hypothesis

The Global Malaria Eradication Campaign marked a notable departure from larval control and focused on adult control with DDT, based on overly confident interpretation of models that failed to account for the mobility of adult mosquitoes as well as the plasticity and inter-species variability of their behaviour [17]. Larval control does not suffer from such drawbacks and should be integrated with more commonly used approaches such as improved access to screening and treatment, bednets or indoor spraying [4, 25, 45, 46]. Controlling aquatic stages of malaria vectors depends upon finding where and when they occur and targeting them with appropriate intervention measures on a regular and indefinite basis. Given the extensive, diverse and sometimes obscure nature of breeding sites chosen by Afrotropical vectors, this represents a formidable challenge, but one that has proven tractable to organized, well-supported efforts [34–36, 38–44]. Although the historically proven autocratic approaches applied in Brazil, Egypt and Zambia may not be applicable in the increasingly democratic post-colonial Africa of today, relevant administrative capacity and organizational tools, notably mobile phones, geographic information systems and remote sensing data, have become more widely available and could facilitate well-managed abatement programmes in sub-Saharan Africa [25]. Those who eradicated An. gambiae from Brazil and Egypt fully appreciated and exploited its notoriously anthropophilic behaviour. Although the innate preference of this species for human hosts [47] and for larval habitats near them [48, 49] makes An. gambiae a devastatingly efficient vector, it also renders its larvae vulnerable to control because they are often relatively easy to locate in association with human settlements and activities [25, 34, 35].
Surely, with the advent of modern environmentally friendly larvicides [42, 50–52] and geographic information technology [25], similar success can be achieved by determined efforts on the African continent in the near future? The largest obstacles to the implementation of effective larval control in Africa are practical rather than fundamental, because of its dependence on well-organized vertical management and reliable infrastructure. We therefore suggest that, rather than constantly looking for methods that do not have to wait upon economic and political development in Africa, those concerned with malaria control need to participate actively in this process so that malaria research and control capacity can be nurtured as an integral part of infrastructure in endemic nations [53].

Perhaps the most depressing indicator of just how much larval control of African malaria vectors has been neglected is that almost all the greatest successes were reported more than half a century ago. Most of the questions that were asked about the larval ecology of these deadly insects over 50 years ago [17] remain unanswered. We propose that larval control strategies against the vectors of malaria in sub-Saharan Africa should be seriously reconsidered and prioritized for development, evaluation and implementation.