Introduction

Biotic threats such as insects, weeds, fungi, viruses, and bacteria can broadly affect crop yield and quality. Among these, weeds are the most impactful, causing substantial yield losses worldwide [1]. The best-characterized effect of weeds is competition for resources such as light [2], water [3], space [4], and nutrients [5]. In addition, specific chemical signals and/or toxic molecules produced by weeds may interfere with normal crop development [6]. A distinctive trait of wild species, including weeds, is their high physiological, morphological, and anatomical plasticity, which makes them more tolerant than crop species to environmental stressors [7,8,9,10]. Moreover, weeds interact with other biological components of the environment, acting as a refuge for plant pests such as insects, fungi, and bacteria that can harm nearby crops [11,12,13]. For example, wild oats (Avena fatua L.) can harbor the etiological agents of powdery mildew in crops such as wheat (Triticum aestivum L.), oats, and barley (Hordeum vulgare L.) [14]; altamisa (Parthenium hysterophorus L.) can be a secondary host of the common hairy caterpillar (Diacrisia obliqua Walk.) [15, 16]; and Cyperus rotundus can host the root-knot nematode Meloidogyne graminicola and thus contribute to its spread in the field [17]. Finally, weed infestation may affect the quality of fresh and processed products such as beer, wine, and forage [18, 19]. In this respect, weed residues may cause the accumulation of off-flavor compounds [20, 21] or, in some cases, make products harmful to humans and animals [22, 23]. Weeds may also contain high levels of allergens and/or toxic metabolites that, if ingested, can cause asthma, skin rash, and other reactions [24, 25].

Most weed research aims at developing strategies that reduce the deleterious impact of interspecific competition between crops and weeds, and recent technological advances may further contribute to this goal while improving the sustainability of weed control [26,27,28]. Worldwide, weed competition causes severe yield reductions in all major crops, such as wheat (23%), soybean (37%), rice (37%), maize (40%), cotton (36%), and potato (30%) [1]. Each year, weeds cause 50% yield losses in corn and soybean in North America. For corn, this equates to a loss of 148 million tons, an economic loss of over $26.7 billion [29]. In Australia, yield loss due to weeds accounts for 2.76 million tons of grain from different crops, including wheat, barley, oats, canola, sorghum, and pulses [30]. The annual global economic loss caused by weeds has been estimated at more than US$100 billion [31], despite worldwide annual herbicide sales in the range of $25 billion [32]. In Europe, herbicides are the second most-sold pesticides: they accounted for 35% of all pesticide sales in 2018, surpassing insecticides and acaricides (Fig. 1) [33].

Fig. 1

Percentage (of total volume in kilograms) of pesticide sales by category in Europe in 2018 [33]

Weed management requires an integrated approach

By 2050, the world population is expected to reach 9.15 billion people [34]. However, the predicted increase in food demand will hardly be met by the current production system [35]. In addition, climate change will be a further challenge for the human food supply in the near future [36]. Among all the processes affecting crop productivity, weed management will be one of the hardest challenges [37]. Mechanical and chemical weed control have disadvantages that will likely prevent them from being effective for future weed management [38,39,40,41,42,43]: mechanical methods are scarcely efficient, and herbicides have a high ecological impact. An approach that minimizes the drawbacks of mechanical and chemical weed control is Integrated Weed Management (IWM). IWM combines chemical, biological, mechanical, and/or crop management methods, and represents a model for improving the efficiency and sustainability of weed control [3, 44]. In contrast to traditional methods, IWM integrates several agro-ecological aspects, such as the role of conservation tillage and crop rotation in weed seed bank dynamics [10], the ability to forecast the critical period of weed interference and competition with crops [45, 46], and the specific critical levels of crop–weed interaction [47]. Therefore, effective IWM must rely on a thorough knowledge of crop–weed competition dynamics, which currently represents one of the most active research areas in weed science [48, 49].

New technologies for site-specific weed management

Precision agriculture relies on technologies that combine sensors, information systems, and informed management to optimize crop productivity and reduce environmental impact [50]. Nowadays, precision agriculture has a broad range of applications and is employed in different agricultural contexts, including pest control [51], fertilization, irrigation [52, 53], sowing [54], and harvesting [55]. Precision agriculture can also be effectively applied to IWM. In the last decade, precision agriculture has advanced rapidly because of technological innovations in sensors [56], computer hardware [57], nanotechnology [58], and unmanned vehicle systems and robots [59] that may allow for the specific identification of weeds present in the field [47]. Unmanned aerial vehicles (UAVs) are one of the most successful technologies applied in precision agriculture [60]. Unmanned vehicle systems are mobile aerial (UAV) or terrestrial (UTV) platforms that provide numerous advantages for the execution and monitoring of farming activities [61]. UAVs can be highly valuable since they allow for Site-Specific Weed Management (SSWM) (Fig. 2). SSWM is an improved weed management approach for highly efficient and environmentally safe control of weed populations [28], enabling precise and continuous monitoring and mapping of weed infestation. SSWM makes it possible to optimize weed treatments for each specific agronomic situation [62]. The combination of UAVs with advanced cameras and sensors able to discern specific weeds [63], and GPS technologies that provide geographical information for field mapping, can help in precisely monitoring large areas in a few minutes.
The potential agro-ecological and economic implications of SSWM are remarkable: more accurate planning of weed management can increase the effectiveness of mechanical methods and/or reduce herbicide spread [64], yielding lower production costs, reducing the onset of weed resistance, improving biodiversity, and containing environmental impacts [65]. The application of UAVs to weed control can, therefore, contribute to improving the sustainability of future agricultural production systems, which must meet the demands of an increasing world population [34, 35].

Fig. 2

Site-specific weed management (SSWM) scheme realized by drones and its economic and agro-ecological implications

UAVs remote sensing techniques and sensors

UAVs have become a common tool in precision agriculture [66, 67]. Thanks to their affordability, user-friendliness, and versatility, UAVs are often the primary choice for fast and precise in situ remote sensing or survey operations. These systems can serve different purposes depending on the sensors they carry. Ongoing research is looking at the best solutions for integrating data collected from sensors on UAVs, ground sensors, and other data sources for better management of site-specific operations in the field, with a particular focus on smart agriculture and big data management [68, 69].

Although UAV systems do not offer the same territorial coverage as satellites, they offer a spatial and temporal resolution that other systems do not [70, 71]. From an economic point of view, the use of drones requires an investment in a UAV system with at least a 0.1 cm/px resolution RGB camera, a trained pilot for flight management, and post-processing software capabilities. The initial UAV investment is compensated by the repeatability of flights, which increases the frequency of the datasets delivered, and by the higher resolution compared to other systems [72, 73]. UAV systems also have further advantages: (1) they can collect readily usable data in real time (excluding post-processing); (2) they can be used to survey hazardous and/or hard-to-reach areas; (3) they allow operators to collect data even under unfavorable weather conditions, such as very cloudy or foggy days, in which satellite detection systems fail or produce heavily distorted datasets [71]. The most important sensors available as payload fall into three classes, depending on the number and range of spectral bands they can record:

  • RGB (Red, Green, Blue) or VIS (Visible) sensors

  • Multispectral sensors

  • Hyperspectral sensors

RGB/VIS sensors

RGB or VIS sensors are the most common and most widely available commercial cameras (Table 1). Their possible applications have been the focus of much research for years due to their potential and low operational costs [74, 75].

Table 1 RGB cameras and their main specifications

These sensors are used to calculate vegetation indices such as the Green/Red Vegetation Index (GRVI), Greenness Index (GI), and Excess Greenness (ExG) with acceptable or high levels of accuracy [76, 77]. RGB sensors have also been increasingly used with machine learning techniques for object recognition, phenology, pathologies, and similar purposes. The typical workflow for processing RGB images from UAVs for remote sensing is: (1) pre-flight planning, (2) flight and image acquisition, and (3) post-processing and index or dataset extraction [71]. Phase 1 is critical for collecting data of adequate quality for the purpose. In the pre-flight planning phase, the parameters to consider are the definition of the study area, the flight altitude, site topography, weather forecasts, and local regulations for unmanned flights. In phase 2, it is recommended to verify that the acquisition platform can sustain the required data flow and store the amount of data acquired; an inadequate platform can cause I/O errors, with consequent loss of information or abortion of the mission. In phase 3, for RGB sensors, there is no need to perform radiometric calibration, as is the case when using multispectral and hyperspectral sensors. RGB data can be used as-is or to create a georeferenced orthomosaic. In the latter case, the individual images are rectified, georeferenced using GPS data, and stitched together to form a single image (orthomosaic) covering the entire study area. Orthomosaics can be generated either with the RGB values as they are or after calculating the desired vegetation indices [77]. If RGB images are to be used in machine learning algorithms, the workflow is different [78,79,80,81]: it is necessary to collect a large dataset of images for training and testing the algorithm [82]. This dataset may already be available from third-party sources, such as PlantVillage [83] or PlantDoc [84]. Alternatively, it can be created from scratch if the purpose of the research is not covered by existing datasets [85]. In this case, the acquisition, selection, and processing of the images are critical, because the final dataset can affect both the training and the use of the neural network, with the risk of producing biased results [86].
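The RGB indices mentioned above are simple per-pixel arithmetic on the three visible bands. As a minimal sketch (the pixel values and the 0.2 vegetation threshold are illustrative assumptions, not values from the cited studies), ExG and GRVI can be computed from an RGB array as follows:

```python
import numpy as np

def excess_greenness(rgb):
    """Excess Greenness (ExG) from an RGB image array of shape (H, W, 3).

    Uses normalized chromatic coordinates: ExG = 2g - r - b,
    where r, g, b are each channel divided by the per-pixel channel sum.
    """
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2)
    total[total == 0] = 1.0  # avoid division by zero on fully black pixels
    r, g, b = (rgb[..., i] / total for i in range(3))
    return 2.0 * g - r - b

def grvi(rgb):
    """Green/Red Vegetation Index: (G - R) / (G + R)."""
    rgb = rgb.astype(np.float64)
    g, r = rgb[..., 1], rgb[..., 0]
    denom = g + r
    denom[denom == 0] = 1.0
    return (g - r) / denom

# Toy 2x2 image: left column vegetated (green-dominant), right column bare soil
img = np.array([[[20, 120, 30], [110, 90, 60]],
                [[15, 130, 25], [100, 95, 70]]], dtype=np.uint8)
exg = excess_greenness(img)
mask = exg > 0.2  # assumed threshold separating vegetation from background
```

On real orthomosaics the same functions apply unchanged band-wise; the threshold would typically be tuned per dataset (e.g. via Otsu's method) rather than fixed.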

Multispectral sensors

Multispectral sensors are used for a wider range of vegetation index calculations, as they can rely on a higher number of radiometric bands. A comparison of the most common multispectral sensors specific to UAV systems is shown in Table 2.

Table 2 Multispectral sensors and their main specifications

With multispectral sensors, the range of vegetation indices that can be monitored is considerably extended compared to those that can be calculated with only three RGB bands, and the workflow has minor variations. For these sensors, radiometric calibration and atmospheric correction are strictly required and must be planned in phase 1. Many multispectral sensors, such as the Micasense RedEdge series or the Parrot Sequoia+, have downwelling irradiance sensors and a calibrated reflectance panel to address some of the requirements for radiometric calibration [87]. Because these sensors have a lower resolution than RGB ones, a lower flight altitude and an adequate horizontal and vertical overlap of the recorded images must be ensured to obtain an adequate ground resolution for the surveyed objective and to avoid missing data [88]. In phase 2, with a higher number of radiometric bands to record, the data flow is higher, so it is critical to avoid I/O errors, missing data, or mission failures [89]. In phase 3, due to the multi-lens nature of these sensors, the collected data suffer from parallax. As a consequence, the images must be rectified, georeferenced, and stacked to generate a single image with different radiometric levels, and calibrated with the downwelling irradiance data acquired during the flight [90]. After this procedure, it is possible to generate a multispectral orthomosaic and then calculate the desired indices [91]. Multispectral images are also used in machine learning applications [80, 85, 92], taking into account the multi-camera nature of the sensors and the different bands recorded. Thanks to the availability of a higher number of radiometric bands, machine learning algorithms can be extended to the recognition of non-visible traits such as early-stage plant disease, field quality, soil water content, and more [91].

Hyperspectral sensors

Hyperspectral sensors can record hundreds to thousands of narrow radiometric bands, usually in the visible and infrared ranges. In hyperspectral applications, the choice of the number and radiometric range of the bands is critical: each band or combination of bands, being very narrow, can detect a specific field characteristic. Each hyperspectral sensor can detect only a certain number of bands, so the aim of the survey must be very clear in order to choose the right sensor. Although hyperspectral sensors have decreased in price in recent years, they still represent a substantial initial investment, as they are much more expensive than RGB and multispectral sensors. In addition, they are heavier and bulkier than other sensors, often making their use on UAV systems difficult and/or excessively onerous in terms of payload. Some of the most used hyperspectral sensors in UAV applications and their main characteristics are shown in Table 3.

Table 3 Hyperspectral sensors and their main characteristics

For hyperspectral sensors, the workflow for radiometric calibration is more complex than for other sensors. Some of the calibration methods required are derived from manned-aircraft hyperspectral platforms and are based on artificial targets to assess data quality, correct radiance, and generate a high-quality reflectance data-cube [93]. In phase 1, planning must be carried out in time as well as in space because, in addition to their spectrometric resolution, hyperspectral sensors have a temporal resolution due to their different acquisition method. In phase 2, it should be considered that both image size and data flow are larger than for multispectral/RGB images. Moreover, these sensors may acquire a large amount of data, but the payload limitations of UAVs may not allow the transport of adequate file storage systems. Phase 3 is critical for hyperspectral images: quality assessment is one of the main issues of hyperspectral data, and some problems associated with image quality have not been completely overcome. Among these, the instability of the sensor (due to the nature of UAV platforms) and the vibrations involved can compromise a good calibration of the sensor. Subsequently, on post-processed data, it is possible to calculate narrowband indices such as the chlorophyll absorption ratio index (CARI), greenness index (GI), greenness vegetation index (GVI), modified chlorophyll absorption ratio index (MCARI), modified normalized difference vegetation index (MNDVI), simple ratio (SR), transformed chlorophyll absorption ratio index (TCARI), triangular vegetation index (TVI), modified vegetation stress ratio (MVSR), modified soil-adjusted vegetation index (MSAVI), and photochemical reflectance index (PRI) [94].
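Narrowband indices like those listed above are computed by selecting, from the calibrated data-cube, the bands whose center wavelengths lie closest to the index definition. As a minimal sketch for the PRI, defined as (R531 − R570)/(R531 + R570), with toy band centers and reflectance values chosen for illustration:

```python
import numpy as np

def nearest_band(wavelengths, target_nm):
    """Index of the cube band whose center wavelength is closest to target_nm."""
    return int(np.argmin(np.abs(np.asarray(wavelengths) - target_nm)))

def pri(cube, wavelengths):
    """Photochemical Reflectance Index: (R531 - R570) / (R531 + R570).

    `cube` is a reflectance data-cube of shape (H, W, bands); the band
    centers in `wavelengths` (nm) are assumed inputs for this sketch.
    """
    b531 = cube[..., nearest_band(wavelengths, 531)]
    b570 = cube[..., nearest_band(wavelengths, 570)]
    denom = b531 + b570
    denom[denom == 0] = 1.0  # guard against zero-reflectance pixels
    return (b531 - b570) / denom

# Toy 1x1 data-cube with four narrow bands centered near 500-580 nm
wl = [500.0, 530.5, 569.8, 580.0]
cube = np.array([[[0.04, 0.03, 0.05, 0.06]]])
value = pri(cube, wl)
```

The same band-lookup pattern generalizes to the other narrowband indices by substituting the wavelengths each index requires.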

Applications of UAVs to weed management

UAVs are ideal for identifying weed patches. Their main advantages compared to UTVs are the shorter monitoring/surveying time they require and optimal control in the presence of obstacles, which is critical when working between crop rows [95]. In a few minutes, UAVs can cover many hectares flying over the field, thus providing the photographic material for weed patch identification [61]. These images are processed via deep neural networks [78], convolutional neural networks, and object-based image analysis [96, 97]. Based on a systematic review of the literature on weed identification by UAVs, mainly three types of cameras are used for weed patch identification: RGB, multispectral, and hyperspectral cameras (Table 4). These cameras are very similar in terms of the information obtained for the purpose of weed identification. Indeed, all three camera types can recognize weed patches with good accuracy, depending on flight altitude, camera resolution, and the UAV used. UAVs have been mainly tested on important crops such as Triticum spp., Hordeum vulgare, Beta vulgaris, and Zea mays [98,99,100,101]. These are among the most cultivated crops worldwide and are highly susceptible to weed competition, especially in early phenological stages. In these crops, it was possible to identify several dicotyledonous weeds, including Amaranthus palmeri, Chenopodium album, and Cirsium arvense [102,103,104], as well as different monocotyledonous weeds such as Phalaris spp., Avena spp., and Lolium spp. [105, 106]. These weed species are widespread globally and can be a serious threat to different crops [107, 108]. Therefore, the combined use of UAVs and image processing technologies may contribute to effectively controlling different weed species interfering with crops, with relevant environmental benefits [28, 109].
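The object-based image analysis mentioned above groups vegetation pixels into spatially connected objects (patches) before classification. As a highly simplified stand-in for OBIA segmentation (the mask and minimum patch size are illustrative assumptions), connected-component grouping over a thresholded vegetation mask can be sketched as:

```python
import numpy as np
from collections import deque

def label_patches(mask, min_pixels=2):
    """Group True pixels into 4-connected patches.

    Returns a list of pixel-coordinate lists, one per patch with at
    least `min_pixels` pixels; smaller blobs are discarded as noise.
    """
    h, w = mask.shape
    seen = np.zeros_like(mask, dtype=bool)
    patches = []
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                queue, patch = deque([(i, j)]), []
                seen[i, j] = True
                while queue:  # breadth-first flood fill over the mask
                    y, x = queue.popleft()
                    patch.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                if len(patch) >= min_pixels:
                    patches.append(patch)
    return patches

# Hypothetical vegetation mask (e.g. from a thresholded index):
# one 3-pixel patch in the top-left corner plus one isolated pixel
veg = np.array([[1, 1, 0, 0],
                [1, 0, 0, 0],
                [0, 0, 0, 1]], dtype=bool)
found = label_patches(veg, min_pixels=2)
```

In a production pipeline the patches would then be passed to a trained classifier to decide which objects are weeds versus crop rows.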

Table 4 Weed patches identification by different types of camera (multispectral, RGB, hyperspectral)

Conclusion

The use of UAVs and machine learning techniques allows for the accurate identification of weed patches in a cultivated field and can improve the sustainability of weed management [97]. Weed patch identification by UAVs can facilitate integrated weed management (IWM), reducing both the selection pressure toward herbicide-resistant weeds and the diffusion of herbicides in the environment [64]. Recent research has shown that new technologies are able to discern single weed species in open fields [63, 106, 126]. If integrated with weed management planning, this information gathered via remote imaging analysis can contribute to sustainably improving weed management. In addition, imaging analysis can help in the study of weed dynamics in the field, as well as their interaction with the crop, both of which represent a necessary step toward defining new strategies for weed management based on interspecific crop–weed interactions [127,128,129]. Recent studies demonstrate that some weed communities are actually not detrimental to crop yield and quality [127, 128]. In winter wheat cultivation, a highly diversified weed community caused lower yield losses than a less diversified one [129]. In soybean, through a combination of field experiments in which weed species were manipulated in composition and abundance, it has been shown that increasing levels of weed competition resulted in an increase in seed protein content without impairing yield [130].

Most likely, the integration of known and emerging technologies in this field will greatly improve the sustainability of weed control, following the SSWM approach. Through image analysis, different machine learning techniques will be able to provide a reliable overview of the level and type of infestation. Specific algorithms can be trained to manage weed removal by Autonomous Weeding Robots (AWRs), via herbicide spray or mechanical means [131]. The creation of a specific weed image dataset is also crucial to achieving this goal. This approach must necessarily rely on a dataset of photographs taken in dedicated experimental fields, labeled in the extended COCO/POCO (Common Objects in COntext/Plant Objects in COntext) format [86] and integrated with images from the PlantVillage dataset [83] or other existing ones.
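COCO-style labeling stores images, per-object bounding boxes, and category names as linked JSON records. As a minimal sketch (the file name, IDs, box coordinates, and species assignments below are invented for illustration, not taken from any real weed dataset), parsing such annotations into per-species detections looks like:

```python
import json

# A minimal COCO-style annotation fragment; all values are illustrative.
coco_json = """
{
  "images": [{"id": 1, "file_name": "field_001.jpg", "width": 640, "height": 480}],
  "annotations": [
    {"id": 10, "image_id": 1, "category_id": 2, "bbox": [120, 80, 40, 30]},
    {"id": 11, "image_id": 1, "category_id": 3, "bbox": [300, 200, 25, 60]}
  ],
  "categories": [
    {"id": 2, "name": "Chenopodium album"},
    {"id": 3, "name": "Cirsium arvense"}
  ]
}
"""

def boxes_per_species(coco):
    """Map each category name to its bounding boxes ([x, y, width, height])."""
    names = {c["id"]: c["name"] for c in coco["categories"]}
    out = {}
    for ann in coco["annotations"]:
        out.setdefault(names[ann["category_id"]], []).append(ann["bbox"])
    return out

dataset = json.loads(coco_json)
species_boxes = boxes_per_species(dataset)
```

Keeping new field-acquired labels in this same structure is what allows them to be merged with existing COCO-format datasets for training.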

New insights into weed population dynamics and their competition with crops are needed in order to extend this approach to real agricultural contexts, so as to specifically recognize and eliminate only harmful weed species. The overall objective is to overcome the consequences of the biological vacuum around the crop, which has been shown to strongly affect both the biotic and abiotic components of the environment [132, 133], with long-term consequences for human safety on Earth.