Abstract
In recent years, agriculture has become a major field of application and transfer for AI. This paper gives an overview of the topic, focusing on agricultural processes and technology in Central-European-style arable farming. AI could also be part of the transformation process of agriculture that is emerging worldwide in response to the UN global sustainable development goals (SDGs). In that sense, our overview includes pointers to some research towards AI in future agricultural systems.
1 Introduction: AI Fits Ag Tech
For millennia, agriculture has been a critical field of technology leading to abundant innovations and applications. Its criticality is currently emphasised by the dramatic increase in world population and rapid climate change: world agriculture is an essential factor in achieving the UN Sustainable Development Goals. Artificial Intelligence (AI), on the other hand, is an obvious candidate to contribute to modern agricultural technology (Ag Tech): the OECD definition of AI systems [136], as recently adopted by the EU for the AI Act, is a perfect fit within the context of machinery or decision-support for agricultural processes under their large variety of independent environment dynamics, incomplete control and limited observability.
Consequently, according to [174], over 3900 articles on AI in agriculture, indexed in SCOPUS, were published in 2022 alone. This extends a trend of continuously growing research interest in the field, with distinctly more momentum in the past ten years. The history of publications is now long enough to have changed focus topics several times, highlighting how AI has already become an integral part of agricultural research, and this tendency is likely to increase: applications of decision-support systems, learning approaches to sensor data interpretation problems and robotics are of considerable interest to researchers worldwide, resulting in an impressive body of work [44, 174]. In addition, AI-based functionalities are included in operator assistance or process control components of many modern agricultural machines.
To provide an introductory overview of this work, this article aims to present some of the recent developments of AI in Ag Tech. To keep this overview manageable in size, we restrict ourselves mostly to arable farming and its machinery, treating the large application field of animal husbandry and many other aspects of modern agriculture only very briefly, if at all.
As is typical in real-world application domains, there is no strict one-to-one mapping of AI sub-fields or methods to domain machinery or processes. This means the intersection between AI and agriculture is not as simple as “for ploughing, use AI topic (A), for seeding (B), etc.”; rather, all agricultural processes typically encompass the whole spectrum of AI technology. In this paper we address agricultural processes first (Sec. 2) and then switch to the perspective of AI technologies (Sec. 3). We conclude with a view on the role of AI in future agricultural innovations.
2 Application Domains
The domain dynamics and inherent partial information in agricultural environments lead to numerous processes that are – in principle – perfectly well suited for the application of AI technologies. As mentioned, the primary focus in this paper lies on arable farming, but we also include some greenhouse applications as well as a short glimpse into animal husbandry, since this topic deserves some attention in a survey about AI technologies in agriculture. The structure follows general farming processes as presented in agricultural textbooks such as [39, 111]; the overview of on-field applications serves both as a general introduction to the basics of agriculture and as a basis for our position that AI is not just some interesting optimisation tool. Agricultural applications of AI, we argue, may be crucial to creating a sustainable and environmentally friendly technological landscape that can contribute to feeding a growing world population.
As a reminder to our AI audience we’d like to stress that, in our experience, successfully transferring AI or any other technology to an application domain demands substantial domain-specific knowledge and expertise. We cannot and will not provide an overview of the entire domain of agriculture, and will instead sketch some basic processes, structures, and conditions (mostly in arable farming) through the lens of AI research, in order to establish a suitable background for a discussion about agricultural applications of AI. Readers knowledgeable about agriculture may safely skip this section.
2.1 Soil Tillage and Seeding
Soil tillage is fundamental work in agriculture for preparing the soil to produce strong and healthy crops. This process usually takes place before seeding and can be performed in various mechanical working steps, such as ploughing for turning the soil or using a harrow to develop a crumbling surface [98]. In combination with crop protection, soil tillage ensures the suppression of weeds, pests and diseases through the mechanical treatment of harvest residues or weeds.
An important distinction needs to be made between conventional and conservation soil tillage, where the latter foregoes the use of a plough [102]. Ploughing has been a basic and regular tool of intensive conventional farming, but in recent decades, conservation soil tillage with its non-turning soil cultivation (e.g., by a cultivator) has gained importance all over the world [19, 34, 124]. While the use of a plough results in a clean, residue-free soil surface, its negative effects cannot be neglected; these include damage to the soil structure and fauna (especially earthworms), plough-induced soil compaction and an increased potential for erosion. For this reason, conservation soil tillage is a suitable alternative for a stable soil structure, better soil fertility and less erosion [101].
The goal of conservation soil tillage is mulch sowing: shallow soil cultivation establishes a cover of plant residues from previous crops on the soil surface, into which the seed is then placed. Under specific arid conditions with low disease and weed pressure, no-tillage (a direct sowing process without soil cultivation) can be performed, but it needs to be investigated further in more humid regions [101]. A combination of no-tillage and mulch sowing can be achieved by so-called strip-till, where only a narrow soil strip is cultivated for seeding, with the goal of combining the benefits of both [183].
In general, conservation soil tillage needs to be linked to preconditions such as dry soil or specific crop rotation due to higher weed and disease pressure. Unlike the conventional approach, conservation soil tillage is typically associated with a higher application of pesticides as well as the necessary usage of total herbicides (such as glyphosate), which are effective on all kinds of plants, but also one of the main points of criticism [101, 102].
This is also the point where AI can start to play a significant role: in recent times there has been a noticeable increase in spot-spraying applications and intelligent mechanical solutions, which have led to very promising approaches that may reduce the usage of pesticides by a substantial amount [58]. This in turn may be the key to more frequent deployments of conservation soil tillage or even no-tillage, especially with regard to intensive agriculture.
2.2 Weed Detection and Control
Weeds have always been a concern of farmers around the world, competing with crops for valuable resources such as nutrients and sunlight while potentially reducing yields by significant amounts. Until the 19th century, before the development of chemical weeding approaches, weed regulation was a purely manual task. From approximately 1950, chemical weeding was a ubiquitous technique in arable farming until the 1980s, when environmental concerns gained traction [190]. Modern agriculture generally follows the principles of integrated pest management, which means that non-chemical procedures such as crop cultivation methods or biological approaches are preferred over chemical ones, minimising the negative effects on the environment. This concept has been accepted and incorporated into public policies and regulations in the European Union, although most cropping systems still depend on the heavy use of pesticides [178].
The development of technical solutions, based on AI, has become a promising approach to face recent regulations such as the framework of the European Green Deal. These regulations aim to reduce the application of pesticides by 50% by the year 2030 [48], and include an ongoing discussion about a near-term prohibition of glyphosate within the EU. The resulting AI-based solutions can be a factor contributing to healthier ecosystems and preventing chemical herbicide resistance in weeds [31].
Viable commercial solutions with AI already exist (see Sec. 3.4), not to mention numerous scientific developments such as the smart sprayer attachment by [140], a mechanical approach [25] or the Asterix weeding robot [185], which uses computer vision for weed detection in conjunction with a spray controller while claiming a reduction of herbicide use by up to 90%.
In general, we maintain that the field of weed detection and control has immense potential to connect environmental standards with high yields by using AI.
2.3 Biotic and Abiotic Stress Monitoring
Beyond weeds, biotic factors such as insects, viruses and fungi as well as abiotic factors, including drought, chemical and physical soil conditions, may all have a sizeable impact on the productivity of crops and thus their yields [59, 125]. Similar to weeding (Sec. 2.2), but starting in the 1930s, control of biotic stress has been carried out through the widespread and indiscriminate use of crop protection products (such as DDT), leading to environmental problems and eventually to more careful and integrated crop management approaches [105]. Future ecopolitical regulations and challenges such as population growth, however, increase the pressure to improve yields and enhance environmental quality, thus forcing the application of all available advanced technologies to face these issues successfully [164].
A modern and efficient approach to biotic and abiotic stress monitoring consists of satellite and aerial remote sensing. Since 2015, satellites (such as Sentinel-2A/2B) have provided high-quality data for precision agriculture [164] and the deployment of drones has allowed the monitoring of crops even on a plant-level basis [22]. Drones offer higher resolution data and more readily available crop information [62], but both methods can be used to generate different vegetation indices and detect biotic and abiotic stress symptoms [90, 110, 179, 204]. Data processing and detection can be performed with good accuracy using AI and ML methods [72].
The conventional method for monitoring, e.g., plant pathogens is to hire disease specialists who scout a field to identify the presence of diseases, but this approach is nowadays unappealing, given the amounts of labour and time required, especially in an era of decreasing workforce in agriculture [199]. An automated monitoring system would increase the spraying efficiency by conducting early and local detection of potential biotic stress, helping to reduce the application of pesticides [41].
In sum, the integration of developments in AI and advances in remote sensing technologies may have the potential to establish an economically viable, and environment-friendly, crop-stress forecasting system.
2.4 Yield Prediction and Estimation
Precise yield prediction is a crucial instrument for farmers to reduce costs and enhance harvest quality. The conventional approach is based on parameters such as historical data, weather conditions and time-consuming, inaccurate manual work such as gathering fruit samples [18, 127]. AI-based systems can automate these tasks by predicting or directly estimating the yield, saving time and providing better results [18].
Furthermore, crop yield prediction is not only an essential task at the farm management level, but (as a study shows) also for decision-makers at the national and regional scale [186]. This study also mentions that some of the most commonly used parameters for AI-based crop yield estimation are temperature, soil type, and rainfall, all of which carry challenges of their own including insufficient data availability and a lack of variety (e.g., different climatic conditions, different vegetation) when developing accurate models. These circumstances further complicate the integration of additional data sources and the final deployment in real farm management systems.
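To make the regression-based modelling idea concrete, the sketch below fits a least-squares line from one of the commonly used parameters (rainfall) to yield. All numbers are synthetic and purely illustrative; real models combine many more variables (temperature, soil type, remote-sensing indices) and far larger datasets.

```python
# Minimal sketch of a regression-based yield model: ordinary least squares
# on a single predictor (seasonal rainfall, mm) against yield (t/ha).
# The data points are synthetic and illustrative only.

def fit_ols(xs, ys):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

rainfall = [320, 410, 380, 500, 450, 290]   # mm per season (synthetic)
yield_t = [5.1, 6.0, 5.7, 6.9, 6.4, 4.8]    # t/ha (synthetic)

a, b = fit_ols(rainfall, yield_t)

def predict(mm):
    """Predicted yield (t/ha) for a given seasonal rainfall (mm)."""
    return a * mm + b
```

The challenges mentioned above (insufficient data, lack of climatic variety) show up immediately in such a model: a line fitted to six seasons from one region extrapolates poorly to other climates, which is why published systems integrate many heterogeneous data sources.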
Recent examples of technological advances include a direct crop yield estimation, using computer vision, with promising results for fruits such as apples [187], citrus [42], and grapes [127].
An innovative approach uses satellite data in conjunction with parameters such as soil properties, weather information and data acquired by the combine harvester to develop predictive models for selected arable crops in Europe and the Americas. A successful end-to-end model has already been designed which can predict yields up to 120 days before harvest [36, 68].
In summary, reliable forecasts create an incentive for farmers to optimize parameters such as irrigation and fertilisation, opening up opportunities to cultivate crops in a more efficient and sustainable manner [68].
2.5 Harvesting
Harvesting can be subdivided into two main areas of interest: crop monitoring and the actual process of harvesting.
Crop monitoring helps determine the best time to harvest a crop, which is usually the time when the expected yield is optimal. High-value crops, however, may not ripen uniformly and thus not be ready for harvest all at the same time. This makes harvesting a continual task of manual, repetitive work that, driven by prices and labour shortages, is a suitable application field for automation.
There is substantial work regarding the detection of single-crop fruits, their ripeness, and health. Promising results can be found for cherries [55], grape bunches [175], palm oil fruits [86], bananas [53], chili peppers [69] and strawberries [75]. Challenges include varying degrees of ripeness translating to variations in form, colour, and size, as well as occlusions caused by the environment or by the plants themselves.
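As a very simple illustration of the colour cue underlying many of these ripeness detectors, the sketch below classifies a fruit region by its share of red pixels, in the spirit of a strawberry-like crop. The hue window and the 60% threshold are invented assumptions; published systems use learned models precisely because such fixed thresholds break down under the variations in form, colour, and lighting mentioned above.

```python
import colorsys

# Toy colour-based ripeness heuristic for a strawberry-like crop:
# classify a fruit region as "ripe" when the fraction of red pixels
# exceeds a threshold. The hue window and threshold are illustrative
# assumptions, not values from any cited system.

def is_red(rgb):
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    # red hues wrap around 0; also require saturation and brightness
    return (h < 0.05 or h > 0.95) and s > 0.5 and v > 0.3

def ripeness_score(pixels):
    """Fraction of pixels classified as red."""
    return sum(1 for p in pixels if is_red(p)) / len(pixels)

def classify(pixels, threshold=0.6):
    return "ripe" if ripeness_score(pixels) >= threshold else "unripe"

# Mostly red region with some green leaf pixels vs. an unripe green region.
ripe_region = [(200, 20, 30)] * 8 + [(40, 160, 40)] * 2
unripe_region = [(60, 180, 60)] * 10
```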
For the physical process of harvesting, fewer and less readily integrated solutions can be found. Building upon the correct detection of fruits and their ripeness, the development and deployment of well-performing (soft) end-effectors adds to the typical challenges of building functional robots for agriculture.
Prototypic and partial solutions have been described recently for tomatoes [84], pumpkins [155], sweet peppers [8], strawberries [201] and others. In their review on soft grippers for automated harvesting [131], the authors come to the conclusion that there is still much work to do.
The tasks collectively referred to as harvesting require a wide variety of sensors, effectors, processes, and sensorimotor control approaches, caused by the wide variety of fruits that can be harvested. In all fairness, the same applies to human harvesters: strawberries, pumpkins, asparagus and maize, for example, all require different actions and different approaches. This variety is, of course, mirrored by the technology and the large number of papers reporting on its development. We will revisit this feature in Sec. 3.4, where we discuss effectors in robotics.
2.6 Organic Farming: A Special System
Organic farming plays a special role in agriculture and can be seen as an alternative to conventional, intensive farming. Established in Europe during the 1930s and 1940s, organic foods have grown immensely in popularity and given rise to hundreds of certification bodies [151]. In 2021, the organic area share of total farmland was about 9.63% in the EU [154].
Although accompanied by a decrease in yields [92], the increasing public interest in organic farming systems is possibly a response to the perceived downsides of conventional farming. Despite its greatly increased crop production, labour efficiency, and thus productivity, conventional farming is also characterised by the use of external inputs like mineral fertilisers and chemical pesticides, resulting in a higher environmental pressure [129, 152]. Such external inputs are widely prohibited in organic farming systems, so the challenge is to rely solely on the principles of crop rotation, mechanical cultivation, manuring, and biological pest control (also applied in conventional agriculture). There is a strong focus on maximising the availability of nitrogen, the primary nutrient for plant growth, in some cases through strict closed-cycle cropping systems that go beyond organic certification guidelines [151]. It is worth noting that, on average, organic agriculture results in higher soil organic matter content compared to conventional farming [106, 129].
Since organic farming systems are characterised by limited external inputs and heavily depend on the principles mentioned above, we can observe a decrease in yields, a more frequent application of conventional soil tillage to lower the weed pressure [141] and the use of questionable biological preparations such as copper to fight diseases [51]. These are all domains that may be strongly impacted by AI technologies.
Overall, AI may become a crucial technology to help overcome the weaknesses of organic farming and improve its yields. We understand that no single organic approach is guaranteed to feed the planet [151], and in this sense AI may serve as a bridge technology from conventional agriculture to the comparatively smaller organic market. Furthermore, ecopolitical regulations are one of the main reasons driving the need for AI innovation, as is the case with the “Farm-to-Fork Strategy” within the EU Green Deal, which aims to reduce chemical pesticide applications by 50% and expand organic agriculture to a 25% area share of total farmland by 2030 [48].
2.7 Livestock Farming
Farm animals play an essential role in the biomass cycle, in which far more non-edible biomass is produced than biomass for human consumption. Sustainable food production cannot succeed without integrating animal production into the agricultural material cycle [194]. Animal husbandry is a particularly important factor in the transformation process of the agri-food system towards greater sustainability, and the use of AI in livestock farming supports this transformation [54].
The scope of application of AI includes animal identification, behaviour monitoring (feeding behaviour, aggression behaviour), health monitoring (disease detection, activity recognition), performance prediction and production technology in different livestock species (e.g., cattle, pigs, poultry) [14]. This meta-study showcases a wide spectrum of AI methods, including a variety of machine learning and deep learning approaches, e.g., in computer vision.
In pig farming, AI can be used to detect upcoming births of piglets with high reliability, and to monitor the course of birth, resulting in a significant reduction in the number of piglets crushed by the sow. In the later fattening phase, AI methods can be used to detect and analyse the causes of tail biting. Individual animal identification is particularly important in order to reliably single out culprit animals. AI is also used to predict growth trends more accurately and to select animals by weight before slaughter [188]. Even if the use of AI in cattle and poultry farming has a different focus due to differences between the species, the central topics mentioned above are identical.
The AI application area in livestock farming is largely interdisciplinary. Recent approaches involve not only computer science, veterinary medicine, agricultural science, environmental sciences, mechanics and electronics, but also take ethical considerations into account [132].
In summary, AI has the potential to enable far-reaching improvements (and spread awareness) in animal welfare and holistic monitoring, and can help obtain better efficiency indicators.
Finally, the research field of explainable AI can become increasingly important in the context of livestock production. Farmers have a high level of responsibility in dealing with animals, and the degree of clarity and comprehensibility in the decision-making component of AI approaches is not only important for increasing acceptance [74], but it also allows humans to recognise and honour their responsibility towards farm animals.
3 AI Technologies for Agriculture
As previously mentioned, there are no recipes for how to solve challenging problems with specific AI techniques. We organise this section around several complementary (and often overlapping) high-level topics in AI that illustrate different foci of interest and levels of maturity. Such technologies are often used together to solve real-life problems, as exemplified in the last subsection on agricultural robotics.
3.1 Vision and Perception
The availability of high-resolution spatial and temporal sensor data is spreading rapidly as new sensors reach implements and mobile platforms such as autonomous field robots and UAVs. Interpreting such data is a core driver for innovation in agricultural processes.
Rising costs and a severe shortage of skilled workers are among the factors steering agricultural innovation towards automation, which in turn creates a demand for high-quality sensor data.
This section contains an overview of vision and perception technologies. Different techniques for environment interpretation are presented and followed by requirements for the development of such techniques.
3.1.1 Vision Tasks
As pointed out in Sect. 2, many agricultural applications require a reliable model of the environment and suitable methods to interpret environment data. In autonomous agricultural machines or assistive systems, collecting information about the environment is of critical importance.
Localisation and navigation are among the core tasks that help guide a machine on a field (cf. Sect. 3.4). Perception systems are used to support such tasks by detecting, for example, crops and rows [4, 11, 45, 78]. This could occur before harvesting or weeding processes and can rely on top-view images from drones [11, 142]. Real-time detection systems have been developed for decades, based on, e.g., the Hough transform [9], the random sample consensus (RANSAC) algorithm [195] or masks based on vegetation indices [4]. Furthermore, crop poses on semantic maps enable localisation with visual features [27]. Obstacle detection is essential for safety reasons, and agriculture is no exception. In contrast to autonomous driving on roads, classification of the obstacle is mandatory to distinguish between plants, which are to be harvested and therefore touched, and objects that might cause a harmful collision [29].
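To illustrate the RANSAC idea behind robust crop-row detection, the sketch below fits a line to mostly collinear points (plant centres along a row) in the presence of outliers (e.g., weed detections). The point data, tolerance, and iteration count are invented; real systems operate on segmented plant centres extracted from camera images.

```python
import random

# Sketch of RANSAC line fitting as used for crop-row detection: sample two
# points, hypothesise a line, count points within a residual tolerance, and
# keep the best-supported hypothesis. Illustrative values only.

def ransac_line(points, iters=200, tol=0.5, seed=0):
    """Return ((slope, intercept), inliers) of the best-supported line."""
    rng = random.Random(seed)
    best, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # skip degenerate (vertical) hypotheses
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [(x, y) for x, y in points if abs(y - (a * x + b)) <= tol]
        if len(inliers) > len(best_inliers):
            best, best_inliers = (a, b), inliers
    return best, best_inliers

# Plant centres along the row y = 2.0, plus scattered weed outliers.
row = [(float(x), 2.0) for x in range(10)]
weeds = [(1.5, 7.0), (4.2, -3.0), (7.7, 5.5)]
line, inliers = ransac_line(row + weeds)
```

The same scheme extends naturally to multiple rows (fit, remove inliers, repeat) and to other primitives such as the sinusoidal curves of the Hough transform.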
Pose estimation is another important task, necessary when objects need to be manipulated (e.g., by grippers, pickers or other special purpose end effectors) as is the case in applications such as weeding [112, 113, 118, 158, 198] as well as harvesting and fruit picking [7, 10, 117, 182].
While classical object detection and pose estimation algorithms often provide only axis-aligned bounding boxes, weeding requires tight boundaries between crops and weeds. Instance segmentation [113, 122, 158] addresses this problem by providing individual polygons on single plants or plant groups. RGB and NIR data are combined in the normalised difference vegetation index (NDVI) [17, 65, 158], and crops and weeds can be distinguished by local binary patterns (LBP) [17, 117] or covariance features [17].
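The NDVI mentioned above has a simple per-pixel definition, NDVI = (NIR − Red) / (NIR + Red), sketched below for two tiny co-registered reflectance images. The reflectance values are invented for illustration; values near +1 indicate dense green vegetation, while soil and residue sit near 0 or below.

```python
# Per-pixel normalised difference vegetation index (NDVI) from
# co-registered red and near-infrared reflectance images, here represented
# as nested lists. A small epsilon guards against division by zero.

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    return [
        [(n - r) / (n + r + eps) for n, r in zip(nrow, rrow)]
        for nrow, rrow in zip(nir, red)
    ]

# 2x2 toy example: top row vegetation, bottom row bare soil (values invented).
nir = [[0.80, 0.70],
       [0.30, 0.25]]
red = [[0.10, 0.15],
       [0.25, 0.30]]
index = ndvi(nir, red)
```

Thresholding such an index yields the vegetation masks referred to earlier, which can then feed row detection or crop/weed classification.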
As pointed out in Sect. 2.5, handling single fruits of high-value crops, like sweet pepper or strawberries, is important since their price directly correlates with their quality [7, 10, 182]. Pose estimation is fundamental for automated fruit harvesting [56, 139], so object detectors like SSD [137, 145] or instance segmentation algorithms like Mask R-CNN [56, 139] are often used together with RGB-D data. Harvesting selection is based on different quality parameters [35, 115], such as firmness and ripeness. This can be implemented through vision systems, although clutter and occlusions pose major challenges [139]. In [115], quality measures are derived from near-infrared (NIR) bands. Firmness, for example, can be determined from hyperspectral data with regression models [35].
Other applications of sensor data processing include plant monitoring, for instance through a perception system that can track the growth state of a plant [99] or detect pests and diseases. This allows early intervention and is often approached using convolutional neural networks (CNNs) [169, 180, 199]. Region proposal networks like Faster R-CNN-based architectures [180] and augmentation learning [128] can be used to detect citrus diseases like black spot.
Although sensor data may be collected and interpreted in various ways and for different reasons, we can identify a common core of four main techniques:

- Object detection, classification, and tracking

- Instance segmentation

- Feature extraction

- Anomaly detection
3.1.2 Datasets
In recent years, many such models and algorithms have been released and have continued to improve in performance [205]. Perception algorithms are mostly trained for specific tasks and domains, and the underlying datasets constitute a determining factor in the resulting applicability and performance of algorithms. Publicly available datasets exist, which contain plenty of common objects in everyday scenes with some focus on urban or driving scenarios [49, 57, 108, 157]. In general, the development and evaluation of AI-based perception systems requires large amounts of data, together with a suitable ground truth. Preparing these datasets is a core challenge when bringing detection systems to the market. Especially in the agricultural sector, the prevailing variety of scenarios and environmental conditions must be taken into account.
In addition to common object datasets, off-road and agricultural scenarios have also been made available. Marulan [143], RELLIS-3D [79], NREC [144], Robot Unstructured Ground Driving (RUGD) [193] and FieldSAFE [94] in particular can be used to train and evaluate environment perception algorithms that deal with obstacle detection. For 3D mapping and localisation, the Bacchus Long Term Dataset [146] contains RTK-GPS and IMU data, as well as RGB-D camera and 3D LIDAR data, recorded in vineyards over an entire growing season. In agriculture, environmental conditions are often harsh and changing, due to weather and vegetation. Available datasets usually cover these conditions only partly, rather than comprehensively across the whole operational design domain. For this reason, recording additional data is essential to develop and validate application systems suitable for commercial use. This can be done via sensors mounted on vehicles like tractors and trucks [93, 153, 196].
While data recording with vehicles offers flexibility, rail-guided test stands like AgroSafety [120] and AI-TEST-FIELD [95] are designed to achieve higher comparability between test runs under many different environmental conditions. For the comparability of different test runs, dummies are usually employed as test specimens to represent real obstacles. In order to standardise the evaluation, ISO 18497 [1] describes a barrel-like obstacle, which roughly compares to a seated person. A more realistic replicate of a human is proposed in ISO 19206-2 [2]. The reflectivity of test targets should represent the worst-case scenario, so [121] proposed a modified pedestrian test target based on ISO 19206-2 [2] with new cotton material.
In addition to the general perception of the surroundings, process-related perception algorithms are essential in agricultural domains and need to be trained accordingly. To that end, published datasets are available, including the Swedish Leaf Dataset [173], which contains images of single leaves against a white background and can be used to classify different Swedish tree species. Other datasets like Flavia [197], Leafsnap [96] and PlantVillage [126] contain images of plants under laboratory conditions. The Crop/Weed Field Image Dataset (CWFID) [64] contains 60 RGB and hyperspectral images, recorded by the autonomous field robot Bonirob in an organic carrot farm, that can be used for single plant phenotyping. The Vegetable Crops Dataset for Proximal Sensing (VCD) [100] contains RGB images of vegetable crops at an early stage of growth. The PlantDoc dataset [171] contains around 2500 images of plant species with up to 17 classes of diseases. A crowd-sourced dataset with more than 50,000 images also exists, created on the PlantVillage platform [73]. Additionally, 375 pixel-wise labelled sugar beet and weed images (RGB and NIR) are published in the weedNet dataset [158].
Since there might not be free or public datasets available for every application, data recording becomes necessary in many cases but data acquisition comes with challenges of its own. Existing work includes that of Sankaran et al. [162], who provide a survey of sensor systems for phenotyping. Recording can be conducted on different devices such as UAVs [168], RC-Copters [26], airships [107], robots [184], self-propelling chairs [6, 33, 80], handcarts [12, 77, 192], spidercams [13, 89] or specific test stands [85].
In general, collecting data is a challenge and calls for special care. Data acquisition and labelling are costly and time-consuming tasks, especially when pixel-wise annotation is required, as is the case in semantic instance segmentation. This can become very challenging very quickly if the labelling process is manual, especially when dealing with occluded and young plants. Hence, synthetic training data has become more prominent, since images can be generated from models and the ground truth is known in advance for every pixel [16, 38, 77].
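The appeal of synthetic data can be shown in a toy sketch: when a scene is rendered from a model, the pixel-wise ground-truth mask comes for free. Below, a "plant" is just a green disc on a soil-coloured background; the colours, sizes, and scene layout are invented, and real pipelines render far more realistic imagery, but the labelling principle is the same.

```python
import random

# Toy synthetic-data generator: render green "plant" discs on a brown
# background and record the pixel-wise segmentation mask at the same time,
# so no manual annotation is needed. All values are illustrative.

def render_scene(w, h, n_plants, radius=3, seed=0):
    rng = random.Random(seed)
    image = [[(101, 67, 33)] * w for _ in range(h)]   # soil background (RGB)
    mask = [[0] * w for _ in range(h)]                # 1 = plant, 0 = soil
    for _ in range(n_plants):
        cx, cy = rng.randrange(w), rng.randrange(h)
        for y in range(h):
            for x in range(w):
                if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                    image[y] = image[y][:x] + [(30, 140, 40)] + image[y][x + 1:]
                    mask[y][x] = 1                    # label known by construction
    return image, mask

image, mask = render_scene(32, 32, n_plants=4)
```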
One last issue in dataset creation and usage, affecting all topics in computer vision, is bias in the data [50]; agriculture is of course no exception [135]. Seasonal vegetation and the irregular occurrence of diseases, pests, and weeds exacerbate the challenge.
3.2 Knowledge Representation and Reasoning
Knowledge representation addresses an important need of AI agents and AI-based technologies, whether physical or purely software-based: access to a formal (albeit potentially incomplete) model or description of their environment that can support processes such as querying, inference or action selection.
3.2.1 Ontologies and Semantic Web Technologies
Ontologies and knowledge graphs are commonly used representation methods with connections to semantic web technologies which allow these models to be stored and queried in a distributed manner. Examples spanning various sub-domains of agriculture include [21, 24, 66, 83, 104, 156]. A comprehensive overview can be found in a 2019 survey [43], where the authors argue that the increasing amount of sensor data in the agricultural domain has little benefit to the farmer in its raw form, and that its value lies in the insight gained by processing such data with suitable technologies.
Among the more representative semantic technology tools used in agriculture are AGROVOC [23], a thesaurus maintained under the supervision of the Food and Agriculture Organization (FAO), the Crop Ontology [116], the AgroPortal [82] and, on the topic of plant science, AgroLD [103]. It should be noted, however, that most of these examples either refer to very specific subfields or take a very general approach to agriculture, and not much work exists that addresses the practical aspects in between. Some recent work also discusses the connection between ontology modelling and data mining, and provides an example application in crop farming [133].
Whether the input data is obtained automatically via mining and whether the result is an ontology or not, these models ultimately fulfil the need for a repository or “knowledge base”, often used in combination with an inference engine or some other information processing tool to generate useful or meaningful results.
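The knowledge-base-plus-inference pattern can be illustrated with a minimal, hypothetical sketch: facts stored as subject-predicate-object triples (as in RDF) and queried with wildcard patterns, loosely mimicking what a SPARQL query over an agricultural ontology would express. All entity names are invented:

```python
# Toy triple store: facts as (subject, predicate, object) tuples.
triples = {
    ("field_7", "grows", "winter_wheat"),
    ("field_7", "has_soil", "loam"),
    ("winter_wheat", "susceptible_to", "yellow_rust"),
    ("yellow_rust", "treated_with", "fungicide_x"),
}

def query(s=None, p=None, o=None):
    """Return all triples matching the pattern; None matches anything."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Simple two-hop inference: which treatments are relevant for field_7?
crops = [o for _, _, o in query("field_7", "grows")]
risks = [o for c in crops for _, _, o in query(c, "susceptible_to")]
treatments = [o for r in risks for _, _, o in query(r, "treated_with")]
```

The two-hop chain (crop, then its risks, then their treatments) is exactly the kind of non-obvious insight that raw sensor data alone cannot provide; a production system would use a real triple store and a reasoner rather than this in-memory stand-in.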
3.2.2 Decision-Support Systems
A decision-support system (DSS) is a tool that relies on various AI techniques to process multiple types and sources of information, in order to provide useful and non-trivial insight that may inform the decisions made by an external agent. In agriculture, information sources might include descriptions of crops, fields and soil, as well as the weather forecast, and the system may generate suggestions to facilitate, e.g., farm-level management. DSS technologies and expert systems in general have a wide array of applications but, as some studies show, agriculture was not a primary focus at least until the turn of the current century: in a survey spanning the period between 1995 and 2001, the authors found only a couple of agricultural applications [46]. Furthermore, these few examples were traditional computer programs rather than knowledge- or inference-based AI systems. Since then, and perhaps as a result of considerable improvements in the acquisition and storage of large amounts of data, there appears to be some renewed interest, particularly when the inference component of a DSS is combined with automated data analysis and machine learning tools.
DSSAT is a current example that focuses on Ag Tech transfer and is capable of simulating crop growth, development and yield [70, 71]. Other existing systems support a variety of agricultural tasks, ranging from farm operation scheduling and optimisation [150] and irrigation based on fuzzy inference [60] or a fuzzy neural network [130] to impact assessment with respect to sustainability and climate change [163, 191]. A more in-depth discussion of these and other support systems can be found in a recent survey [203]. In this paper, the authors compare the performance and functionality of several agricultural DSSs in the era of remote sensing, large amounts of data, cloud computing and artificial intelligence, which is collectively referred to as "Agriculture 4.0".
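Several of the cited systems base irrigation advice on fuzzy inference [60, 130]. The mechanism can be sketched with a minimal Mamdani-style controller; the membership breakpoints and rule outputs below are invented for illustration and are not taken from those systems:

```python
def falling(x, b, c):
    """Shoulder membership: fully true below b, fading to false at c."""
    if x <= b:
        return 1.0
    if x >= c:
        return 0.0
    return (c - x) / (c - b)

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def irrigation_advice(soil_moisture_pct, forecast_rain_mm):
    """Advise an irrigation amount (mm) via two Mamdani-style rules."""
    dry = falling(soil_moisture_pct, 10, 30)       # 'soil is dry'
    moist = tri(soil_moisture_pct, 20, 40, 60)     # 'soil is moist'
    little_rain = falling(forecast_rain_mm, 0, 8)  # 'little rain ahead'
    rules = [
        (min(dry, little_rain), 25.0),  # dry AND little rain -> irrigate a lot
        (moist, 8.0),                   # moist soil -> irrigate a little
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0   # weighted-average defuzzification
```

The appeal for a DSS is that the rules stay human-readable ("if the soil is dry and little rain is forecast, irrigate a lot") while the fuzzy memberships blend them smoothly between the crisp cases.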
3.2.3 Planning and Optimisation
From a functional perspective, planners and decision-support systems are both AI technologies that can inform human activities, but a DSS is limited to generating suggestions or contextual information, while a planner solves a problem by identifying an optimal sequence of actions meant to be executed. Similarly, optimisation approaches generate optimal value assignments that may be pursued or implemented directly.
An exemplary approach is the unmanned monitoring of crop health utilising a UAV capable of choosing its own paths and targets, in order to perform a close inspection, apply herbicide or collect higher-resolution images [5]. Work also exists on route planning and optimisation that can guide auto-steering and navigation-aiding systems [20]. Another approach uses a combination of aerial and ground vehicles, where information about crops and weeds is gathered from the air and used to plan possible interventions, reducing the application of herbicide [30].
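Route planning for field coverage, as used in auto-steering and navigation-aiding systems [20], often builds on a boustrophedon (back-and-forth) sweep. A minimal grid-based sketch (the grid size and cell model are illustrative; real planners work on geo-referenced field polygons with turning-radius constraints):

```python
def boustrophedon_path(rows, cols):
    """Coverage path over a gridded field: sweep each row, alternating
    direction, so every cell is visited exactly once with few turns."""
    path = []
    for r in range(rows):
        cells = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        path.extend((r, c) for c in cells)
    return path

path = boustrophedon_path(3, 4)
```

Alternating the sweep direction makes consecutive waypoints adjacent, which is what keeps headland turns (the expensive manoeuvre for tractors and UAVs alike) to one per row.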
AI-adjacent topics such as control optimisation may also have an impact in agricultural applications, as exemplified by a drip irrigation system that approaches moisture transport in unsaturated soil as a linear optimisation problem, providing correct amounts of water [91]. Another optimising approach focuses on agricultural resource management and suggests optimal land allocation for diversified crop planning, relying on combinatorial methods that consider socioeconomic and environmental objectives as well as farm- and district-level goals [160].
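The drip-irrigation work in [91] formulates moisture transport as a linear optimisation problem. A faithful reproduction would need a full LP solver, but the flavour of linear allocation under a single resource constraint can be shown with a continuous-knapsack sketch (the benefit coefficients, capacities and budget are invented):

```python
def allocate_water(budget_l, zones):
    """Allocate a water budget across irrigation zones.

    Each zone is (benefit_per_litre, max_litres). With a linear
    objective and a single budget constraint, the optimum is reached
    greedily: fill zones in order of marginal benefit.
    """
    plan = {}
    remaining = budget_l
    for i, (benefit, cap) in sorted(enumerate(zones), key=lambda z: -z[1][0]):
        give = min(cap, remaining)   # fill the most beneficial zone first
        plan[i] = give
        remaining -= give
    return plan

# Three hypothetical zones: (benefit per litre, capacity in litres).
plan = allocate_water(100, [(0.5, 80), (0.9, 60), (0.2, 100)])
```

The greedy fill is provably optimal only because the objective is linear with one budget constraint and per-zone bounds; problems with coupled constraints, like the multi-objective land allocation in [160], require a general solver.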
Finally, a recent paper shows that the application of optimisation methods through multiple AI technologies can lead to more efficient crop protection by applying more suitable products at the right time, in the right amounts and only where necessary, which may also contribute to reaching the United Nations Sustainable Development Goals of responsible consumption and production [166].
Planning and optimisation techniques are commonly used as components of autonomous robotics and agricultural vehicle platforms, discussed in more detail in Sect. 3.4.
3.3 Data Acquisition and Analysis
During nearly all agricultural activities, from tillage to seeding and fertilising as well as weeding and harvesting, large amounts of data may be gathered and stored. Decisions, actions or other generated knowledge, as described in Sect. 3.2, all rely on input from field data and are supported by additional intermediate data processing [159]. Several selected approaches to interpreting raw sensor data have already been discussed in Sect. 3.1. Machine learning techniques play a fundamental role throughout all of these stages of data analysis and information processing.
Many implements are equipped with telemetry units and sensors like opto-electronic devices [138]. This data stock can be enhanced by remote sensing as well as separate IoT sensors (e.g., soil moisture sensors); see [172] for an overview of remote sensing. While satellites provide data on a regular (up to daily) basis, their spatial resolution is limited [123], so data acquisition must be extended with (manual) drone flights. Although these flights are time-consuming and hence performed less frequently, drones can gather data with a significantly higher spatial resolution.
As mentioned in Sect. 3.1, RGB data and the NIR spectrum are relevant for phenotyping and for classification between crop and weed. In addition, they are used to derive biomass, chlorophyll, and nitrogen content, among others. An overview of relevant indices, like the NDVI, and their applications is given in [172].
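The NDVI mentioned above is computed per pixel from the red and near-infrared bands as (NIR − Red)/(NIR + Red). A minimal sketch over toy reflectance values (the band values below are invented):

```python
def ndvi(nir, red, eps=1e-9):
    """Normalised Difference Vegetation Index per pixel:
    (NIR - Red) / (NIR + Red). Values near +1 indicate dense green
    vegetation; bare soil sits near 0. eps guards against division
    by zero on dark pixels."""
    return [[(n - r) / (n + r + eps) for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir, red)]

# Toy 2x2 reflectance bands: top-left is 'vegetation', bottom-left 'soil'.
nir_band = [[0.50, 0.40], [0.10, 0.45]]
red_band = [[0.08, 0.10], [0.09, 0.05]]
index = ndvi(nir_band, red_band)
```

Healthy vegetation reflects strongly in NIR and absorbs red light for photosynthesis, so the normalised difference separates plants from soil largely independently of overall illumination.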
In agricultural research, there are a number of ongoing activities regarding the use of ML methods to predict crop yield [28], but these tasks can be very challenging due to the inherent uncertainty from factors such as the weather. In [203] it is pointed out that meteorological forecasting is relevant to supporting agricultural activities, as its influence may be reflected in drought [87] and groundwater levels [165]. Through the analysis of properties such as moisture [88] and nutrients [28], a basis for fertilisation and irrigation activities can be laid out.
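At its simplest, yield prediction from a single predictor can be illustrated with an ordinary-least-squares fit. The rainfall and yield figures below are invented; real systems combine many features, from weather to soil nutrients, with far richer nonlinear models [28]:

```python
def fit_line(xs, ys):
    """Least-squares fit y = a*x + b, closed form for one predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical training data: seasonal rainfall (mm) vs yield (t/ha).
rain = [300, 400, 500, 600]
yield_t = [3.1, 4.0, 5.2, 5.9]

a, b = fit_line(rain, yield_t)
predicted = a * 450 + b  # predicted yield for a 450 mm season
```

The uncertainty discussed above shows up precisely in the residuals of such a fit: the same rainfall can yield very different harvests depending on unmodelled factors, which is what motivates probabilistic and ensemble approaches.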
3.4 Robotics
Robotics is a distinct technology with very clear applications in agriculture. The research, development, and prototyping of well-integrated machines that can automate agricultural tasks or execute entire processes autonomously go back decades. While this section concentrates on the more recent developments, sources like [18] provide a broader overview.
Applications of robotics technology take various forms: (partly) automated versions of conventional machines such as harvesters or tractors fulfil the same tasks as before, without the need for a driver. Alternatively, integrated mobile robots, rail systems with robotic arms in greenhouses, or drones may either resemble conventional machines in appearance and role or differ vastly in both aspects.
The distinction between such concepts is by no means clear cut and many technologies can be transferred from one to another. It is also important to note that while there is still much ongoing research in the scientific community, a significant portion of the development is driven by industry with many platforms already commercially available.
3.4.1 Integrated Robotic Systems
Field robots are autonomous systems often described as the technological means to meet ambitious goals regarding sustainability and the development of the agricultural sector [181]. Robots are also a particularly good choice for the application of AI as they can, in principle, address many agricultural domains and integrate – to some degree – many if not all of the technologies provided by AI [97]. In the context of the very active field of object detection with machine learning techniques, robotics can be seen as the actuating counterpart to the advances in perception and sensing [161]. The sheer number of papers on robotics applications, however, makes it difficult to provide an exhaustive overview of a field as active as agricultural robotics.
Topics of special importance to mobile robotics include localisation and path planning, sensing, and actuating, which can be found at different levels of technological maturity [200, 202].
In recent years, a rich collection of surveys has been published, hinting at a plenitude of prototypes, both in general [81, 149, 189] and with a focus on monitoring and phenotyping [15, 52, 76, 147], seeding [52], weeding [47, 52, 63], and harvesting [8, 52]. Fundamentally, it can be observed that recent developments in image processing have led to an enormous increase in work related to monitoring and detection use cases.
Actuation, on the other hand, and therefore fully integrated farming robots, accounts for far fewer published articles, as the required mechanical components are still under development. Especially in the fields of weeding [17, 119, 148, 185] and harvesting [7, 167, 170] (see also Sect. 3.4.2 below), there are some recent examples of working prototypes in both science and industry.
It is worth mentioning that the ideal of fully automating food production on farms is not devoid of social and ethical considerations. Farming continues to play a fundamental role in the course of societies worldwide, and continues to involve small stakeholders [40, 176]. The farmers' own perspectives are of great importance as robots become commercially available. Work such as [177] is particularly interesting also from the technological perspective: issues of human-robot collaboration, as well as the choice of size and of a fitting role for robots on farms, will depend on commercial demand.
Additionally, integrating multiple robots in swarms as well as enabling and orchestrating cooperation between robots, and between humans and robots, seems to be a promising trend as suggested by several studies [3, 40, 114].
An additional perspective, albeit not yet as visible in the community, is that of the expected long-term behaviour of autonomous systems in agriculture. Robots in this context have to deal with a dynamic, partially controllable, and partially observable environment. When tasks take long enough to complete that such dynamics become relevant, for instance when tending a large field, or when tasks must be repeated multiple times, as is the case in weed management, dealing with these aspects becomes a complementary but no less important challenge that poses many interesting research questions. In 2017, some aspects of long-term autonomy were examined in an indoor setting [67], while the authors of [97] describe different technologies necessary to enable long-term capabilities of robots in different contexts, including agriculture. In these examples, long-term autonomy is described as a central challenge based on the integration of various building blocks.
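One way to make the long-term aspect concrete is task prioritisation under environment dynamics: a recurring job such as weed monitoring can be scheduled by how much a field's state has likely drifted since it was last observed. A hypothetical sketch (field names, growth rates and the urgency heuristic are all invented for illustration):

```python
def next_field_to_visit(fields, now):
    """Pick the field whose state has most likely drifted since the last
    visit: urgency = weed growth rate x time since last observation."""
    return max(fields, key=lambda f: f["growth_rate"] * (now - f["last_visit"]))

# Hypothetical farm state: growth rates per day, last-visit timestamps (days).
fields = [
    {"name": "north", "growth_rate": 0.2, "last_visit": 0},
    {"name": "south", "growth_rate": 0.9, "last_visit": 5},
    {"name": "east",  "growth_rate": 0.5, "last_visit": 2},
]
choice = next_field_to_visit(fields, now=10)
```

Even this trivial heuristic shows why long-term autonomy is more than executing one plan: the robot's belief about the world decays at field-specific rates, and the scheduler must trade off revisiting against progressing.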
3.4.2 End-Effectors
Physical interactions between robots and the environment demand special attention and the development of well-suited tools. If conventional implements are to be used, this is often a matter of interfaces and control; when a robot is supposed to fulfil a task typically performed manually, like the plucking of larger fruit, new types of tools or approaches are necessary.
As mentioned in Sect. 2.5, there is a wealth of ongoing research on special end-effectors and their integration into fruit-harvesting robots operating in both field and greenhouse contexts. Examples include tomatoes [84], pumpkins [155], sweet pepper [8], strawberries [201] and others, although many questions remain as to which harvesting strategy and which tools will prove best. Soft end-effectors, still experimental and not yet widely used, may prove a promising avenue for further investigation [131].
Judging by the abundance of different individual solutions for different fruits, transferring robotic systems from one use case to another does not seem easy. This may be due to the detection systems or the individual requirements of the end-effectors, and further research seems worthwhile.
3.4.3 Unmanned Aerial Vehicles (Drones)
Although often remote-controlled by humans during critical phases such as take-off and landing (or even for their entire deployment), drones are also associated with robotics. With programmable missions, collision-avoidance systems, and automated path planning, drones become unmanned vehicles, a type of autonomous robot.
In agriculture, drones are typically used as low-cost but efficient sensing platforms, especially for RGB aerial imagery, although other sensor data such as lidar or multi-spectral imagery may also be collected, albeit less commonly. Overlapping with the field of remote sensing, drone images are used for plant monitoring, especially for pest and disease control [32, 37, 47, 109, 134, 167]. Less frequently, due to engineering and administrative challenges, drones can also be used to precisely spray pesticides onto small patches, drastically reducing overall agrochemical usage [61].
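Spot spraying reduces to a per-patch decision once weed densities have been estimated from aerial imagery. A toy sketch (the densities and the threshold are invented; real systems operate on geo-referenced weed maps):

```python
def spray_plan(weed_density, threshold=0.3):
    """Decide per patch whether to spray; returns the plan and the
    fraction of agrochemical saved versus blanket application."""
    plan = [[cell > threshold for cell in row] for row in weed_density]
    patches = sum(len(row) for row in plan)
    sprayed = sum(cell for row in plan for cell in row)
    return plan, 1 - sprayed / patches

# Hypothetical weed-density map (fraction of weed cover per patch).
density = [[0.05, 0.60, 0.10],
           [0.00, 0.45, 0.02]]
plan, saved = spray_plan(density)
```

In this invented example only two of six patches exceed the threshold, so two thirds of the blanket amount is saved; the reported gains in the literature depend heavily on how patchy the weed infestation actually is.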
3.4.4 Commercial Agricultural Robotics
Although the field was originally dominated by academic institutions, in the last decades the industrial sector has also joined robotics research and prototyping. Small and large corporations alike have contributed their own innovations and initiated collaboration efforts with the scientific community. Websites like ducksize.com,Footnote 3 industry-oriented conferences like World Fira, or the robotics exhibits at Ag-Tech fairs such as Agritechnica offer a good starting point for an overview of agricultural robots, both as products currently available and as prototypes under development. As previously mentioned, robotic systems come in many shapes and configurations, but a coarse distinction can be observed between hybrid or upgraded conventional machines, large robots without driver cabins, and small or very small robots that might be used in newer and less conventional ways.
Despite the progress illustrated by such products and prototypes, many open questions remain regarding architecture, data sovereignty, and interfacing, in addition to technological challenges. Especially in larger machines, which are meant to be coupled with different farming implements for multiple farming tasks, there are several open problems related to interfacing, compatibility, and automated decision-making, as well as concerns about functional safety.
4 Conclusion – The Role of AI in Future Agricultural Innovations
The introduction of this paper has argued that AI fits agriculture and, in particular, Ag Tech, based on the typical understanding of (applied, "narrow") AI as it is formulated in the OECD definition of AI Systems. We have surveyed a subset of the substantial body of literature that exists about AI in agriculture applications. To keep the survey manageable in size, we have focused on arable farming as it is currently practised in Central Europe, with short pointers to topics beyond this narrow focus, such as animal husbandry or organic farming. Extending the process view to other farming systems and to worldwide farming products and procedures would have revealed additional examples of AI applications, but it would not have changed the main point of our argument: that AI fits Ag Tech, that it is currently applied in Ag Tech machinery, and that there is ongoing research and development with future innovations already underway.
To conclude this survey: widening the scope regarding the role of AI in future Ag Tech should not be restricted to listing new or additional applications (as was the focus here), but could include potential AI-based improvements to current agricultural practices, be it on a field, in a barn, in greenhouses or elsewhere. Examples include providing operator assistance in current Ag machinery and processes, and making use of data from current machines to improve work efficiency, yielding economic or ecological benefits or reducing human labour, which is already hard to come by as much of the manual workforce shifts away from agricultural jobs.
Yet, there is another potential use of AI in agriculture, which may be even more influential for the future of agricultural practise: AI that enables autonomous machinery capable of uninterrupted, continuous work or with the ability to make recommendations based on analysing data at both a rate and scale far beyond human ability. These applications may enable agricultural processes and practices that differ substantially from those that we have today, which have co-evolved with more traditional technology (without AI) over the past decades (again, in Central Europe, but elsewhere in the world, too). To meet the agri-food-related Sustainable Development Goals (SDGs) of the United Nations, and do so in a changing world climate, it appears that adapting current agricultural practices, processes and technology is a necessity more than a choice.
AI can and should play a part in this as an enabler of such new processes, although the actual implementation and deployment should be led by agricultural experts. AI can now provide methods and technology not readily available when the current Ag Tech emerged, and has the potential to make machines more adaptive, more autonomous, smaller and easier to deploy in harsh environments. Consequently, AI also has the potential to enable agricultural systems that raise economic, ecological, and societal standards of food production worldwide. AI may take workload away from farmers, yet grant them control and responsibility for their own farms. We would argue that the AI research community is well aware, perhaps more than anybody else, that providing the technology to achieve such goals is extremely difficult. The possible contributions to achieving the UN SDGs, however, appear to be worth the effort.
Data availability
No data is used in this survey article.
Notes
“An AI system is a machine-based system that is capable of influencing the environment by making recommendations, predictions or decisions for a given set of objectives. It does so by utilising machine and/or human-based inputs/data to: i) perceive real and/or virtual environments; ii) abstract such perceptions into models manually or automatically; and iii) use model interpretations to formulate options for outcomes.”
https://www.scopus.com search prompt TITLE-ABS-KEY ( ( “Artificial Intelligence” OR “Cognitive Computing” OR “Expert System” OR “Machine Learning” OR “Deep learning” OR “Neural Network” OR “Decision Support System” OR “Learning Algorithm” OR “Learning Systems” OR “Algorithmic” OR “Supervised learning” OR “Unsupervised learning” OR “Reinforcement learning” ) AND “Agriculture” )
https://www.ducksize.com as of October 2023.
References
ISO 18497:2018-11 (2018) Agricultural machinery and tractors - Safety of highly automated agricultural machines - Principles for design. Beuth Verlag, Berlin
ISO 19206-2:2018-12 (2018) Road vehicles - Test devices for target vehicles, vulnerable road users and other objects, for assessment of active safety functions - Part 2: Requirements for pedestrian targets. Beuth Verlag, Berlin
Afrin M, Jin J, Rahman A et al (2021) Resource allocation and service provisioning in multi-agent cloud robotics: a comprehensive survey. IEEE Commun Surv Tutor. https://doi.org/10.1109/COMST.2021.3061435
Ahmadi A, Nardi L, Chebrolu N et al (2020) Visual servoing-based navigation for monitoring row-crop fields. IEEE Int Conf Robot Autom (ICRA). https://doi.org/10.1109/ICRA40945.2020.9197114
Alsalam BHY, Morton K, Campbell D, et al (2017) Autonomous UAV with vision based on-board decision making for remote sensing and precision agriculture. In: 2017 IEEE Aerospace Conference, pp 1–12. https://doi.org/10.1109/AERO.2017.7943593
Andrade-Sanchez P, Gore MA, Heun JT et al (2014) Development and evaluation of a field-based high-throughput phenotyping platform. Funct Plant Biol 41(1):68. https://doi.org/10.1071/FP13126
Arad B, Balendonck J, Barth R et al (2020) Development of a sweet pepper harvesting robot. J Field Robot 37(6):1027–1039. https://doi.org/10.1002/rob.21937
Arad B, Balendonck J, Barth R et al (2020) Development of a sweet pepper harvesting robot. J Field Robot. https://doi.org/10.1002/rob.21937
Åstrand B, Baerveldt AJ (2005) A vision based row-following system for agricultural field machinery. Mechatronics 15(2):251–269. https://doi.org/10.1016/j.mechatronics.2004.05.005
Bac CW, van Henten EJ, Hemming J et al (2014) Harvesting robots for high-value crops: state-of-the-art review and challenges ahead. J Field Robot 31(6):888–911. https://doi.org/10.1002/rob.21525
Bah MD, Hafiane A, Canals R (2020) Crownet: deep network for crop row detection in uav images. IEEE Access 8:5189–5200. https://doi.org/10.1109/ACCESS.2019.2960873
Bai G, Ge Y, Hussain W et al (2016) A multi-sensor system for high throughput field phenotyping in soybean and wheat breeding. Comput Electron Agric 128:181–192. https://doi.org/10.1016/j.compag.2016.08.021
Bai G, Ge Y, Scoby D et al (2019) NU-Spidercam: a large-scale, cable-driven, integrated sensing and robotic system for advanced phenotyping, remote sensing, and agronomic research. Comput Electron Agric 160:71–81. https://doi.org/10.1016/j.compag.2019.03.009
Bao J, Xie Q (2022) Artificial intelligence in animal farming: a systematic literature review. J Clean Prod 331:129956. https://doi.org/10.1016/j.jclepro.2021.129956
Bao Y, Gai J, Xiang L et al (2021) Field robotic systems for high-throughput plant phenotyping: A review and a case study. In: Zhou J, Nguyen HT (eds) High throughput crop phenotyping. Springer International Publishing, pp 13–38
Barth R, Isselmuiden J, Hemming J et al (2018) Data synthesis methods for semantic segmentation in agriculture: a Capsicum annuum dataset. Comput Electron Agric 144:284–296. https://doi.org/10.1016/j.compag.2017.12.001
Bawden O, Kulk J, Russell R et al (2017) Robot for weed species plant-specific management. J Field Robot 34(6):1179–1199. https://doi.org/10.1002/rob.21727
Bergerman M, Billingsley J, Reid J et al (2016) Robotics in agriculture and forestry. Springer. https://doi.org/10.1007/978-3-319-32552-1_56
Birkás M, Dekemati I, Kende Z et al (2017) Review of soil tillage history and new challenges in Hungary. Hung Geograph Bull 66(1):55–64. https://doi.org/10.15201/hungeobull.66.1.6
Bochtis DD, Sørensen CG, Green O (2012) A dss for planning of soil-sensitive field operations. Decis Support Syst 53(1):66–75
Bonacin R, Nabuco OF, Junior IP (2016) Ontology models of the impacts of agriculture and climate changes on water resources: Scenarios on interoperability and information recovery. Futur Gener Comput Syst 54:423–434
Boursianis AD, Papadopoulou MS, Diamantoulakis P et al (2022) Internet of things (IoT) and agricultural unmanned aerial vehicles (UAVs) in smart farming: a comprehensive review. Internet of Things 18:100187. https://doi.org/10.1016/j.iot.2020.100187
Caracciolo C, Stellato A, Morshed A et al (2013) The agrovoc linked dataset. Sem Web 4(3):341–348
Celli F, Malapela T, Wegner K, et al (2015) Agris: providing access to agricultural research data exploiting open data on the web. F1000Research 4
Chang CL, Xie BX, Chung SC (2021) Mechanical control with a deep learning method for precise weeding on a farm. Agriculture 11(11):1049. https://doi.org/10.3390/agriculture11111049
Chapman SC, Merz T, Chan A et al (2014) Pheno-copter: a low-altitude, autonomous remote-sensing robotic helicopter for high-throughput field-based phenotyping. Agronomy 4(2):279–301. https://doi.org/10.3390/agronomy4020279
Chebrolu N, Lottes P, Labe T et al (2019) Robot Localization Based on Aerial Images for Precision Agriculture Tasks in Crop Fields. 2019 International Conference on Robotics and Automation (ICRA). IEEE, Montreal, QC, Canada, pp 1787–1793
Chlingaryan A, Sukkarieh S, Whelan B (2018) Machine learning approaches for crop yield prediction and nitrogen status estimation in precision agriculture: A review. Comput Electron Agric 151:61–69. https://doi.org/10.1016/j.compag.2018.05.012
Christiansen P, Nielsen L, Steen K et al (2016) DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field. Sensors 16(11):1904. https://doi.org/10.3390/s16111904
Conesa-Muñoz J, Valente J, Del Cerro J et al (2016) A multi-robot sense-act approach to lead to a proper acting in environmental incidents. Sensors 16(8):1269
Corceiro A, Alibabaei K, Assunção E et al (2023) Methods for detecting and classifying weeds, diseases and fruits using ai to improve the sustainability of agricultural crops: A review. Processes 11(4):1263. https://doi.org/10.3390/pr11041263
Dainelli R, Toscano P, Di Gennaro S et al (2021) Recent advances in unmanned aerial vehicles forest remote sensing-a systematic review part ii: research applications. Forests. https://doi.org/10.3390/f12040397
Deery D, Jimenez-Berni J, Jones H et al (2014) Proximal remote sensing buggies and potential applications for field-based phenotyping. Agronomy 4(3):349–379. https://doi.org/10.3390/agronomy4030349
Demmel M, Kirchmeier H, Brandhuber R (2014) Konservierende Bodenbearbeitung - Technische Lösungen. VDI-MEG p 12
Devassy B, George S (2021) Estimation of strawberry firmness using hyperspectral imaging: a comparison of regression models. J Spectr Imaging. https://doi.org/10.1255/jsi.2021.a3
DFKI Kaiserslautern (2022) Yield Consortium: Die Fernerkundung aus dem All für die Landwirtschaft. https://www.dfki.de/web/news/yield-consortium-fernerkundung-aus-dem-all-fuer-landwirtschaft [Accessed: (23.10.2023)]
Dhaka V, Meena S, Rani G et al (2021) A survey of deep convolutional neural networks applied for prediction of plant leaf diseases. Sensors. https://doi.org/10.3390/s21144749
Di Cicco M, Potena C, Grisetti G, et al (2017) Automatic model based dataset generation for fast and accurate crop and weeds detection. In: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp 5188–5195. https://doi.org/10.1109/IROS.2017.8206408
Diepenbrock W, Ellmer F, Léon J (2016) Ackerbau. Grundwissen Bachelor. UTB, Pflanzenbau und Pflanzenzüchtung. https://doi.org/10.36198/9783838546070
Ditzler L, Driessen C (2022) Automating agroecology: how to design a farming robot without a monocultural mindset? J Agric Environ Ethics. https://doi.org/10.1007/s10806-021-09876-x
Dong Y, Xu F, Liu L et al (2020) Automatic system for crop pest and disease dynamic monitoring and early forecasting. IEEE J Sel Top Appl Earth Obs Remote Sens 13:4410–4418. https://doi.org/10.1109/JSTARS.2020.3013340
Dorj UO, Lee M, Ss Yun (2017) An yield estimation in citrus orchards via fruit detection and counting using image processing. Comput Electron Agric 140:103–112. https://doi.org/10.1016/j.compag.2017.05.019
Drury B, Fernandes R, Moura MF et al (2019) A survey of semantic web technology for agriculture. Inform Process Agric 6(4):487–501
Eli-Chukwu NC (2019) Applications of artificial intelligence in agriculture: a review. Eng Technol Appl Sci Res. https://doi.org/10.48084/etasr.2756
English A, Ross P, Ball D, et al (2014) Vision based guidance for robot navigation in agriculture. In: 2014 IEEE International Conference on Robotics and Automation (ICRA), pp 1693–1698. https://doi.org/10.1109/ICRA.2014.6907079
Eom S, Kim E (2006) A survey of decision support system applications (1995–2001). J Oper Res Soc 57:1264–1278
Esposito M, Crimaldi M, Cirillo V et al (2021) Drone and sensor technology for sustainable weed management: a review. Chem Biol Technol Agric. https://doi.org/10.1186/s40538-021-00217-8
European Commission (2020) From Farm to Fork: Our food, our health, our planet, our future. https://ec.europa.eu/commission/presscorner/api/files/attachment/874820/Farm%20to%20fork_EN_2023.pdf.pdf [Accessed: (30.10.2023)]
Everingham M, Eslami SMA, Van Gool L et al (2015) The pascal visual object classes challenge: a retrospective. Int J Comput Vision 111(1):98–136. https://doi.org/10.1007/s11263-014-0733-5
Fabbrizzi S, Papadopoulos S, Ntoutsi E et al (2022) A survey on bias in visual datasets. Comput Vis Image Underst 223:103552. https://doi.org/10.1016/j.cviu.2022.103552
Finckh MR, Hayer F, Schulte-Geldermann E et al (2008) Diversität, pflanzenernährung und prognose: ein integriertes konzept zum management der kraut-und knollenfäule in der ökologischen landwirtschaft. Gesunde Pflanz 60(4):159. https://doi.org/10.1007/s10343-008-0192-4
Fountas S, Mylonas N, Malounas I et al (2020) Agricultural robotics for field operations. Sensors. https://doi.org/10.3390/s20092672
Fu L, Yang Z, Wu F et al (2022) YOLO-banana: a lightweight neural network for rapid detection of banana bunches and stalks in the natural environment. Agronomy 12(2):391
Fuentes S, Gonzalez Viejo C, Tongson E et al (2022) The livestock farming digital transformation: implementation of new and emerging technologies using artificial intelligence. Anim Health Res Rev 23(1):59–71. https://doi.org/10.1017/S1466252321000177
Gai R, Chen N, Yuan H (2023) A detection algorithm for cherry fruits based on the improved YOLO-v4 model. Neural Comput Appl 35(19):895. https://doi.org/10.1007/s00521-021-06029-z
Ge Y, Xiong Y, Tenorio GL et al (2019) Fruit localization and environment perception for strawberry harvesting robots. IEEE Access 7:147642–147652. https://doi.org/10.1109/ACCESS.2019.2946369
Geiger A, Lenz P, Urtasun R (2012) Are we ready for autonomous driving? The KITTI vision benchmark suite. In: 2012 IEEE Conference on Computer Vision and Pattern Recognition, pp 3354–3361. https://doi.org/10.1109/CVPR.2012.6248074
Gerhards R, Andujar Sanchez D, Hamouz P et al (2022) Advances in site-specific weed management in agriculture-a review. Weed Res 62(2):123–133. https://doi.org/10.1111/wre.12526
Gimenez E, Salinas M, Manzano-Agugliaro F (2018) Worldwide research on plant defense against biotic stresses as improvement for sustainable agriculture. Sustainability 10(2):391. https://doi.org/10.3390/su10020391
Giusti E, Marsili-Libelli S (2015) A fuzzy decision support system for irrigation and water conservation in agriculture. Environ Model Softw 63:73–86
Hafeez A, Husain M, Singh S et al (2022) Implementation of drone technology for farm monitoring & pesticide spraying: a review. Inform Process Agric. https://doi.org/10.1016/j.inpa.2022.02.002
Hafeez A, Husain MA, Singh S et al (2022) Implementation of drone technology for farm monitoring & pesticide spraying: a review. Inform Process Agric. https://doi.org/10.1016/j.inpa.2022.02.002
Hasan A, Sohel F, Diepeveen D et al (2021) A survey of deep learning techniques for weed detection from images. Comput Electron Agric. https://doi.org/10.1016/j.compag.2021.106067
Haug S, Ostermann J (2015) A crop/weed field image dataset for the evaluation of computer vision based precision agriculture tasks. In: Computer Vision - ECCV 2014 Workshops, pp 105–116. https://doi.org/10.1007/978-3-319-16220-1_8
Haug S, Biber P, Michaels A, et al (2014) Plant Stem Detection and Position Estimation using Machine Vision. In: Workshop Proc. of Conf. on Intelligent Autonomous Systems (IAS), pp 483–490. https://api.semanticscholar.org/CorpusID:231630772
Haverkort A, Top J (2011) The potato ontology: delimitation of the domain, modelling concepts, and prospects of performance. Potato Res 54:119–136
Hawes N, Burbridge C, Jovan F et al (2017) The STRANDS project: long-term autonomy in everyday environments. IEEE Robot Autom Mag. https://doi.org/10.1109/MRA.2016.2636359
Helber P, Bischke B, Habelitz P, et al (2023) Crop yield prediction: An operational approach to crop yield modeling on field and subfield level with machine learning models. In: IGARSS 2023 - 2023 IEEE International Geoscience and Remote Sensing Symposium, pp 2763–2766. https://doi.org/10.1109/IGARSS52108.2023.10283302
Hespeler SC, Nemati H, Dehghan-Niri E (2021) Non-destructive thermal imaging for object detection via advanced deep learning for robotic inspection and harvesting of chili peppers. Artif Intell Agric 5:102–117. https://doi.org/10.1016/j.aiia.2021.05.003
Hoogenboom G, Jones J, Wilkens P, et al (2004) Decision support system for agrotechnology transfer version 4.0. University of Hawaii, Honolulu, HI (CD-ROM)
Hoogenboom G, Porter CH, Boote KJ, et al (2019) The DSSAT crop modeling ecosystem. In: Advances in crop modelling for a sustainable agriculture. Burleigh Dodds Science Publishing, pp 173–216
Houetohossou SCA, Houndji VR, Hounmenou CG et al (2023) Deep learning methods for biotic and abiotic stresses detection and classification in fruits and vegetables: state of the art and perspectives. Artif Intell Agric. https://doi.org/10.1016/j.aiia.2023.08.001
Hughes DP, Salathe M (2015) An open access repository of images on plant health to enable the development of mobile disease diagnostics. arXiv preprint arXiv:1511.08060. https://arxiv.org/abs/1511.08060
Hüllmann JA (2022) Explainable AI in farming: configurations of human-AI joint decision-making. In: Proceedings of Reshaping Work 2022 Conference. https://doi.org/10.2139/ssrn.4224804
Ilyas T, Khan A, Umraiz M, et al (2021) Multi-scale context aggregation for strawberry fruit recognition and disease phenotyping. IEEE Access 9:124491–124504. https://doi.org/10.1109/ACCESS.2021.3110978
Iqbal J, Xu R, Sun S et al (2020) Simulation of an autonomous mobile robot for LiDAR-based in-field phenotyping and navigation. Robotics. https://doi.org/10.3390/robotics9020046
Iqbal N, Bracke J, Elmiger A, et al (2023) Evaluating synthetic vs. real data generation for AI-based selective weeding. In: Hoffmann C, Stein A, Ruckelshausen A, et al (eds) 43. GIL-Jahrestagung, Resiliente Agri-Food-Systeme. Gesellschaft für Informatik e.V., Bonn, pp 125–135
Jiang GQ, Zhao CJ, Si YS (2010) A machine vision based crop rows detection for agricultural robots. In: 2010 International Conference on Wavelet Analysis and Pattern Recognition, pp 114–118. https://doi.org/10.1109/ICWAPR.2010.5576422
Jiang P, Osteen P, Wigness M, et al (2021) Rellis-3d dataset: Data, benchmarks and analysis. In: 2021 IEEE International Conference on Robotics and Automation (ICRA), pp 1110–1116. https://doi.org/10.1109/ICRA48506.2021.9561251
Jiang Y, Li C, Robertson JS et al (2018) GPhenoVision: a ground mobile system with multi-modal imaging for field-based high throughput phenotyping of cotton. Sci Rep 8(1):1213. https://doi.org/10.1038/s41598-018-19142-2
Jin Y, Liu J, Xu Z et al (2021) Development status and trend of agricultural robot technology. Int J Agric Biol Eng. https://doi.org/10.25165/j.ijabe.20211404.6821
Jonquet C, Toulet A, Arnaud E et al (2018) Agroportal: a vocabulary and ontology repository for agronomy. Comput Electron Agric 144:126–143
Joo S, Koide S, Takeda H, et al (2016) Agriculture activity ontology: An ontology for core vocabulary of agriculture activity. In: ISWC (Posters & Demos)
Jun J, Kim J, Seol J, et al (2021) Towards an efficient tomato harvesting robot: 3D perception, manipulation, and end-effector. IEEE Access 9:17631–17640. https://doi.org/10.1109/ACCESS.2021.3052240
Junker A, Muraya MM, Weigelt-Fischer K et al (2015) Optimizing experimental procedures for quantitative evaluation of crop plant performance in high throughput phenotyping systems. Front Plant Sci. https://doi.org/10.3389/fpls.2014.00770
Junos MH, Mohd Khairuddin AS, Thannirmalai S et al (2022) Automatic detection of oil palm fruits from UAV images using an improved YOLO model. Vis Comput 38(7):2341–2355. https://doi.org/10.1007/s00371-021-02116-3
Kan JC, Ferreira CSS, Destouni G et al (2023) Predicting agricultural drought indicators: ML approaches across wide-ranging climate and land use conditions. Ecol Indic 154:110524. https://doi.org/10.1016/j.ecolind.2023.110524
Khanal S, Kc K, Fulton JP et al (2020) Remote sensing in agriculture-accomplishments, limitations, and opportunities. Remote Sens 12(22):3783. https://doi.org/10.3390/rs12223783
Kirchgessner N, Liebisch F, Yu K et al (2017) The ETH field phenotyping platform FIP: a cable-suspended multi-sensor system. Funct Plant Biol 44(1):154. https://doi.org/10.1071/FP16165
Kitpo N, Inoue M (2018) Early rice disease detection and position mapping system using drone and IoT architecture. In: 2018 12th South East Asian Technical University Consortium (SEATUC), IEEE, pp 1–5. https://doi.org/10.1109/SEATUC.2018.8788863
Klyushin D, Tymoshenko A (2021) Optimization of drip irrigation systems using artificial intelligence methods for sustainable agriculture and environment. In: Artificial Intelligence for Sustainable Development: Theory, Practice and Future Applications, pp 3–17
Knapp S, van der Heijden MG (2018) A global meta-analysis of yield stability in organic and conservation agriculture. Nat Commun 9(1):3632. https://doi.org/10.1038/s41467-018-05956-1
Kragh MF, Christiansen P, Laursen MS et al (2017) FieldSAFE: dataset for obstacle detection in agriculture. Sensors 17(11):2579. https://doi.org/10.3390/s17112579
Krause JC, Martinez J, Gennet H, et al (2023) AI-TEST-FIELD - An Agricultural Test Environment for Semantic Environment Perception with Respect to Harsh And Changing Environmental Conditions. In: 2023 ASABE Annual International Meeting. ASABE, St. Joseph, MI, ASABE Paper No. 2300757, p 1
Kumar N, Belhumeur PN, Biswas A, et al (2012) Leafsnap: A computer vision system for automatic plant species identification. In: Fitzgibbon A, Lazebnik S, Perona P, et al (eds) Computer Vision – ECCV 2012. Springer Berlin Heidelberg, Berlin, Heidelberg, pp 502–516. https://doi.org/10.1007/978-3-642-33709-3_36
Kunze L, Hawes N, Duckett T et al (2018) Artificial intelligence for long-term robot autonomy: a survey. IEEE Robot Autom Lett. https://doi.org/10.1109/LRA.2018.2860628
Kuratorium für Technik und Bauwesen (KTBL) (2015) KTBL Fachartikel Bodenbearbeitung und Bestellung. https://www.ktbl.de/fileadmin/user_upload/Artikel/Pflanzenbau/Bodenbearbeitung/Bodenbearbeitung_und_Bestellung_2015.pdf. Accessed 24 Oct 2023
Kusumam K, Krajník T, Pearson S et al (2017) 3D-vision based detection, localization, and sizing of broccoli heads in the field. J Field Robot 34(8):1505–1518. https://doi.org/10.1002/rob.21726
Lac L, Keresztes B, Louargant M et al (2022) An annotated image dataset of vegetable crops at an early stage of growth for proximal sensing applications. Data Brief 42:108035. https://doi.org/10.1016/j.dib.2022.108035
Landwirtschaftskammer Nordrhein-Westfalen (2009) Bodenbearbeitungsverfahren, Ratgeber 2009. https://web.archive.org/web/20110926095607/http://www.landwirtschaftskammer.de/landwirtschaft/ackerbau/boden/bodenbearbeitungsverfahren-pdf.pdf. Accessed 23 Oct 2023
Landwirtschaftskammer Nordrhein-Westfalen (2015) Bodenbearbeitungssysteme. https://www.landwirtschaftskammer.de/landwirtschaft/ackerbau/boden/bodenbearbeitungssysteme-pdf.pdf. Accessed 23 Oct 2023
Larmande P, Todorov K (2021) AgroLD: a knowledge graph for the plant sciences. In: International Semantic Web Conference, Springer, pp 496–510
Lawan A, Rakib A, Alechina N, et al (2014) Advancing underutilized crops knowledge using SWRL-enabled ontologies - a survey and early experiment. In: JIST (Workshops & Posters), pp 69–84
Leake A (2003) Integrated pest management for conservation agriculture. Conservation agriculture: environment, farmers experiences, innovations, socio-economy, policy, pp 271–279. https://doi.org/10.1007/978-94-017-1143-2_33
Leifeld J (2012) How sustainable is organic farming? Agric Ecosyst Environ 150:121–122. https://doi.org/10.1016/j.agee.2012.01.020
Liebisch F, Kirchgessner N, Schneider D et al (2015) Remote, aerial phenotyping of maize traits with a mobile multi-sensor approach. Plant Methods. https://doi.org/10.1186/s13007-015-0048-8
Lin TY, Maire M, Belongie S et al (2014) Microsoft coco: common objects in context. In: Fleet D, Pajdla T, Schiele B et al (eds) Computer Vision - ECCV 2014. Springer International Publishing, Cham, pp 740–755
Liu J, Xiang J, Jin Y et al (2021) Boost precision agriculture with unmanned aerial vehicle remote sensing and edge intelligence: a survey. Remote Sens. https://doi.org/10.3390/rs13214387
Liu M, Wang T, Skidmore AK et al (2018) Heavy metal-induced stress in rice crops detected using multi-temporal sentinel-2 satellite images. Sci Total Environ 637:18–29. https://doi.org/10.1016/j.scitotenv.2018.04.415
Lochner H, Breker J (2019) Agrarwirtschaft: Fachstufe Landwirt: Fachtheorie für pflanzliche Produktion, tierische Produktion und Energieproduktion. Eugen Ulmer KG
Lottes P, Behley J, Chebrolu N, et al (2018a) Joint Stem Detection and Crop-Weed Classification for Plant-Specific Treatment in Precision Farming. In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp 8233–8238. https://doi.org/10.1109/IROS.2018.8593678
Lottes P, Behley J, Milioto A et al (2018) Fully convolutional networks with sequential information for robust crop and weed detection in precision farming. IEEE Robot Autom Lett 3(4):2870–2877. https://doi.org/10.1109/LRA.2018.2846289
Lytridis C, Kaburlasos VG, Pachidis T et al (2021) An overview of cooperative robotics in agriculture. Agronomy. https://doi.org/10.3390/agronomy11091818
Mancini M, Mazzoni L, Gagliardi F et al (2020) Application of the non-destructive NIR technique for the evaluation of strawberry fruits quality parameters. Foods 9(4):441. https://doi.org/10.3390/foods9040441
Matteis L, Chibon PY, Espinosa H, et al (2013) Crop ontology: Vocabulary for crop-related concepts. In: Proceedings of the First International Workshop on Semantics for Biodiversity (S4BioDiv)
McCool C, Sa I, Dayoub F, et al (2016) Visual detection of occluded crop: For automated harvesting. In: 2016 IEEE International Conference on Robotics and Automation (ICRA), pp 2506–2512. https://doi.org/10.1109/ICRA.2016.7487405
McCool C, Perez T, Upcroft B (2017) Mixtures of lightweight deep convolutional neural networks: applied to agricultural robotics. IEEE Robot Autom Lett 2(3):1344–1351. https://doi.org/10.1109/LRA.2017.2667039
McCool C, Beattie J, Firn J et al (2018) Efficacy of mechanical weeding tools: a study into alternative weed management strategies enabled by robotics. IEEE Robot Autom Lett. https://doi.org/10.1109/LRA.2018.2794619
Meltebrink C, Ströer T, Wegmann B et al (2021) Concept and realization of a novel test method using a dynamic test stand for detecting persons by sensor systems on autonomous agricultural robotics. Sensors 21(7):2315. https://doi.org/10.3390/s21072315
Meltebrink C, Strotdresch M, Wegmann B et al (2022) Humanoid test target for the validation of sensor systems on autonomous agricultural machines. Agric Eng. https://doi.org/10.15150/LT.2022.3282
Milioto A, Lottes P, Stachniss C (2018) Real-Time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs. In: 2018 IEEE International Conference on Robotics and Automation (ICRA), pp 2229–2235. https://doi.org/10.1109/ICRA.2018.8460962
Misra G, Cawkwell F, Wingler A (2020) Status of phenological research using sentinel-2 data: a review. Remote Sens 12(17):2760. https://doi.org/10.3390/rs12172760
Mitchell J, Carter L, Reicosky D et al (2016) A history of tillage in California's Central Valley. Soil Tillage Res 157:52–64. https://doi.org/10.1016/j.still.2015.10.015
Mittler R (2006) Abiotic stress, the field environment and stress combination. Trends Plant Sci 11(1):15–19. https://doi.org/10.1016/j.tplants.2005.11.002
Mohanty SP, Hughes DP, Salathé M (2016) Using deep learning for image-based plant disease detection. Front Plant Sci. https://doi.org/10.3389/fpls.2016.01419
Mohimont L, Steffenel LA, Roesler M, et al (2022) AI-driven yield estimation using an autonomous robot for data acquisition. In: Artificial Intelligence for Digitising Industry - Applications. River Publishers, pp 279–288
Momeny M, Jahanbakhshi A, Neshat AA et al (2022) Detection of citrus black spot disease and ripeness level in orange fruit using learning-to-augment incorporated deep networks. Ecol Inform 71:101829. https://doi.org/10.1016/j.ecoinf.2022.101829
Mondelaers K, Aertsens J, Van Huylenbroeck G (2009) A meta-analysis of the differences in environmental impacts between organic and conventional farming. British Food Journal 111(10):1098–1119. https://doi.org/10.1108/00070700910992925
Navarro-Hellín H, Martinez-del Rincon J, Domingo-Miguel R et al (2016) A decision support system for managing irrigation in agriculture. Comput Electron Agric 124:121–131
Navas E, Fernández R, Sepúlveda D et al (2021) Soft grippers for automatic crop harvesting: a review. Sensors. https://doi.org/10.3390/s21082689
Neethirajan S (2023) The significance and ethics of digital livestock farming. AgriEngineering 5(1):488–505. https://doi.org/10.3390/agriengineering5010032
Ngo QH, Kechadi T, Le-Khac NA (2022) Knowledge representation in digital agriculture: a step towards standardised model. Comput Electron Agric 199:107127
Ngugi L, Abelwahab M, Abo-Zahhad M (2021) Recent advances in image processing techniques for automated leaf pest and disease recognition - a review. Inform Process Agric. https://doi.org/10.1016/j.inpa.2020.04.004
Noyan MA (2022) Uncovering bias in the PlantVillage dataset. https://doi.org/10.48550/arXiv.2206.04374
OECD (2019) Scoping the OECD AI principles. OECD Digital Economy Papers. https://doi.org/10.1787/d62f618a-en
Onishi Y, Yoshida T, Kurita H et al (2019) An automated fruit harvesting robot by using deep learning. ROBOMECH J 6(1):13. https://doi.org/10.1186/s40648-019-0141-2
Pallottino F, Antonucci F, Costa C et al (2019) Optoelectronic proximal sensing vehicle-mounted technologies in precision agriculture: A review. Comput Electron Agric 162:859–873. https://doi.org/10.1016/j.compag.2019.05.034
Pan Y, Magistri F, Läbe T et al (2023) Panoptic mapping with fruit completion and pose estimation for horticultural robots. https://doi.org/10.48550/arXiv.2303.08923
Partel V, Kakarla SC, Ampatzidis Y (2019) Development and evaluation of a low-cost and smart technology for precision weed management utilizing artificial intelligence. Comput Electron Agric 157:339–350. https://doi.org/10.1016/j.compag.2018.12.048
Peigné J, Ball B, Roger-Estrade J et al (2007) Is conservation tillage suitable for organic farming? a review. Soil Use Manag 23(2):129–144. https://doi.org/10.1111/j.1475-2743.2006.00082.x
Peña JM, Torres-Sánchez J, de Castro AI et al (2013) Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images. PLOS One. https://doi.org/10.1371/journal.pone.0077151
Peynot T, Scheding S, Terho S (2010) The Marulan data sets: multi-sensor perception in natural environment with challenging conditions. Int J Robot Res 29(13):1602–1607. https://doi.org/10.1177/0278364910384638
Pezzementi Z, Tabor T, Hu P et al (2018) Comparing apples and oranges: off-road pedestrian detection on the national robotics engineering center agricultural person-detection dataset. J Field Robot 35(4):545–563. https://doi.org/10.1002/rob.21760
Polic M, Tabak J, Orsag M (2022) Pepper to fall: a perception method for sweet pepper robotic harvesting. Intel Serv Robot 15(2):193–201. https://doi.org/10.1007/s11370-021-00401-7
Polvara R, Molina S, Hroob I et al (2023) Bacchus long-term (blt) data set: acquisition of the agricultural multimodal blt data set with automated robot deployment. J Field Robot. https://doi.org/10.1002/rob.22228
Pour MK, Fotouhi R, Hucl P et al (2021) Development of a mobile platform for field-based high-throughput wheat phenotyping. Remote Sens. https://doi.org/10.3390/rs13081560
Pretto A, Aravecchia S, Burgard W et al (2021) Building an aerial-ground robotics system for precision farming: an adaptable solution. IEEE Robot Autom Mag. https://doi.org/10.1109/MRA.2020.3012492
Ramin Shamshiri R, Weltzien C, Hameed I et al (2018) Research and development in agricultural robotics: a perspective of digital farming. Int J Agric Biol Eng. https://doi.org/10.25165/j.ijabe.20181104.4278
Recio B, Rubio F, Criado JA (2003) A decision support system for farm planning using AgriSupport II. Decis Support Syst 36(2):189–203
Reganold JP, Wachter JM (2016) Organic agriculture in the twenty-first century. Nat Plants 2(2):1–8. https://doi.org/10.1038/nplants.2015.221
Reganold JP, Elliott LF, Unger YL (1987) Long-term effects of organic and conventional farming on soil erosion. Nature 330(6146):370–372. https://doi.org/10.1038/330370a0
Reina G, Milella A, Rouveure R et al (2016) Ambient awareness for agricultural robotic vehicles. Biosys Eng 146:114–132. https://doi.org/10.1016/j.biosystemseng.2015.12.010
Research Institute of Organic Agriculture FiBL (2023) Organic area share of total farmland. https://statistics.fibl.org. Accessed 30 Oct 2023
Roshanianfard A, Noguchi N (2020) Pumpkin harvesting robotic end-effector. Comput Electron Agric 174:105503. https://doi.org/10.1016/j.compag.2020.105503
Roussey C, Chanet JP, Cellier V, et al (2013) Agronomic taxon. In: Proceedings of the 2nd International Workshop on Open Data, pp 1–4
Russakovsky O, Deng J, Su H et al (2015) ImageNet Large Scale Visual Recognition Challenge. Int J Comput Vis (IJCV) 115(3):211–252. https://doi.org/10.1007/s11263-015-0816-y
Sa I, Chen Z, Popović M et al (2018) weedNet: dense semantic weed classification using multispectral images and MAV for smart farming. IEEE Robot Autom Lett 3(1):588–595. https://doi.org/10.1109/LRA.2017.2774979
Saiz-Rubio V, Rovira-Más F (2020) From smart farming towards agriculture 5.0: a review on crop data management. Agronomy 10(2):207. https://doi.org/10.3390/agronomy10020207
Sajith G, Srinivas R, Golberg A et al (2022) Bio-inspired and artificial intelligence enabled hydro-economic model for diversified agricultural management. Agric Water Manag 269:107638
Saleem M, Potgieter J, Arif K (2021) Automation in agriculture by machine and deep learning techniques: a review of recent developments. Precision Agric. https://doi.org/10.1007/s11119-021-09806-x
Sankaran S, Khot LR, Espinoza CZ et al (2015) Low-altitude, high-resolution aerial imaging systems for row and field crop phenotyping: a review. Eur J Agron 70:112–123. https://doi.org/10.1016/j.eja.2015.07.004
Schütze N, Schmitz GH (2010) Occasion: new planning tool for optimal climate change adaption strategies in irrigation. J Irrig Drain Eng 136(12):836–846
Segarra J, Buchaillot ML, Araus JL et al (2020) Remote sensing for precision agriculture: sentinel-2 improved features and applications. Agronomy 10(5):641. https://doi.org/10.3390/agronomy10050641
Seifi A, Ehteram M, Singh VP et al (2020) Modeling and uncertainty analysis of groundwater level using six evolutionary optimization algorithms hybridized with ANFIS, SVM, and ANN. Sustainability 12(10):4023. https://doi.org/10.3390/su12104023
Shankar P, Werner N, Selinger S, et al (2020) Artificial intelligence driven crop protection optimization for sustainable agriculture. In: 2020 IEEE/ITU International Conference on Artificial Intelligence for Good (AI4G), IEEE, pp 1–6
Sharma A, Jain A, Gupta P et al (2021) Machine learning applications for precision agriculture: a comprehensive review. IEEE Access. https://doi.org/10.1109/ACCESS.2020.3048415
Shi Y, Thomasson JA, Murray SC et al (2016) Unmanned aerial vehicles for high-throughput phenotyping and agronomic research. PLOS ONE. https://doi.org/10.1371/journal.pone.0159781
Shin J, Chang YK, Heung B et al (2021) A deep learning approach for RGB image-based powdery mildew disease detection on strawberry leaves. Comput Electron Agric 183:106042. https://doi.org/10.1016/j.compag.2021.106042
Silwal A, Davidson JR, Karkee M et al (2017) Design, integration, and field evaluation of a robotic apple harvester. J Field Robot. https://doi.org/10.1002/rob.21715
Singh D, Jain N, Jain P, et al (2020) PlantDoc: a dataset for visual plant disease detection. In: Proceedings of the 7th ACM IKDD CoDS and 25th COMAD. Association for Computing Machinery, New York, NY, USA, pp 249–253. https://doi.org/10.1145/3371158.3371196
Sishodia RP, Ray RL, Singh SK (2020) Applications of remote sensing in precision agriculture: a review. Remote Sens 12(19):3136. https://doi.org/10.3390/rs12193136
Söderkvist O (2001) Computer vision classification of leaves from Swedish trees
Sood A, Sharma RK, Bhardwaj AK (2022) Artificial intelligence research in agriculture: a review. Online Information Review
Sozzi M, Cantalamessa S, Cogato A et al (2022) Automatic bunch detection in white grape varieties using YOLOv3, YOLOv4, and YOLOv5 deep learning algorithms. Agronomy 12(2):319. https://doi.org/10.3390/agronomy12020319
Sparrow R, Howard M (2021) Robots in agriculture: prospects, impacts, ethics, and policy. Precision Agric. https://doi.org/10.1007/s11119-020-09757-9
Spykman O, Gabriel A, Ptacek M et al (2021) Farmers’ perspectives on field crop robots - evidence from bavaria, germany. Comput Electron Agric. https://doi.org/10.1016/j.compag.2021.106176
Stenberg JA (2017) A conceptual framework for integrated pest management. Trends Plant Sci 22(9):759–769. https://doi.org/10.1016/j.tplants.2017.06.010
Su J, Coombes M, Liu C, et al (2018) Wheat drought assessment by remote sensing imagery using unmanned aerial vehicle. In: 2018 37th Chinese Control Conference (CCC), IEEE, pp 10340–10344. https://doi.org/10.23919/ChiCC.2018.8484005
Syed-Ab-Rahman SF, Hesamian MH, Prasad M (2022) Citrus disease detection and classification using end-to-end anchor-based deep learning model. Appl Intell 52(1):927–938. https://doi.org/10.1007/s10489-021-02452-w
Tataridas A, Kanatas P, Chatzigeorgiou A et al (2022) Sustainable crop and weed management in the era of the EU green deal: a survival guide. Agronomy. https://doi.org/10.3390/agronomy12030589
Tiedemann T, Cordes F, Keppner M et al (2022) Challenges of autonomous in-field fruit harvesting and concept of a robotic solution. In: Proceedings of the 19th International Conference on Informatics in Control, Automation and Robotics. SCITEPRESS - Science and Technology Publications, pp 508–515. https://doi.org/10.5220/0011321300003271
Trevini M, Benincasa P, Guiducci M (2013) Strip tillage effect on seedbed tilth and maize production in Northern Italy as case-study for the Southern Europe environment. Eur J Agron 48:50–56. https://doi.org/10.1016/j.eja.2013.02.007
Underwood J, Wendel A, Schofield B et al (2017) Efficient in-field plant phenomics for row-crops with an autonomous ground vehicle. J Field Robot 34(6):1061–1083. https://doi.org/10.1002/rob.21728
Utstumo T, Urdal F, Brevik A et al (2018) Robotic in-row weed control in vegetables. Comput Electron Agric 154:36–45. https://doi.org/10.1016/j.compag.2018.08.043
Van Klompenburg T, Kassahun A, Catal C (2020) Crop yield prediction using machine learning: a systematic literature review. Comput Electron Agric 177:105709. https://doi.org/10.1016/j.compag.2020.105709
Wang Q, Nuske S, Bergerman M, et al (2013) Automated crop yield estimation for apple orchards. In: Experimental Robotics: The 13th International Symposium on Experimental Robotics, Springer, pp 745–758. https://doi.org/10.1007/978-3-319-00065-7_50
Wang S, Jiang H, Qiao Y et al (2022) The research progress of vision-based artificial intelligence in smart pig farming. Sensors. https://doi.org/10.3390/s22176541
Wang T, Xu X, Wang C et al (2021) From smart farming towards unmanned farms: a new mode of agricultural production. Agriculture. https://doi.org/10.3390/agriculture11020145
Wegener JK (2021) Entwicklungen im Bereich der Anwendungstechnik im Pflanzenschutz gestern, heute und morgen. J Kulturpflanzen. https://doi.org/10.5073/JFK.2021.07-08.12
Wenkel KO, Berg M, Mirschel W et al (2013) Landcare dss-an interactive decision support system for climate change impact assessment and the analysis of potential agricultural land use adaptation strategies. J Environ Manage 127:S168–S183
White JW, Conley MM (2013) A flexible, low-Cost cart for proximal sensing. Crop Sci 53(4):1646–1649. https://doi.org/10.2135/cropsci2013.01.0054
Wigness M, Eum S, Rogers JG, et al (2019) A RUGD dataset for autonomous navigation and visual perception in unstructured outdoor environments. In: 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp 5000–5007. https://doi.org/10.1109/IROS40897.2019.8968283
Windisch W (2022) Brauchen wir in Zukunft überhaupt noch Nutztiere? In: Nutztierschutztagung 2022, pp 13–20. https://raumberg-gumpenstein.at/jdownloads/Tagungen/Nutztierschutztagung/Nutztierschutztagung_2022/3n_2022_Gesamt.pdf#page=14. Accessed 30 Oct 2023
Winterhalter W, Fleckenstein FV, Dornhege C et al (2018) Crop row detection on tiny plants with the pattern hough transform. IEEE Robot Autom Lett 3(4):3394–3401. https://doi.org/10.1109/LRA.2018.2852841
Wolf P, Berns K (2021) Data-fusion for robust off-road perception considering data quality of uncertain sensors. In: 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, Prague, Czech Republic, pp 6876–6883. https://doi.org/10.1109/IROS51168.2021.9636541
Wu SG, Bao FS, Xu EY et al (2007) A leaf recognition algorithm for plant classification using probabilistic neural network. In: 2007 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT). https://doi.org/10.1109/ISSPIT.2007.4458016
Wu X, Aravecchia S, Pradalier C (2019) Design and Implementation of Computer Vision based In-Row Weeding System. In: 2019 International Conference on Robotics and Automation (ICRA), pp 4218–4224. https://doi.org/10.1109/ICRA.2019.8793974
Xiao JR, Chung PC, Wu HY et al (2021) Detection of strawberry diseases using a convolutional neural network. Plants 10(1):31. https://doi.org/10.3390/plants10010031
Xie D, Chen L, Liu L et al (2022) Actuators and sensors for application in agricultural robots: a review. Machines. https://doi.org/10.3390/machines10100913
Xiong Y, Ge Y, Grimstad L et al (2020) An autonomous strawberry-harvesting robot: design, development, integration, and field evaluation. J Field Robot 37(2):202–224. https://doi.org/10.1002/rob.21889
Yuan S, Wang H, Xie L (2021) Survey on localization systems and algorithms for unmanned systems. Unmanned Syst. https://doi.org/10.1142/S230138502150014X
Zhai Z, Martínez JF, Beltran V et al (2020) Decision support systems for agriculture 4.0: survey and challenges. Comput Electron Agric 170:105256. https://doi.org/10.1016/j.compag.2020.105256
Zheng Q, Huang W, Cui X et al (2018) New spectral index for detecting wheat yellow rust using sentinel-2 multispectral imagery. Sensors 18(3):868. https://doi.org/10.3390/s18030868
Zou Z, Chen K, Shi Z et al (2023) Object detection in 20 years: a survey. Proc IEEE 111(3):257–276. https://doi.org/10.1109/JPROC.2023.3238524
Acknowledgements
The DFKI Niedersachsen is supported by the Ministry of Science and Culture of Lower Saxony from funds of the program zukunft.niedersachsen of the Volkswagen Foundation.
Funding
Open Access funding enabled and organized by Projekt DEAL.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Kisliuk, B., Krause, J.C., Meemken, H. et al. AI in Current and Future Agriculture: An Introductory Overview. Künstl Intell 37, 117–132 (2023). https://doi.org/10.1007/s13218-023-00826-5