Abstract
The chapter aims to provide guidance on how phenotyping may contribute to the genetic advance of wheat in terms of yield potential and resilience to adverse conditions. Emphasis is given to field high throughput phenotyping, including affordable solutions, together with the need for environmental and spatial characterization. Different remote sensing techniques and platforms are presented, while among lab techniques only a well proven trait, carbon isotope composition, is included. Finally, data integration and its implementation in practice are discussed. In that sense, and considering the physiological determinants of wheat yield that are amenable to indirect selection, we highlight stomatal conductance and stay green as key observations. This choice of traits and phenotyping techniques is based on results from a large set of retrospective and other physiological studies that have proven the value of these traits together with the highlighted phenotyping approaches.
1 Learning Objectives
-
Understanding how phenotyping may contribute to wheat genetic advance, and which techniques may be applied.
2 Introduction
Phenotyping is nowadays considered a major bottleneck limiting breeding efforts [1]. In fact, high throughput precision phenotyping is becoming more accepted as a viable way to capitalize on recent developments in crop genomics (see Chaps. 28 and 29) and prediction models (see Chap. 31). However, for many breeders, the adoption of new phenotyping traits and methodologies only makes sense if they provide added value relative to current phenotyping practices. In that sense, a basic concern for many breeders is still the controlled nature of many of the phenotyping platforms developed in recent years and the perception that most of these platforms are unable to fully replicate the environmental variables influencing complex traits at the scale of climate variability, nor handle the large numbers of phenotypes required by breeding programs [2]. This does not exclude, for example, the interest of indoor (i.e., fully controlled) platforms for specific studies or traits, or even the need to develop special outdoor (i.e., near field) but still controlled facilities. This is the case of phenotyping arrangements aimed at evaluating resilience to particular stressors (e.g., diseases, pests, waterlogging…) or the performance of hidden plant parts (i.e., roots) or non-laminar photosynthetic organs (e.g., ears, culms). While this chapter will focus on the general aspects of wheat phenotyping, specific information about such special setups is abundant elsewhere.
Phenotyping of simple traits (e.g., plant height) can be achieved even by untrained personnel within a manageable time frame. However, manual phenotyping of complex traits, which is often the case when focusing on drought or heat tolerance, requires experienced professionals and is time intensive. Another important point to consider is that phenotyping of large genotype sets is generally only feasible if conducted by several persons. Moreover, when phenotyping is conducted visually, measurement error is inflated, may be further increased by fatigue, and is prone to the subjective appreciation of each evaluator. A recent paper [3] has described high throughput phenotyping as “relatively new for most breeders and requiring significantly greater investment with technical hurdles for implementation and a steeper learning curve than the minimum data set,” where visual assessments are often the preferred choice.
In what follows, this chapter will address crop phenotyping within the context of its implementation under real growing (i.e., field) conditions. Literature and examples included will refer as much as possible to wheat or other small grain cereals under field conditions. In that sense we will introduce the term high throughput field phenotyping (HTFP).
The aim of an efficient phenotyping method is to enhance genetic gain (Fig. 27.1), which is defined as the amount of increase in performance achieved per unit time through artificial selection (see Chap. 7), usually referring to the increase after one generation (or cycle) has passed. In what follows, the potential contribution of phenotyping to wheat breeding is placed in context by taking the determinants of genetic advance as a frame of reference. Alternative ways to dissect the role of phenotyping in genetic gain have been assessed elsewhere [4].
Accelerating genetic gain can be achieved by increasing selection intensity, accuracy and genetic variation, and/or reducing cycle time (see also Chap. 30). Phenotyping contributes both directly and indirectly to these variables [5]. Direct effects include: increasing selection intensity through the development and deployment of more high throughput phenotyping techniques that allow larger populations to be evaluated, eventually across different environments, which is the main purpose of this chapter; improving selection accuracy, which involves the repeatability and precision of the phenotyping techniques deployed; and identifying new genetic variability for the targeted traits, which, provided that it exists [6], may be secured through preselection using very high throughput, affordable approaches, even if they are not as accurate [4].
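The four levers above can be related through the classical breeder's equation for genetic gain per unit time; the notation below is the standard one from quantitative genetics rather than a formula given in this chapter:

```latex
% Genetic gain per unit time: selection intensity (i), selection accuracy (r),
% additive genetic standard deviation (\sigma_A) and cycle length (L)
\Delta G = \frac{i \cdot r \cdot \sigma_A}{L}
```

Read this way, high throughput phenotyping acts mainly on the numerator (larger evaluated populations raise i, better measurements raise r, preselection helps preserve σ_A), while earlier-stage assessments shorten L.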
Indirect positive effects are diverse but also relevant. Low-cost phenotyping protocols allow breeders to increase selection intensity and identify new genetic variability, for example through the evaluation of larger populations. Phenotyping means more than just selecting the right traits and choosing the appropriate tools for evaluation, together with efficient data management. It also requires appropriate trial management and spatial variability handling [1, 5]. Improved trial management and field variation control will increase the selection accuracy of phenotyping and thus the heritability of the trait being selected (see Chaps. 5, 6, 7 and 12). Therefore, selection accuracy is also improved by deploying phenotyping techniques to account for the growing conditions where plants are phenotyped (spatial variability in environmental factors, which may itself involve the use of phenotyping techniques). While phenotyping does not directly contribute toward decreasing cycle time, it is likely to play a more important role indirectly. For example, targeted HTFP will permit the reliable phenotyping of greater numbers of genetic resources derived from breeding lines by using smaller plot sizes and assessments obtained at earlier stages of population development. This allows breeders to reduce the duration of breeding cycles and the loss of potentially important alleles through linkage drag [4], therefore contributing to increasing the genetic gain (see Chap. 7). Moreover, while most efforts are directed toward direct selection for yield, indirect selection for physiological, morphological or biochemical yield-component traits can provide the opportunity to introduce new alleles from which genetic progress can be made [4].
Phenotypic expression is the response of genotypes to varied environmental conditions (GxE) or even to the agronomical management practices (GxExM), and therefore the full disentangling of the link between plant phenotype and its genetic background cannot be achieved (see Chap. 15) without considering the full and accurate quantification of the environmental and agronomical conditions experienced during growth [5]. Therefore, appropriate documentation of the environmental growth conditions is essential for any crop phenomics strategy. This implies a systematic collection and integration of meteorological data at different spatio-temporal scales, frequently using low-cost sensors [7]. Finally, new avenues for data management and exploitation are required in order to optimally capitalize on recent improvements in data capture and computation capacity.
Summarizing, the objective of practical phenotyping innovation is the implementation of high throughput precision phenotyping under real (i.e., field in most cases) conditions and preferably at an affordable cost. On the other hand, proper HTFP requires some basic uniform characteristics, such as similar phenology across the whole set of varieties selected, as well as the identification of the right growth stage (or stages) at which phenotyping has to be conducted. In other words, a phenotypic trait may have a positive, negative or no relationship with grain yield or another target parameter depending on the growth stage at measurement, and this differential performance may also depend on the growing conditions. The phenotypic performance may even be biased if the targeted germplasm is too diverse in terms of phenology (e.g., heading, anthesis or maturity dates). Therefore, in addition to choosing the optimal phenotypic traits, the time at which they are assessed, while avoiding too wide a genotypic range in phenology, is also crucial. This applies for remote sensing traits such as vegetation indices as well as for lab traits such as carbon isotope composition [6, 8].
3 Platforms: From Ground to the Sky
The drawbacks of most time-consuming phenotyping methods in terms of throughput and standardization can be overcome using image-based data collection. Remote sensing technologies, with the respective controllers and data loggers that complement the imaging systems, are usually assembled into what are termed phenotyping platforms [5, 7, 9, 10]. The use of these platforms allows for more efficient and accurate phenotyping, with stable error across all genotypes, whether as single plants or in micro-plots. However, many of these platforms are currently costly and/or not applicable on a wide scale. Therefore, there is a strongly expressed need in the crop breeding community to develop HTFP platforms that are both state-of-the-art and cost-effective, easy to use, and nonstationary. These platforms may also represent tailored solutions to specific cases, or a feasible formula for applying standard phenotyping tools in breeding programs with limited resources [11].
The concept of the phenotyping platform is broad and embraces a varied range of options in terms of placement: ground, aerial or even eventually (in the coming years) space level (Fig. 27.2; [10]). Within the category of ground phenotyping, platforms have quickly diversified and the range of options is very wide: from simple hand-held sensors, including for example monopods and tripods carrying anything from a simple yet effective RGB color camera, to complex unmanned ground vehicles of diverse nature, generically termed “phenomobiles,” which include tractor-mounted sensors, other tailored solutions (e.g., carts, buggies) or mobile cranes. Within the ground category one may also include highly complex stationary facilities. Cable-based robotics systems, which allow imaging platforms to move about a defined area, are also becoming an alternative for outdoor (i.e., field) phenotyping [12]. Within the hand-held category of platforms, smartphones are becoming an alternative given that they may carry different imagers (e.g., RGB and thermal) and perform data management and geo-referencing functions [5, 7, 9, 10].
Aerial platforms of different nature are being widely used, increasingly involving unmanned aerial vehicles (UAV), popularly known as drones. Proximal and remote sensing sensors can now be mounted on low flying multirotor UAVs, with image acquisition capabilities at spatial scales of centimeters, relevant to crop breeding [13]. The use of drones has become popular in recent years [10] and book manuals (even if mostly focused on crop management) have even been produced. The remote sensing tools most frequently deployed in phenotyping platforms are RGB cameras, alongside multispectral and thermal sensors or imagers [14]. The increasing availability of compact drones that do not need to be assembled, come with the sensors embedded, and are affordable, reliable and easy to control, is popularizing this option more and more (Fig. 27.3). Nevertheless, other unmanned options offer appealing alternatives with contrasting capabilities, particularly fixed wing UAVs where, for example, the crop area to monitor is larger than a few hectares, or in related precision agriculture activities [15]. Other alternatives such as manned aircraft are less used by crop breeders given their cost, while the use of satellites for phenotyping is not yet practical due to the lack of free sub-meter resolution data, but satellites will surely be of increasing interest in the near future as these technologies advance [5, 10].
Here we outline standards for deploying simple stationary platforms, cable-based robotics and ultimately UAVs as progressively more mobile and high throughput phenotyping platforms for the transport of the various proximal and remote sensing sensors/imagers. The primary selection criterion for equipping HTFP platforms is the choice of the most adequate sensors for the estimation of the specific biophysical traits of interest, at an appropriate technology readiness level. Many of the simultaneous major technological advancements in HTFP platforms come from the impressive miniaturization of imaging and measurement technologies and on-board processing capacities but, perhaps even more importantly, from massive leaps in communications, compact and lightweight batteries, inertial sensors, electronic compasses, data storage and intelligent automated control algorithms. These have come together to enable the development of more compact and lightweight scientific imaging sensors and, at the same time, improved indoor robotics systems and UAVs with increasing autonomy, carrying capacity, stability and security. The result is that scientific-quality remote sensing platforms and sensors that only 5 years ago were nearly exclusively limited to very expensive indoor installations and manned airborne platforms (only able to provide ground spatial resolutions, a.k.a. pixel sizes, on the order of 5–20 m and thus not amenable to phenotyping) are now available in more cost-effective unmanned systems. In fact, UAVs are nowadays the most popular mobile platform for phenotyping purposes [16].
4 Phenotyping Is More than Just Monitoring Techniques
Crop phenotyping is about collecting useful and meaningful data for integration into crop breeding programs. As such, a complete HTFP platform research protocol should include considerations for every part of the full process in order to ensure that no bottlenecks impede the throughput of the phenotyping activities. This includes but is not limited to: (1) the equipment (sensors, platforms and software); (2) operation (e.g., pilot permits and training, flight plans, and image acquisition in the case of UAVs); (3) proper storing and managing of experimental datasets for long term use; (4) image processing (pre-processing, calibration, mosaicking); (5) data generation (extraction from processed images to plot level data); (6) data analysis (index calculations, statistical scripts) and database structure (storage, linkages, inventory indexing, ontologies, etc.); (7) specific case studies of bottlenecks to throughput, training requirements, costs, and optimization for specific crops (scalable/transferable traits) [7]. All the major components, from sensors to platforms to software for each key processing step, are intricately intertwined and need to be considered together, such that pre-integrated systems or close attention to integration details will improve both data quality and data throughput. Some examples have been provided for the deployment of RGB images.
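As an illustration of step (5), data generation, the sketch below extracts plot-level means from a vegetation-index raster; the raster values, plot layout and function name are hypothetical, assuming rectangular micro-plots aligned to the image grid.

```python
# Sketch of step (5), plot-level data generation: given a 2D vegetation-index
# raster (a plain nested list here) and hypothetical plot bounding boxes in
# pixel coordinates, extract one mean value per micro-plot.
from statistics import mean

def extract_plot_means(raster, plots):
    """raster: 2D list of index values; plots: {plot_id: (row0, row1, col0, col1)}."""
    results = {}
    for plot_id, (r0, r1, c0, c1) in plots.items():
        pixels = [raster[r][c] for r in range(r0, r1) for c in range(c0, c1)]
        results[plot_id] = mean(pixels)
    return results

# Toy 4 x 6 raster covering two adjacent 4 x 3 micro-plots
raster = [[0.2, 0.2, 0.2, 0.8, 0.8, 0.8] for _ in range(4)]
plots = {"plot_A": (0, 4, 0, 3), "plot_B": (0, 4, 3, 6)}
means = extract_plot_means(raster, plots)
```

In a real pipeline the bounding boxes would come from a georeferenced plot map and the raster from the mosaicked imagery, but the extraction step is conceptually this simple.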
5 Data Integration: From Ideotype to Modelling and More
Connecting genomic and phenomic datasets remains challenging. It is in this context that plant phenotyping is creating new needs for data standardization, analysis and storage [17]. In addition, the value of phenotypic data is moving from an empirical/descriptive context, where the ideotype, understood as the fixed combination of traits that confers advantage to a given wheat genotype, was the target, to the use of phenotypic data in a more mechanistic way, through simulation models aiming to predict genotype performance (see Chaps. 31 and 32). In that sense, the integration of phenotypic data into simulation models to predict trait value is of increasing importance [18]. Besides that, large amounts of phenotypic data are used in a statistically oriented manner for marker-assisted and, even more, for genomic selection (see Chaps. 28 and 29). The future of crop breeding lies in the standardization of data collection across phenotyping platforms.
On the other hand, the development of specific software tools that meet the needs of the crop phenotyping community in terms of remote sensing data processing, extraction and analysis has been identified as potentially the greatest bottleneck for generating high quality phenotypic data [19]. This includes, for example, the development of intuitive, easy-to-use semiautomatic programs for microplot extraction, encompassing also appropriate flight planning to capture images with sufficient quality, which implies relevant concepts such as view, sharpness and exposure calculations, in addition to considering ground control points (GCPs), viewing geometry and way-point flights [20]. These new software tools will need to be integrated to include not only the assessment of crop growth performance (including for example crop establishment and stay green) and grain yield, but also the detection and quantification of phenological stages (heading or maturity times and even anthesis), agronomical yield components (ear density), total biomass, or the identification of specific pests and diseases and the further quantification of their impact.
Overall, there is a great need for new analytical approaches that can integrate multiple types of data or provide proper experimental design in observational contexts. This need will only grow with the development of imaging, sequencing, and sensing technologies. A recent push in this direction has been an emphasis on machine learning and artificial intelligence in phenotyping [21]. Concerning trait measurements, implementing machine learning methods on UAV data enhances the capability of data processing and prediction in various applications [16], such as wheat ear counting [22]. High spatial resolution UAV-based remote sensing imagery, with a resolution between 0 and 10 cm, is the most frequently employed data source amongst those utilized for machine learning approaches [16]. Classification and regression are the two main prediction problems commonly addressed in UAV-based applications. Taking RGB images as a proximal remote sensing approach may increase the resolution of the images and therefore their usefulness when analyzed with machine learning methods. Thus, for example, an RGB camera placed on a pole at 1.2 m from the ground provided a ground spatial resolution better than 0.2 mm, able to assess the thickness of the residual stems left standing after cutting by the combine harvester. In that case, a Faster Regional Convolutional Neural Network (Faster-RCNN) deep-learning model was first trained to identify the stem cross sections [23]. Machine learning algorithms can be implemented using either open source or commercial software. Open source coding environments such as Python and R are freely available and may be redistributed and modified.
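To make the regression case concrete, the sketch below fits a straight line relating a per-plot vegetation index to grain yield with ordinary least squares; all numbers are synthetic illustrations, not data from the cited studies.

```python
# Minimal ordinary-least-squares fit of plot grain yield against a UAV-derived
# vegetation index (synthetic values, pure-Python implementation).
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx  # slope and intercept

ndvi = [0.55, 0.60, 0.65, 0.70, 0.75]        # per-plot index values
grain_yield = [4.1, 4.6, 5.1, 5.6, 6.1]      # per-plot yields (t/ha), synthetic
slope, intercept = fit_line(ndvi, grain_yield)
predicted = slope * 0.68 + intercept          # yield predicted for a new plot
```

Classification problems (e.g., ear counting with Faster-RCNN) follow the same train-then-predict logic, only with image patches as inputs and class labels as outputs.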
6 Affordable Phenotyping Approaches
Many of the desired phenotypic traits can be acquired using cost-effective and readily available RGB cameras, which are characterized as very high spatial resolution imaging sensors with quality color calibration and PAR spectral coverage (Fig. 27.3). These are extensively addressed elsewhere (e.g., [7]). In short, several RGB vegetation indices use the spectral concept for the estimation of biomass and canopy chlorophyll, while others are based on alternate color space transforms such as Hue Saturation Intensity (HSI), CIE-LAB and CIE-LUV [24]. Practical solutions exist for the calculation of these RGB vegetation indices using free, open-source software. Thus, for example, our team at the University of Barcelona has developed open-source software tools for analyzing high resolution RGB digital images, with special consideration to cost-effectiveness, technology availability and computing capacity, using digital cameras or smartphones for data acquisition. Besides the formulation of vegetation indices amenable to monitoring crop growth and stay green, or to quantifying the impact of a given pest or disease affecting the green biomass, examples exist of the use of RGB images for specific purposes such as assessing ear density [22]. Recently, methods have been proposed to phenotype the early development of wheat, specifically to assess the rate of plant emergence, the number of tillers, and the beginning of stem elongation, using drone-based RGB imagery. Moreover, the characteristics of digital RGB images, together with the support of machine learning approaches, make feasible the automatic identification of plant deficiencies and biotic stresses based on the shape and pattern of leaf symptoms such as chlorosis, necrotic spots, etc.
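As a minimal sketch of one such color-transform index, the fraction of green pixels can be computed from the hue channel of the HSI/HSV transform; the 60–180 degree hue window and the toy pixel values are assumptions for illustration, not the exact thresholds of any published tool.

```python
# Green Area (GA) fraction: proportion of pixels whose hue falls in an assumed
# green range (60-180 degrees), computed via the RGB-to-HSV transform.
import colorsys

def green_area(pixels, hue_range=(60.0, 180.0)):
    """pixels: iterable of (R, G, B) tuples with 0-255 channel values."""
    lo, hi = hue_range
    green = sum(
        1 for r, g, b in pixels
        if lo <= colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0] * 360 <= hi
    )
    return green / len(pixels)

# Toy 4-pixel "image": two green canopy pixels, one soil-brown, one grey
pixels = [(40, 160, 40), (60, 200, 80), (150, 100, 60), (128, 128, 128)]
ga = green_area(pixels)  # 2 of 4 pixels fall in the green hue range
```

On real imagery the same per-pixel rule is applied to the full plot, and the GA fraction tracks ground cover, early vigor or stay green over time.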
Besides RGB sensors, in recent years a wide range of affordable multispectral imagers, and even thermal imagers, as well as dual multispectral/RGB or thermal/RGB imagers, has become available, making HTFP more feasible for, for example, small seed companies and national agricultural research organizations.
The main traits that can be measured in the field using affordable HTP approaches are listed in Table 27.1, together with the corresponding sensors/indices and a qualitative assessment of their precision.
7 Hyperspectral Imaging for Crop Phenotyping: Pros and Cons
Hyperspectral sensors and cameras are among the most promising for the phenotyping of advanced traits. The application of hyperspectral reflectance to proximal (i.e., ground level) plant phenotyping at high resolution makes it possible to infer, in the case of wheat under field conditions, not only grain yield but also, for example, the content of metabolites in leaves and ears [28], or photosynthetic capacities and quenching. Hyperspectral imaging techniques have been expanding considerably in recent years. The cost of current solutions is decreasing, but these high-end technologies are not yet available for moderate to low-cost outdoor or indoor applications. However, new methodological developments, such as a single-pixel imaging setup [29], which do not require as much investment or high computational capacity, may offer a more approachable alternative.
In spite of the recent availability of hyperspectral UAV sensors, both the sensor design and the resulting data entail several complications at the time of capture, pre-processing, calibration, and analysis [13]. Firstly, “hyper” literally means “too much,” so hyperspectral sensors are openly acknowledged as frequently capturing more data than is necessary for any specific given purpose. For that reason, they are, and will continue to be, considered more exploratory and experimental than operational sensors. It is up to the scientific community to take on the challenge of first acquiring what may be considered excess data in order to later distil this “big data” down to the essential, and then prescribe the more specific measurements required for any particular measurement goal, in this case the phenotyping of photosynthesis and biophysical traits relevant to the disentangling of genetic sequencing data and maximizing yield to feed the future [30].
Moreover, the use of hyperspectral images from moving platforms, such as those carried on UAVs, poses additional challenges [13]. Unlike sensors that capture whole images in one instant, such as RGB cameras (which capture three separate spectral regions in one image via an integrated Bayer filter) and the more common multispectral cameras (which capture each spectral region with a different sensor and are later corrected for parallax), most hyperspectral cameras are not of the “area array” type. Most hyperspectral imagers are of the “line scanning” type, which requires a moving mirror and spectral prism to iteratively measure each wavelength over a single line of pixels as the UAV moves forward. This requires carefully programmed internal sensor movements timed with the external robotics platform, or UAV flights at specific forward velocities relative to the distance between the sensor and the crop. The data are thus also more likely to be adversely affected by environmental conditions and gimbal instability. In turn, the carrying platform and hyperspectral camera system must be fully integrated, sharing inertial measurement unit (IMU) data (yaw, pitch and roll) and positioning data, whether local or GPS (geographical location), in order to create a correct hyperspectral image. Ground topographical variability, if present, should also be optimally corrected for using a separately produced digital elevation model (DEM).
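The "specific forward velocity" constraint can be sketched numerically: for contiguous scan lines, the UAV ground speed should equal the ground sampling distance (GSD) times the line rate. The altitude, pixel pitch, focal length and line rate below are illustrative assumptions, not specifications of any particular sensor.

```python
# Forward-speed constraint for a line-scanning imager: each scan line should
# cover exactly one new GSD-sized strip of ground (no gaps, no overlap).
def ground_sampling_distance(altitude_m, pixel_pitch_m, focal_length_m):
    """Across-track ground pixel size from flight height and camera geometry."""
    return altitude_m * pixel_pitch_m / focal_length_m

def required_speed(gsd_m, line_rate_hz):
    """UAV forward speed (m/s) matching the scan-line rate."""
    return gsd_m * line_rate_hz

gsd = ground_sampling_distance(30.0, 12e-6, 0.016)  # 30 m altitude, 12 um pixels, 16 mm lens
speed = required_speed(gsd, 100.0)                  # sensor scanning 100 lines per second
```

Flying faster than this leaves gaps between scan lines, while flying slower duplicates ground strips, which is one reason line-scan flights are far less forgiving than area-array surveys.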
Still, despite these complications, adequately integrated hyperspectral sensors and platforms, from stationary solutions to UAVs, are available and may provide excellent data, provided there is in-depth knowledge and expertise in data interpretation and processing. The more common UAV multispectral sensors are based precisely on the extensive data analysis from field spectroscopy and airborne hyperspectral imaging conducted by research laboratories over the past 40 years [31]. The best bands for measuring specific plant spectral properties associated with physiological traits of interest have been selected with regard to both their specific central wavelength and their bandwidth (the range of wavelengths where radiation is measured), and the sensors designed accordingly. However, no full VNIR+SWIR hyperspectral sensors were available for application as HTFP with the specific purpose of crop breeding until very recently, due to the many technological barriers that impeded their deployment on UAVs, and as such the linkage between spectral wavelengths and breeding traits has not been completed.
8 Implementing Phenotyping in Practice
Some approaches for practical wheat phenotyping will be briefly presented taking grain yield as the breeding target (Fig. 27.4). A thorough set of examples of traits and conditions where phenotyping may be applied in practice at different levels (handy, high throughput, and precision phenotyping) may be accessed elsewhere [3].
Identifying the key traits for phenotyping may result from convergent approaches. On one hand, grain yield may be dissected into three main physiological components: the amount of resources (radiation, water, nitrogen…) captured by the crop, the efficiency in the use of these resources, and the dry matter partitioning (the so-called harvest index). The kind of resource considered depends in each case on the limiting factor (e.g., under drought conditions or low nitrogen fertility, water and nitrogen will be the relevant resources, respectively). On the other hand, retrospective studies comparing cultivars developed through the last decades also provide clues on the physiological traits that have been most successful, to date, in the genetic advance after the Green Revolution. For this approach it is important to avoid the confounding effects associated with including in the comparison genotypes developed prior to the Green Revolution, or even transitional ones. In that sense, genetic advance in wheat across a wide range of environmental conditions has been associated with a higher stomatal conductance [32]. Remote sensing techniques such as infrared thermometry or thermography may be deployed as proxies for higher transpiration [1, 4, 5, 9]. An alternative is to use stable carbon isotopes, one of the few widely accepted lab-phenotyping traits. Usually a lower (i.e., more negative) carbon isotope composition (δ13C) or, alternatively, a higher carbon isotope discrimination (Δ13C) is pursued, particularly when analyzed in mature kernels and when confounding effects (such as differences in phenology [6]) are avoided, since it indicates a better water status and eventually more water captured by the crop, despite the fact that water use efficiency decreases. Another trait to consider is stay green, which may be relevant particularly under good agronomical conditions [33]. This trait may be assessed through multispectral or RGB-derived vegetation indices measured during grain filling [24, 34].
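The two isotope notations mentioned above are interconvertible; a minimal sketch following the standard relation between composition and discrimination is shown below, assuming an atmospheric δ13C of about −8‰ (an assumed constant, not a value measured here).

```python
# Carbon isotope discrimination (Delta13C) from plant and air carbon isotope
# composition (delta13C), both in permil; delta_air = -8 permil is an assumption.
def carbon_isotope_discrimination(delta_plant, delta_air=-8.0):
    return (delta_air - delta_plant) / (1 + delta_plant / 1000)

# A kernel delta13C of -26 permil corresponds to a discrimination near 18.5 permil,
# so a lower (more negative) delta13C maps onto a higher Delta13C, as in the text.
disc = carbon_isotope_discrimination(-26.0)
```

This makes explicit why selecting for lower δ13C and for higher Δ13C are equivalent criteria when genotypes share the same atmosphere.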
The same category of indices may be used to assess early vigor and ground cover. The three main categories of remote sensing approaches (RGB, thermal and multispectral/hyperspectral) may be used to assess differences in phenology, particularly heading and anthesis.
Digital RGB imaging may allow 3D surface reconstruction to provide estimations of plant height and incidence of lodging, while image-pattern recognition may help to identify the presence of a pest or disease, whose impact may be further quantified by RGB or multispectral vegetation indices [26].
Greater biomass is also considered a key target trait for selection, particularly since harvest index is reaching its theoretical maximum, while the increase in biomass has been minor during the more than half a century elapsed from the beginning of the Green Revolution to the present. HTFP, particularly when deployed from an aerial platform, allows the assessment of biomass through different techniques in the full plot rather than in subsamples. Moreover, there is the capacity to undertake repeated measurements, which may improve the estimation [4]. A priori, the most canonical way to assess biomass is using LiDAR (Light Detection and Ranging) mounted on an aerial platform or on a “phenomobile” [26]. However, still today the most common way to assess green biomass is through vegetation indices, either multispectral or RGB-derived, given the common perception of these approaches being more affordable and easier to use than LiDAR [26]. An inherent limitation of vegetation indices, however, is that they saturate, which makes their use less effective during the central part of the crop cycle, even if they are still of value to assess early stages of growth or stay green. Moreover, vegetation indices do not inform about canopy height. Nevertheless, a more accurate determination of green biomass than that associated with vegetation indices, together with plant height, may also be achieved using RGB images, this time through three-dimensional reconstruction of the crop canopy. This evaluation may improve further if canopy height is combined with the number and thickness of the stems, evaluated through high-resolution RGB images [23].
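The saturation effect can be illustrated with NDVI, the classical multispectral index; the reflectance values below are illustrative numbers chosen only to show how the index compresses at high biomass, not measurements.

```python
# NDVI = (NIR - Red) / (NIR + Red); note how it flattens once the canopy closes,
# so large biomass differences map onto small index differences (saturation).
def ndvi(nir, red):
    return (nir - red) / (nir + red)

sparse = ndvi(0.40, 0.10)  # open canopy, early growth
dense = ndvi(0.50, 0.03)   # closed canopy
denser = ndvi(0.55, 0.02)  # substantially more biomass, nearly the same NDVI
```

Between the open and closed canopy the index jumps markedly, but beyond canopy closure further biomass gains shift it only marginally, which is why 3D reconstruction or LiDAR complements the indices mid-season.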
Another potential target for current phenotyping, which has been traditionally neglected, is the photosynthetic contribution of the ear to grain filling. A recent study has confirmed that genotypic variability exists for this trait and has moreover shown the first examples of HTFP for it [35]; in addition, the advent of remote sensing techniques based on the combination of RGB imaging for in situ organ detection together with thermal and/or multispectral imaging may allow in the near future the evaluation of this trait from aerial platforms [36].
However, there still exist several areas not fully explored in terms of HTFP protocols, such as root phenotyping, just one among many hidden yet very important attributes to consider in new potential phenotyping target traits.
9 Key Concepts
HTFP may contribute to speeding genetic advance in different ways. Nevertheless, phenotyping under controlled conditions may still have applicability in some cases. Usually there is not a single technological solution; rather, different options in terms of throughput and even cost are available. In this sense, affordable phenotyping techniques, including various sensors and platforms, are more approachable than ever before. Remote sensing techniques are the most commonly used for phenotyping, but other approaches, such as lab-based traits, may also be useful. Eventually, hyperspectral techniques may even replace many lab-based approaches. Besides that, image processing and, even more, data analysis, including prediction models, are the actual components of the phenotyping pipeline that will allow full exploitation of new technological developments, in terms of traits and platforms, for HTFP.
10 Conclusions
As a take-home message, phenotyping is evolving very fast in terms of throughput, the range of traits that can be assessed, and the adaptation of sensor and platform costs to a growing market for these technologies. However, the computing and statistical components remain the most commonly perceived bottleneck limiting HTFP from reaching full operability. This covers a wide range of areas: from the automation of data capture and subsequent data processing, to the use of the data produced to drive prediction models or even their integration and application in genomic selection.
In this sense, remote sensing techniques will become more accessible to breeders if image analysis services were to become more widely available, affordable and automated (i.e., user friendly), providing curated phenotypic data in near real time. As examples, onboard data pre-processing and 5G in-flight data transmission are two of the main paths forward for simplified processing and improved usability of remote sensing sensors. Both go hand-in-hand with improvements in sensor-platform integration, in which the sensor and platform have become more and more interconnected and are thus able to share GPS, altimetry, IMU, power sources and transmission capacities for improved efficiency and operability. Manual UAV flights and separate manual programming of sensor data capture are already in the past. In many modern commercial UAVs, smartphone connectivity already converts the UAV controller into an all-in-one command station for programming flight paths, viewing UAV and sensor details in-flight, and even limited data viewing and downloading in real time. Also, in smart sensors such as the Tetracam Macaw system (https://www.tetracam.com/Products-Macaw.htm), images are calibrated to reflectance, corrected for parallax and combined into multiband TIFFs, or even processed into programmable vegetation indices in flight by the onboard micro-processor and fast solid-state disk drives; these also include Wi-Fi to smartphone connectivity.
Even though the current wireless connectivity of these sensors cannot yet keep pace with onboard data capture and automated pre-processing, both of these, including even UAV hyperspectral data, should become processable and transmissible in real time with 5G connectivity. This would enable automation of the rest of the pre-processing, from Structure-from-Motion orthomosaicking on to micro-plot extraction (given the proper GIS metadata), either on a PC or in cloud-based services interconnected with UAV functionality or specific sensors, or as a third-party solution, such as DroneMapper, Pix4Dcloud, Agisoft Cloud, Micasense Atlas, DroneDeploy, and many more (http://dronemapper.com, https://www.pix4d.com/product/pix4dcloud, https://cloud.agisoft.com, https://atlas.micasense.com, see also https://micasense.com/software-solutions). Given that there are already precision agriculture UAVs that can detect the presence or absence of specific pests or diseases and spray accordingly, using onboard imaging and artificial-intelligence decision support, the next step for plant phenotyping must be close behind.
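The final step of that pipeline, micro-plot extraction from an orthomosaic, can be sketched in a few lines. Everything here is assumed for illustration: the orthomosaic is a random array standing in for a Structure-from-Motion product, and the plot windows are hypothetical pixel rectangles that a real pipeline would derive from GIS plot polygons and the mosaic's geotransform.

```python
import numpy as np

# Mock orthomosaic layer (e.g., an NDVI band) as a 2-D array; in practice this
# would come from Structure-from-Motion processing of the UAV images.
ortho = np.random.default_rng(0).uniform(0.2, 0.9, size=(100, 200))

# Hypothetical micro-plot boundaries as (row0, row1, col0, col1) pixel windows.
plots = {"plot_101": (10, 30, 20, 60),
         "plot_102": (10, 30, 70, 110)}

# One curated value per plot: the mean of the layer over each plot window.
plot_means = {name: float(ortho[r0:r1, c0:c1].mean())
              for name, (r0, r1, c0, c1) in plots.items()}
for name, value in plot_means.items():
    print(name, round(value, 3))
```

The output of such a step, one value per plot per trait per date, is precisely the "curated phenotypic data" format that breeding programs can feed directly into their analysis pipelines.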
On the other hand, a pending task is the routine assessment under field conditions of particular traits relevant for grain and fodder quality (e.g., contents of amino acids, micronutrients, provitamins), or for HTFP in frontier areas such as breeding for higher and more efficient photosynthesis. This will be feasible through hyperspectral techniques, provided not only that computing capabilities are optimized but also that the cost of hyperspectral sensors and imagers decreases.
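As a minimal illustration of how hyperspectral data map onto physiological traits, the sketch below computes the Photochemical Reflectance Index (PRI), which uses reflectance at 531 and 570 nm and has been related to radiation use efficiency [see the Garbulsky et al. review in the reference list]. The cube here is random data with an assumed 1 nm wavelength axis; a real cube would come from a hyperspectral imager.

```python
import numpy as np

# Toy hyperspectral cube: (rows, cols, bands) with a known wavelength axis.
wavelengths = np.arange(400, 801, 1)            # assumed 1 nm sampling
cube = np.random.default_rng(1).uniform(0.05, 0.6,
                                        size=(5, 5, wavelengths.size))

def band(cube, wavelengths, target_nm):
    """Return the band image closest to a target wavelength."""
    return cube[:, :, np.argmin(np.abs(wavelengths - target_nm))]

# PRI: a normalized difference of reflectance at 531 nm and 570 nm.
r531, r570 = band(cube, wavelengths, 531), band(cube, wavelengths, 570)
pri = (r531 - r570) / (r531 + r570)
print(pri.shape)
```

The same band-selection pattern extends to any narrow-band index, which is what makes hyperspectral imagers attractive as potential replacements for lab-based trait assays.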
References
Araus JL, Cairns JE (2014) Field high-throughput phenotyping: the new crop breeding frontier. Trends Plant Sci 19:52–61
Pauli D, Chapman SC, Bart R, Topp CN, Lawrence-Dill CJ, Poland J, Gore MA (2016) The quest for understanding phenotypic variation via integrated approaches in the field environment. Plant Physiol 172:00592.2016. https://doi.org/10.1104/pp.16.00592
Reynolds M, Chapman S, Crespo-Herrera L, Molero G, Mondal S, Pequeno DNL, Pinto F, Pinera-Chavez FJ, Poland J, Rivera-Amado C, Saint Pierre C, Sukumaran S (2020) Breeder friendly phenotyping. Plant Sci 295:110396. https://doi.org/10.1016/j.plantsci.2019.110396
Rebetzke GJ, Jimenez-Berni J, Fischer RA, Deery DM, Smith DJ (2019) High-throughput phenotyping to enhance the use of crop genetic resources. Plant Sci 282:40–48. https://doi.org/10.1016/j.plantsci.2018.06.017
Araus JL, Kefauver SC, Zaman-Allah M, Olsen MS, Cairns JE (2018) Translating high-throughput phenotyping into genetic gain. Trends Plant Sci 23:451–466. https://doi.org/10.1016/j.tplants.2018.02.001
Araus JL, Slafer GA, Royo C, Serret MD (2008) Breeding for yield potential and stress adaptation in cereals. Crit Rev Plant Sci 27:377–412. https://doi.org/10.1080/07352680802467736
Araus JL, Kefauver SC (2018) Breeding to adapt agriculture to climate change: affordable phenotyping solutions (Review). Curr Opin Plant Biol 45:237–247. https://doi.org/10.1016/j.pbi.2018.05.003
Sanchez-Bragado R, Newcomb M, Chairi F, Condorelli GE, Ward R, White JW, Maccaferri M, Tuberosa R, Araus JL, Serret MD (2020) Carbon isotope composition and the NDVI as phenotyping approaches for drought adaptation in durum wheat: beyond trait selection. Agronomy 10:1679. https://doi.org/10.3390/agronomy10111679
Araus JL, Kefauver SC, Zaman-Allah M, Olsen MS, Cairns JE (2018) Phenotyping: new crop breeding frontier. In: Meyers R (ed) Encyclopedia of sustainability science and technology. Springer, New York
Jin X, Zarco-Tejada P, Schmidhalter U, Reynolds MP, Hawkesford MJ, Varshney RK, Yang T, Nie C, Li Z, Ming B, Xiao Y, Xie Y, Li S (2020) High-throughput estimation of crop traits: a review of ground and aerial phenotyping platforms. IEEE Geosci Remote Sens:1–33. https://doi.org/10.1109/MGRS.2020.2998816
Reynolds D, Baret F, Welcker C, Bostrom A, Ball J, Cellini F, Lorence A, Chawade A, Khafif M, Noshita K, Mueller-Linow M, Zhou J, Tardieu F (2019) What is cost-efficient phenotyping? Optimizing costs for different scenarios. Plant Sci 282:14–22. https://doi.org/10.1016/j.plantsci.2018.06.015
Andrade-Sanchez P, Gore MA, Heun JT, Thorp KR, Carmo-Silva AE, French AN, Salvucci ME, White JW (2014) Development and evaluation of a field-based high-throughput phenotyping platform. Funct Plant Biol 41. https://doi.org/10.1071/FP13126
Aasen H, Honkavaara E, Lucieer A, Zarco-Tejada PJ (2018) Quantitative remote sensing at ultra-high resolution with UAV spectroscopy: a review of sensor technology, measurement procedures, and data correction workflows. Remote Sens 10:1091. https://doi.org/10.3390/rs10071091
Gracia-Romero A, Kefauver SC, Fernandez-Gallego JA, Vergara-Díaz O, Nieto-Taladriz MT, Araus JL (2019) UAV and ground image-based phenotyping: a proof of concept with durum wheat. Remote Sens 11:1244. https://doi.org/10.3390/rs11101244
Maes WH, Steppe K (2019) Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture. Trends Plant Sci 24:152–164. https://doi.org/10.1016/j.tplants.2018.11.007
Eskandari R, Mahdianpari M, Mohammadimanesh F, Salehi B, Brisco B, Homayouni S (2020) Meta-analysis of unmanned aerial vehicle (UAV) imagery for agro-environmental monitoring using machine learning and statistical models. Remote Sens 12:3511. https://doi.org/10.3390/rs12213511
Bolger M, Schwacke R, Gundlach H, Schmutzer T, Chen J, Arend D, Opperman M, Weise S, Lange M, Fiorani F, Spannagl M, Scholz U, Mayer K, Usadel B (2017) From plant genomes to phenotypes. J Biotechnol 261:46–52. https://doi.org/10.1016/j.jbiotec.2017.06.003
Brown TB, Cheng R, Sirault XRR, Rungrat T, Murray KD, Trtilek M, Furbank RT, Badger M, Pogson BJ, Borevitz JO (2014) TraitCapture: genomic and environment modelling of plant phenomic data. Curr Opin Plant Biol 18:73–79. https://doi.org/10.1016/j.pbi.2014.02.002
Furbank RT, Tester M (2011) Phenomics–technologies to relieve the phenotyping bottleneck. Trends Plant Sci 16:635–644. https://doi.org/10.1016/j.tplants.2011.09.005
Roth L, Hund A, Aasen H (2018) PhenoFly Planning Tool: flight planning for high-resolution optical remote sensing with unmanned aerial systems. Plant Methods 14:116. https://doi.org/10.1186/s13007-018-0376-6
Singh A, Ganapathysubramanian B, Singh AK, Sarkar S (2016) Machine learning for high-throughput stress phenotyping in plants. Trends Plant Sci 21:110–124. https://doi.org/10.1016/j.tplants.2015.10.015
Fernandez-Gallego JA, Lootens P, Borra-Serrano I, Derycke V, Haesaert G, Roldán-Ruiz I, Araus JL, Kefauver SC (2020) Automatic wheat ear counting using machine learning based on RGB UAV imagery. Plant J 103:1603–1613. https://doi.org/10.1111/tpj.14799
Jin X, Madec S, Dutartre D, de Solan B, Comar A, Baret F (2019) High-throughput measurements of stem characteristics to estimate ear density and above-ground biomass. Plant Phenomics 2019:1–10. https://doi.org/10.34133/2019/4820305
Fernandez-Gallego JA, Kefauver SC, Vatter T, Aparicio Gutierrez N, Nieto-Taladriz MT, Araus JL (2019) Low-cost assessment of grain yield in durum wheat using RGB images. Eur J Agron 105:146–156. https://doi.org/10.1016/j.eja.2019.02.007
Gracia-Romero A, Vergara-Diaz O, Thierfelder C, Cairns JE, Kefauver SC, Araus JL (2018) Phenotyping conservation agriculture management effects on ground and aerial remote sensing assessments of maize hybrids performance in Zimbabwe. Remote Sens 10:349. https://doi.org/10.3390/rs10020349
Madec S, Baret F, de Solan B, Thomas S, Dutartre D, Jezequel S, Hemmerlé M, Colombeau G, Comar A (2017) High-throughput phenotyping of plant height: comparing unmanned aerial vehicles and ground LiDAR estimates. Front Plant Sci 8:1–14. https://doi.org/10.3389/fpls.2017.02002
Garbulsky MF, Peñuelas J, Gamon J, Inoue Y, Filella I (2011) The photochemical reflectance index (PRI) and the remote sensing of leaf, canopy and ecosystem radiation use efficiencies. A review and meta-analysis. Remote Sens Environ 115:281–297. https://doi.org/10.1016/j.rse.2010.08.023
Vergara-Diaz O, Vatter T, Kefauver SC, Obata T, Fernie AR, Araus JL (2020) Assessing durum wheat ear and leaf metabolomes in the field through hyperspectral data. Plant J 102:615–630. https://doi.org/10.1111/tpj.14636
Ribes M, Russias G, Tregoat D, Fournier A (2020) Towards low-cost hyperspectral single-pixel imaging for plant phenotyping. Sensors 20:1132. https://doi.org/10.3390/s20041132
Verrelst J, Malenovský Z, Van der Tol C, Camps-Valls G, Gastellu-Etchegorry JP, Lewis P, North P, Moreno J (2019) Quantifying vegetation biophysical variables from imaging spectroscopy data: a review on retrieval methods. Surv Geophys 40:589–629. https://doi.org/10.1007/s10712-018-9478-y
Schaepman ME, Ustin SL, Plaza AJ, Painter TH, Verrelst J, Liang S (2009) Earth system science related imaging spectroscopy—an assessment. Remote Sens Environ 113:S123–S137. https://doi.org/10.1016/j.rse.2009.03.001
Roche D (2015) Stomatal conductance is essential for higher yield potential of C3 crops. Crit Rev Plant Sci 34:429–453. https://doi.org/10.1080/07352689.2015.1023677
Carmo-Silva E, Andralojc PJ, Scales JC, Driever SM, Mead A, Lawson T, Raines CA, Parry MAJ (2017) Phenotyping of field-grown wheat in the UK highlights contribution of light response of photosynthesis and flag leaf longevity to grain yield. J Exp Bot 68:3473–3486. https://doi.org/10.1093/jxb/erx169
Lopes MS, Reynolds MP (2012) Stay-green in spring wheat can be determined by spectral reflectance measurements (normalized difference vegetation index) independently from phenology. J Exp Bot 63:3789–3798. https://doi.org/10.1093/jxb/ers071
Molero G, Reynolds MP (2020) Spike photosynthesis measured at high throughput indicates genetic variation independent of flag leaf photosynthesis. Field Crop Res 255:107866. https://doi.org/10.1016/j.fcr.2020.107866
Sanchez-Bragado R, Vicente R, Molero G, Serret MD, Maydup ML, Araus JL (2020) New avenues for increasing yield and stability in C3 cereals: exploring ear photosynthesis. Curr Opin Plant Biol 56:223–234. https://doi.org/10.1016/j.pbi.2020.01.001
Rights and permissions
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Copyright information
© 2022 The Author(s)
Cite this chapter
Araus, J.L., Buchaillot, M.L., Kefauver, S.C. (2022). High Throughput Field Phenotyping. In: Reynolds, M.P., Braun, HJ. (eds) Wheat Improvement. Springer, Cham. https://doi.org/10.1007/978-3-030-90673-3_27
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-90672-6
Online ISBN: 978-3-030-90673-3