
Registration of multispectral 3D points for plant inspection

Published in: Precision Agriculture

Abstract

Machine vision technologies have shown advantages for efficient and accurate plant inspection in precision agriculture. Balancing inspection accuracy against compactness for infield applications, multispectral imaging systems are more suitable than RGB colour cameras or hyperspectral imaging systems. Multispectral image registration (MIR) is a key issue for multispectral imaging systems, but the task is challenging. First, in many cases the two images to be registered have no one-to-one linear mapping in 2D space and therefore cannot be aligned as 2D images. Furthermore, general MIR algorithms are limited to images with uniform intensity and are incapable of registering images with rich features. This study developed a machine vision system (MVS) and an MIR method that replaces 2D-2D image registration with 3D-3D point cloud registration. The system can register 3D point clouds of the ultraviolet (UV), blue, green, red and near-infrared (NIR) spectra in 3D space. It was found that the point clouds of general plants created from images of different spectral bands are complementary, and therefore a combined point cloud, called a multispectral 3D point cloud, is denser than any cloud created from a single spectral band. Intensity information for each spectral band is available in a multispectral 3D point cloud, so image fusion and 3D morphological analysis can be conducted in the cloud. The MVS could serve as a sensor on a robotic system to fulfil on-the-go infield plant inspection tasks.
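The 3D-3D registration the abstract describes can, in its simplest rigid form, be illustrated with the closed-form Kabsch/Procrustes solution. The sketch below is an assumption-laden illustration, not the authors' actual pipeline: `rigid_align` is a hypothetical helper that presumes known one-to-one point correspondences between two per-band clouds already extracted (e.g. a NIR cloud and a green cloud), and it ignores outlier rejection, scale and the structure-from-motion stage that would produce the clouds in practice.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst.

    src, dst: (N, 3) arrays of corresponding 3D points.
    Returns R (3x3 rotation) and t (3,) such that dst ~= src @ R.T + t.
    """
    c_src = src.mean(axis=0)          # centroids of each cloud
    c_dst = dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])        # guard against a reflection
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t
```

Once every band's cloud is aligned into a common frame this way, the aligned clouds can simply be concatenated, with each point retaining its band label and intensity, giving the denser combined "multispectral 3D point cloud" the abstract refers to.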



Acknowledgements

This research was supported by Project Tyche, the trusted autonomy initiative of the Defence Science and Technology Group, Australian Department of Defence.

Author information

Corresponding author

Correspondence to Huajian Liu.


Cite this article

Liu, H., Lee, SH. & Chahl, J.S. Registration of multispectral 3D points for plant inspection. Precision Agric 19, 513–536 (2018). https://doi.org/10.1007/s11119-017-9536-3
