
Method of Visually Tracking Plant Main-Stem for Tomato’s Robotic Management

  • Conference paper
Advances in Guidance, Navigation and Control

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 644)

Abstract

To meet the need for acquiring visual information on tomato plants for intelligent greenhouse management, this paper studies a method of tracking the plant's main stem based on visual servo technology, intended to improve the efficiency of searching for targets such as leaves, fruits and flowers. For factory-planted tomatoes in the greenhouse, a binocular pan-tilt vision unit was designed, and a servo control method for tracking the main stem was proposed, with which bottom-up multi-view images were captured. By matching the overlapped areas of adjacent images, the discrete images of the main stem were spliced so that the plant morphology could be recovered. Finally, the method was tested in a greenhouse, and the results showed that the average splicing deviation of the main stem was 3.77° over the height range from 600 to 1500 mm above the ground. The results are intended to provide technical support for developing robots for tomato pruning, harvesting and pollination.
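The splicing step, in which the overlapped area of adjacent bottom-up views is matched so that the discrete images of the main stem can be merged, can be illustrated with a short sketch. The following Python/OpenCV fragment is only an assumption about how such pairwise splicing might be done (ORB feature matching plus a RANSAC homography); it is not the authors' implementation, and the file names, feature counts and thresholds are placeholders.

```python
# A minimal, illustrative sketch (assumption, not the authors' implementation):
# splicing two adjacent, overlapping bottom-up views of a main stem by matching
# features in the shared region and warping one view onto the other.
import cv2
import numpy as np


def splice_adjacent_views(img_lower, img_upper, min_matches=10):
    """Warp the upper view into the lower view's frame on an enlarged canvas,
    using ORB feature matches in the overlapped area and a RANSAC homography."""
    gray_lower = cv2.cvtColor(img_lower, cv2.COLOR_BGR2GRAY)
    gray_upper = cv2.cvtColor(img_upper, cv2.COLOR_BGR2GRAY)

    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(gray_lower, None)
    kp2, des2 = orb.detectAndCompute(gray_upper, None)

    # Brute-force Hamming matching with cross-check suits ORB's binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < min_matches:
        raise RuntimeError("insufficient overlap between adjacent views")

    pts_lower = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    pts_upper = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Homography mapping upper-view pixels into lower-view coordinates.
    H, _ = cv2.findHomography(pts_upper, pts_lower, cv2.RANSAC, 5.0)

    h, w = img_lower.shape[:2]
    # Shift everything down by one image height so the upper view, which lies
    # above the lower view, still lands inside the canvas after warping.
    shift = np.array([[1, 0, 0], [0, 1, h], [0, 0, 1]], dtype=np.float64)
    canvas = cv2.warpPerspective(img_upper, shift @ H, (w, 2 * h))
    canvas[h:2 * h, 0:w] = img_lower  # paste the reference (lower) view
    return canvas


if __name__ == "__main__":
    # "view_lower.jpg" / "view_upper.jpg" are placeholder file names.
    lower = cv2.imread("view_lower.jpg")
    upper = cv2.imread("view_upper.jpg")
    cv2.imwrite("stem_mosaic.jpg", splice_adjacent_views(lower, upper))
```

In the setting described in the abstract, the pan-tilt servo would step the camera upward along the stem, and each newly captured view would be spliced onto the growing mosaic in the same pairwise fashion.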

Acknowledgements

This work was financially supported by the National Key Research and Development Plan (2019YFE0125200) and the BAAFS Innovation Capacity Building Project (KJCX20210414).

Author information

Corresponding author

Correspondence to Qingchun Feng.

Copyright information

© 2022 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Feng, Q., Wang, X., Chen, J. (2022). Method of Visually Tracking Plant Main-Stem for Tomato’s Robotic Management. In: Yan, L., Duan, H., Yu, X. (eds) Advances in Guidance, Navigation and Control. Lecture Notes in Electrical Engineering, vol 644. Springer, Singapore. https://doi.org/10.1007/978-981-15-8155-7_207
