
Proof-of-concept robot platform for exploring automated harvesting of sugar snap peas

Published in: Precision Agriculture

Abstract

Currently, sugar snap peas are harvested manually. In high-cost countries like Norway, such a labour-intensive practice entails particularly high costs for the farmer. Hence, automated alternatives are highly sought after. This project explored a concept for autonomous robotic identification and tracking of sugar snap pea pods. The approach was based on a combination of visible–near infrared reflection measurements and image analysis, along with visual servoing. A proof-of-concept harvesting platform was implemented by mounting a robotic arm with hand-mounted sensors on a mobile unit. The platform was tested under plastic greenhouse conditions on potted plants of the sugar snap pea variety Cascadia, using LED lights and partial shading. The results showed that it was feasible to differentiate the pods from the surrounding foliage using the light reflection in the spectral range around 970 nm combined with elementary image segmentation and shape modelling methods. The proof-of-concept harvesting platform was tested on 48 representative agricultural environments comprising dense canopy, varying pod sizes, partial occlusions and different working distances. A set of 104 images was analysed during the teleoperation experiment. The true positive detection rates were 93% and 87% for images acquired at long and at close distances, respectively. The robot arm achieved a success rate of 54% for autonomous visual servoing to a pre-grasp pose around targeted pods in 22 untouched scenarios. This study shows the potential of developing a prototype robot for semi-automated sugar snap pea harvesting.
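As a rough illustration only (not the authors' implementation), the core idea of the abstract — thresholding a single-band image around 970 nm, where pods reflect differently from foliage, and then keeping only pod-shaped regions — could be sketched as follows. The threshold, minimum-area and elongation values are invented for the example.

```python
import numpy as np

def segment_pods(nir_970, intensity_thresh=0.6, min_area=50, min_elongation=2.0):
    """Threshold a normalised single-band NIR image (values in [0, 1]) and
    keep connected regions elongated like pods.

    All parameter values here are illustrative guesses, not the paper's
    actual settings."""
    # Binarise: pods are assumed to reflect more strongly around 970 nm
    # than the surrounding foliage.
    mask = nir_970 > intensity_thresh
    labels = np.zeros(mask.shape, dtype=int)
    next_label = 0
    pods = []
    # Simple 4-connected flood fill to extract connected bright regions.
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        next_label += 1
        labels[seed] = next_label
        stack, pixels = [seed], []
        while stack:
            r, c = stack.pop()
            pixels.append((r, c))
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < mask.shape[0] and 0 <= nc < mask.shape[1]
                        and mask[nr, nc] and not labels[nr, nc]):
                    labels[nr, nc] = next_label
                    stack.append((nr, nc))
        # Elementary shape model: discard small blobs, keep elongated ones.
        if len(pixels) < min_area:
            continue
        rows = np.array([p[0] for p in pixels])
        cols = np.array([p[1] for p in pixels])
        h = int(rows.max() - rows.min() + 1)
        w = int(cols.max() - cols.min() + 1)
        if max(h, w) / min(h, w) >= min_elongation:
            # Record centroid and bounding-box size of each candidate pod.
            pods.append((float(rows.mean()), float(cols.mean()), h, w))
    return mask, pods
```

A real pipeline would use calibrated reflectance and a proper shape model; this sketch only shows how a bright-band threshold plus an elongation filter can separate pod-like regions from compact foliage highlights.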





Acknowledgements

The authors would like to acknowledge Unni Myrheim Roos and Torkel Gaardløs at NIBIO Apelsvoll for their skilled and helpful technical assistance during the development and testing in this study. Thanks also to Morten F. Johansen at Torbjørnrød Farm for providing feedback during the design process.

Author information

Correspondence to V. F. Tejada.


Cite this article

Tejada, V.F., Stoelen, M.F., Kusnierek, K. et al. Proof-of-concept robot platform for exploring automated harvesting of sugar snap peas. Precision Agric 18, 952–972 (2017). https://doi.org/10.1007/s11119-017-9538-1
