Accurate Pouring with an Autonomous Robot Using an RGB-D Camera

  • Chau Do
  • Wolfram Burgard
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 867)

Abstract

Robotic assistants in a home environment are expected to perform various complex tasks for their users. One particularly challenging task is pouring drinks into cups: to succeed, the robot must detect and track the liquid level during the pour to determine when to stop. In this paper, we present a novel approach to autonomous pouring that tracks the liquid level using an RGB-D camera and adapts the pouring rate based on liquid-level feedback. We thoroughly evaluate our system on various types of liquids and under different conditions, conducting over 250 pours with a PR2 robot. The results demonstrate that our approach pours liquids to a target height with an accuracy of a few millimeters.
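
The core idea is a closed feedback loop: the RGB-D camera provides a liquid-level estimate, and the pour rate is reduced as the level approaches the target so the flow can be cut off within a few millimeters. Below is a minimal sketch of such a loop, assuming a hypothetical `read_liquid_level_mm` estimator and a crudely simulated container; neither is from the paper.

```python
# A minimal sketch (not the authors' implementation) of the feedback
# loop described in the abstract: the pour rate shrinks as the measured
# liquid level approaches the target, and pouring stops within a small
# tolerance. `read_liquid_level_mm` is a hypothetical stand-in for the
# RGB-D liquid-level estimator; the container dynamics are simulated.

def read_liquid_level_mm(state):
    """Hypothetical stand-in for the RGB-D liquid-level estimate."""
    return state["level_mm"]

def pour_to_target(target_mm, state, k_p=0.02, tol_mm=2.0, dt=0.1):
    """Proportional tilt-rate control: slow the pour near the target."""
    while True:
        error = target_mm - read_liquid_level_mm(state)
        if error <= tol_mm:           # close enough to the target: stop
            return state["level_mm"]
        tilt_rate = k_p * error       # shrinks as the cup fills
        # Simulated plant: inflow roughly proportional to tilt rate.
        state["level_mm"] += 50.0 * tilt_rate * dt

if __name__ == "__main__":
    final = pour_to_target(target_mm=60.0, state={"level_mm": 0.0})
    print(f"stopped at {final:.1f} mm")
```

The gain, tolerance, and simulated inflow model are illustrative placeholders; on a real robot the tilt rate would drive the wrist joint and the level estimate would come from the perception pipeline.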

Keywords

Liquid perception · Robot pouring · Household robotics

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. University of Freiburg, Autonomous Intelligent Systems, Freiburg, Germany
