GRASPY – Object Manipulation with NAO

  • Judith Müller
  • Udo Frese
  • Thomas Röfer
  • Rodolphe Gelin
  • Alexandre Mazel
Conference paper
Part of the Springer Tracts in Advanced Robotics book series (STAR, volume 94)

Abstract

In this paper, we introduce an online object manipulation system for the NAO robot that is able to detect an object, grasp it out of a human hand, and give it back in real time. Known objects are rendered from 3D models and detected by stereo contour-based matching using a new stereo vision head for NAO. To grasp objects, motion trajectories are generated by an A* planner while avoiding obstacles. To safely release an object back into a human hand, a combination of the tactile and force sensors of the carrying arm is used to detect whether someone has touched the grasped object. We performed quantitative experiments to evaluate the quality of the detector, the time required to grasp an object, and the number of successful grasps, and we demonstrated the whole system on the real robot.
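The abstract states that grasp motion trajectories are generated by an A* planner while avoiding obstacles. As a rough illustration of that general idea only (not the authors' implementation; the grid discretization, cost terms, and obstacle model below are assumptions), a minimal A* search over a coarse 3D occupancy grid could be sketched in Python as follows:

```python
# Minimal A* sketch over a discretized 3D workspace grid.
# Illustrative only: the paper plans in the robot's configuration space with
# its own cost model; resolution and obstacle representation here are assumed.
import heapq


def a_star(start, goal, is_free, neighbors, heuristic):
    """Generic A* search; returns a list of cells from start to goal, or None."""
    open_set = [(heuristic(start, goal), start)]
    came_from = {start: None}
    g_cost = {start: 0.0}
    closed = set()
    while open_set:
        _, current = heapq.heappop(open_set)
        if current in closed:
            continue                      # stale heap entry, already expanded
        closed.add(current)
        if current == goal:               # reconstruct path by walking parents
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        for nxt, step_cost in neighbors(current):
            if not is_free(nxt) or nxt in closed:
                continue                  # skip occupied or already-expanded cells
            new_g = g_cost[current] + step_cost
            if new_g < g_cost.get(nxt, float("inf")):
                g_cost[nxt] = new_g
                came_from[nxt] = current
                heapq.heappush(open_set, (new_g + heuristic(nxt, goal), nxt))
    return None                           # no collision-free path found


# Example usage on a hypothetical 5x5x5 grid with a small obstacle wall.
def grid_neighbors(cell):
    x, y, z = cell
    for dx, dy, dz in [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                       (0, -1, 0), (0, 0, 1), (0, 0, -1)]:
        yield (x + dx, y + dy, z + dz), 1.0


obstacles = {(2, 0, 0), (2, 1, 0), (2, 2, 0)}        # assumed occupied cells
in_bounds = lambda c: all(0 <= v < 5 for v in c)
path = a_star((0, 0, 0), (4, 0, 0),
              is_free=lambda c: in_bounds(c) and c not in obstacles,
              neighbors=grid_neighbors,
              heuristic=lambda a, b: sum(abs(u - v) for u, v in zip(a, b)))
print(path)
```

The heuristic here is the Manhattan distance on the grid, which is admissible for unit step costs; a planner working in joint space would instead use a distance measure between arm configurations.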

Keywords

Stereo vision · Object detection · Online grasp motion planning



Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Judith Müller (1)
  • Udo Frese (2)
  • Thomas Röfer (2)
  • Rodolphe Gelin (3)
  • Alexandre Mazel (3)
  1. Transregional Collaborative Research Center for Spatial Cognition, SFB/TR 8, University of Bremen, Bremen, Germany
  2. Cyber Physical Systems, Deutsches Forschungszentrum für Künstliche Intelligenz, Bremen, Germany
  3. Aldebaran Robotics, Paris, France
