Real-time arm motion imitation for human–robot tangible interface

  • Original Research Paper
  • Intelligent Service Robotics

Abstract

In this paper we present a remote meeting system with a tangible interface, in which a robot serves as a tangible avatar standing in for the remote meeting partner. A critical issue in realizing such a system is how naturally and accurately the robot imitates human motions. We therefore propose a new method in which human arm motion is captured with a stereo vision system and transferred to the robotic avatar in real time. For markerless capture of 3D arm motion, we propose a metaball-based method designed for robustness and efficiency: a modified iso-surface equation of the metaball model to overcome local minima, and a downsizing method for the 3D point cloud to reduce time complexity. Implemented within our meeting system, the algorithm runs at approximately 12–16 Hz, and its motion-capture accuracy is acceptable for robot motion generation.
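The abstract does not reproduce the modified iso-surface equation or the specific downsizing scheme, so the Python sketch below only illustrates the two generic ingredients it names: a summed metaball (implicit-surface) field over primitives approximating the arm, evaluated against an iso-level, and a voxel-grid reduction of the stereo point cloud to keep per-frame fitting cheap. The Gaussian kernel, the iso-level, the voxel size, and all function names are assumptions for illustration, not the authors' formulation.

import numpy as np

ISO_LEVEL = 0.5      # assumed iso-surface threshold, not taken from the paper
VOXEL_SIZE = 0.02    # assumed 2 cm voxel size for downsizing the cloud

def metaball_field(points, centers, radii):
    # Summed Gaussian metaball field evaluated at `points` (N,3) for metaball
    # `centers` (M,3) with `radii` (M,). The arm is approximated by a chain of
    # metaballs; a point is "inside" where the summed field exceeds ISO_LEVEL.
    d2 = np.sum((points[:, None, :] - centers[None, :, :]) ** 2, axis=2)  # (N, M)
    return np.sum(np.exp(-d2 / radii[None, :] ** 2), axis=1)              # (N,)

def voxel_downsample(cloud, voxel=VOXEL_SIZE):
    # Keep one centroid per occupied voxel to shrink the stereo point cloud
    # before fitting; this stands in for the paper's downsizing step.
    keys = np.floor(cloud / voxel).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    sums = np.zeros((inverse.max() + 1, 3))
    counts = np.zeros(inverse.max() + 1)
    np.add.at(sums, inverse, cloud)
    np.add.at(counts, inverse, 1.0)
    return sums / counts[:, None]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cloud = rng.normal(size=(5000, 3)) * 0.1                  # stand-in for stereo points
    small = voxel_downsample(cloud)                           # downsized cloud
    centers = np.array([[0.0, 0.0, 0.0], [0.15, 0.0, 0.0]])   # two-ball "arm" segment
    radii = np.array([0.08, 0.08])
    inside = metaball_field(small, centers, radii) > ISO_LEVEL
    print(f"{len(cloud)} -> {len(small)} points, {inside.sum()} inside the iso-surface")

In a real pipeline the metaball centers and radii would track the arm's kinematic model and be re-fitted to the downsized cloud each frame; the constants above are placeholders chosen only to make the sketch runnable.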

Author information

Corresponding author

Correspondence to Sung-Kee Park.

About this article

Cite this article

Choi, Y., Ra, S., Kim, S. et al. Real-time arm motion imitation for human–robot tangible interface. Intel Serv Robotics 2, 61–69 (2009). https://doi.org/10.1007/s11370-009-0037-8
