Combining Pointing Gestures with Video Avatars for Remote Collaboration

  • Seon-Min Rhee
  • Myoung-Hee Kim
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4555)


We present a simple and intuitive method of user interaction, based on pointing gestures, which can be used with video avatars in remote collaboration. By connecting the user's head and fingertip in 3D space, we identify the direction in which they are pointing. Stereo infrared cameras in front of the user, together with an overhead camera, locate the user's head and fingertip in a CAVE™-like system. The head position is taken to be the top of the user's silhouette, while the fingertip is located in 3D space in real time by matching its position in the overhead camera image against the stereo camera images. The user can interact with the first object that the pointing ray intersects. In our experiments, the result of the interaction is displayed together with the video avatar visible to a remote collaborator.
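The core interaction the abstract describes, casting a ray from the head through the fingertip and selecting the first object it hits, can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the function names are invented, and scene objects are approximated as bounding spheres for the intersection test.

```python
import math

# Small 3D vector helpers (tuples of floats).
def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def pointing_ray(head, fingertip):
    """Build the pointing ray: origin at the head, direction through the fingertip."""
    d = sub(fingertip, head)
    n = math.sqrt(dot(d, d))
    return head, tuple(x / n for x in d)

def first_hit(origin, direction, spheres):
    """Return the index of the nearest sphere the ray intersects, or None.

    Each sphere is (center, radius); objects are approximated as bounding
    spheres here purely for illustration.
    """
    best_t = best_i = None
    for i, (center, radius) in enumerate(spheres):
        oc = sub(origin, center)
        b = dot(oc, direction)
        c = dot(oc, oc) - radius * radius
        disc = b * b - c          # discriminant of the ray-sphere quadratic
        if disc < 0:
            continue              # ray misses this sphere
        t = -b - math.sqrt(disc)  # nearer intersection parameter
        if t < 0:
            t = -b + math.sqrt(disc)  # ray starts inside the sphere
        if t >= 0 and (best_t is None or t < best_t):
            best_t, best_i = t, i
    return best_i
```

For example, with the head at the origin and the fingertip directly ahead, the nearest object along the ray is selected even when a farther object also lies on it, matching the "first object which collides with the pointing ray" behaviour described above.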


Keywords: gesture interaction · immersive display · human-computer interaction





Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Seon-Min Rhee (1)
  • Myoung-Hee Kim (1, 2)
  1. Department of Computer Science and Engineering, Ewha Womans University, 11-1 Daehyun-dong, Seodaemun-gu, Seoul 120-750, Korea
  2. Center for Computer Graphics and Engineering (CCGVR), Ewha Womans University, 11-1 Daehyun-dong, Seodaemun-gu, Seoul 120-750, Korea
