Haptic Interaction with Depth Video Media

  • Jongeun Cha
  • Seung-man Kim
  • Ian Oakley
  • Jeha Ryu
  • Kwan H. Lee
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3767)

Abstract

In this paper we propose a touch-enabled video player system. A conventional video player only allows viewers to passively experience visual and audio media. In virtual environments, touch or haptic interaction has been shown to convey a powerful illusion of the tangible nature – the reality – of the displayed environments, and we feel the same benefits may be conferred on the broadcast viewing domain. To this end, this paper describes a system that uses a video representation based on depth images to add a haptic component to an audio-visual stream. We generate this stream by combining a regular RGB image with a synchronized depth image composed of per-pixel depth-from-camera information. The depth video, a unified stream of the color and depth images, can be synthesized from a computer graphics animation by rendering with commercial packages, or captured from a real environment using an active depth camera such as the ZCam™. In order to provide a haptic representation of this data, we propose a modified proxy graph algorithm for depth video streams. The modified proxy graph algorithm can (i) detect collisions between a moving virtual proxy and time-varying video scenes, (ii) generate a smooth touch sensation by handling the radically different display update rates required by the visual (30 Hz) and haptic (on the order of 1000 Hz) systems, and (iii) avoid sudden changes in contact force. A sample experiment shows the effectiveness of the proposed system.
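As a rough illustration of the rendering scheme summarized above, the sketch below shows a minimal proxy-style haptic update against a depth-image surface, with per-pixel depth linearly interpolated between consecutive 30 Hz video frames so that the 1 kHz force loop never sees an abrupt jump. This is an assumption-laden sketch, not the authors' modified proxy graph algorithm; the type and function names (DepthFrame, hapticForce, the stiffness k) are illustrative only.

```cpp
// Minimal sketch: a haptic update that keeps a proxy point on a depth-image
// surface and renders a spring force toward it. Depth values are blended
// between the previous and current 30 Hz frames on every 1 ms haptic tick.
#include <algorithm>
#include <vector>

struct Vec3 { double x, y, z; };

struct DepthFrame {
    int width, height;
    std::vector<float> depth;                        // per-pixel depth from camera, metres
    float at(int u, int v) const { return depth[v * width + u]; }
};

// Depth of the displayed surface at pixel (u, v), interpolated between two
// consecutive video frames; t in [0,1] is the elapsed fraction of the ~33 ms
// frame interval, advanced each haptic tick.
float interpolatedDepth(const DepthFrame& prev, const DepthFrame& curr,
                        int u, int v, float t) {
    return (1.0f - t) * prev.at(u, v) + t * curr.at(u, v);
}

// One haptic update (called at ~1 kHz). The camera looks down +z, so the
// device contacts the scene when its z coordinate exceeds the surface depth.
Vec3 hapticForce(const DepthFrame& prev, const DepthFrame& curr, float t,
                 const Vec3& devicePos, Vec3& proxyPos,
                 double pixelsPerMetre, double k /* spring stiffness, N/m */) {
    int u = std::clamp(static_cast<int>(devicePos.x * pixelsPerMetre), 0, curr.width - 1);
    int v = std::clamp(static_cast<int>(devicePos.y * pixelsPerMetre), 0, curr.height - 1);
    double surfaceZ = interpolatedDepth(prev, curr, u, v, t);

    if (devicePos.z < surfaceZ) {
        // No contact: the proxy follows the device freely, force is zero.
        proxyPos = devicePos;
        return {0.0, 0.0, 0.0};
    }
    // Contact: constrain the proxy to the (time-varying) surface and push the
    // device back toward the viewer in proportion to the penetration depth.
    proxyPos = {devicePos.x, devicePos.y, surfaceZ};
    return {0.0, 0.0, -k * (devicePos.z - surfaceZ)};
}
```

In the paper's setting the proxy is instead governed by the full modified proxy graph; the simple spring model here only illustrates how interpolating the depth stream lets the haptic loop run at its own rate without sudden force changes.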

Keywords

Depth Image · Collision Detection · Virtual Object · Haptic Interface · Haptic Interaction



Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Jongeun Cha (1)
  • Seung-man Kim (2)
  • Ian Oakley (1)
  • Jeha Ryu (1)
  • Kwan H. Lee (2)
  1. Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, Gwangju, Republic of Korea
  2. Intelligent Design & Graphics Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, Gwangju, Republic of Korea
