Force Feedback Virtual Painting on Real Objects: A Paradigm of Augmented Reality Haptics

  • Benjamin Bayart
  • Jean-Yves Didier
  • Abderrahmane Kheddar
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5024)


This paper presents a paradigm of augmented reality haptics through an application that enables interaction with real objects using a virtual tool. A real haptic probe is used to interact within the real world, so that the user feels the interaction. Furthermore, through a visual partial reality removal process and a camera placed in the real scene, the real tool is hidden in the visual feedback and replaced by the virtual tool. Since the real and virtual probes do not necessarily match, a model of the virtual tool is used to adjust and tune the haptic feedback, while the virtual tool is visually rendered according to the forces measured by the haptic probe. Finally, in the proposed mixed painting application, the paint applied on the real object, i.e., when the user comes into contact with it, is visually displayed such that its shape is computed from the virtual tool geometry, while its size and intensity are derived from the real measured forces.
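The force-to-paint mapping described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the linear mapping law, and the constants (saturation force, maximum brush radius) are all assumptions; the paper derives the footprint from the virtual tool geometry, which is simplified here to a circular stamp.

```python
import math

# Hypothetical constants -- not specified in the paper.
MAX_FORCE_N = 10.0    # assumed saturation force of the haptic probe
MAX_RADIUS_PX = 20.0  # assumed maximum brush radius in pixels

def paint_params(force_n: float) -> tuple[float, float]:
    """Map a measured contact force (N) to brush radius (px) and opacity in [0, 1].

    A linear law is assumed here purely for illustration.
    """
    f = max(0.0, min(force_n, MAX_FORCE_N)) / MAX_FORCE_N
    return MAX_RADIUS_PX * f, f

def stamp(canvas: dict, cx: float, cy: float, force_n: float) -> None:
    """Accumulate one brush stamp onto a sparse pixel canvas.

    The circular footprint stands in for the virtual tool geometry;
    its size and intensity come from the measured force, as in the text.
    """
    radius, opacity = paint_params(force_n)
    r = int(math.ceil(radius))
    for x in range(int(cx) - r, int(cx) + r + 1):
        for y in range(int(cy) - r, int(cy) + r + 1):
            if math.hypot(x - cx, y - cy) <= radius:
                canvas[(x, y)] = min(1.0, canvas.get((x, y), 0.0) + opacity)
```

A zero measured force thus produces no paint, while the saturation force yields a full-size, fully opaque stamp.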


Augmented Reality Haptics · Visual Partial Reality Removal · Mixed Haptic Interaction · Mixed Painting Application




Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Benjamin Bayart¹
  • Jean-Yves Didier¹
  • Abderrahmane Kheddar¹

  1. IBISC Laboratory, Évry, France; Centre National de la Recherche Scientifique (CNRS), Paris, France
