Vision-Based Projected Tabletop Interface for Finger Interactions

  • Peng Song
  • Stefan Winkler
  • Syed Omer Gilani
  • ZhiYing Zhou
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4796)


We designed and implemented a vision-based projected tabletop interface for finger interaction. The system offers a simple, quick setup and an economical design. Projecting onto the tabletop provides more comfortable and direct viewing for users, and more natural, intuitive yet flexible interaction than classical or tangible interfaces. Homography calibration techniques are used to provide geometrically compensated projections on the tabletop. A robust finger tracking algorithm is proposed to enable accurate and efficient interaction using this interface. Two applications have been implemented on top of this interface.
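The geometric compensation mentioned above rests on estimating a planar homography between the projector's image plane and the tabletop as seen by the camera. A minimal sketch of that estimation step, using the standard direct linear transform (DLT) rather than the authors' specific implementation, is shown below; the corner coordinates are made up for illustration.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst via the DLT method.

    src, dst: (N, 2) arrays of corresponding points, N >= 4.
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(A, dtype=float)
    # H (flattened) is the null vector of A: the smallest singular vector.
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2, 2] == 1

def apply_homography(H, pts):
    """Map (N, 2) points through H using homogeneous coordinates."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Projector corner pixels and where the camera observes them on the table
# (illustrative values only).
proj_corners = np.array([[0, 0], [1024, 0], [1024, 768], [0, 768]], float)
cam_corners = np.array([[30, 40], [990, 20], [1010, 750], [10, 730]], float)

H = estimate_homography(proj_corners, cam_corners)
```

Pre-warping the projected content with the inverse of `H` then cancels the keystone distortion, so the image appears rectangular on the tabletop.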


Keywords: projector-camera systems, projector-based display, interface design, tabletop interface, homography calibration, finger tracking, augmented reality, bare-hand interface





Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Peng Song (1)
  • Stefan Winkler (1)
  • Syed Omer Gilani (1)
  • ZhiYing Zhou (1)

  1. Interactive Multimedia Lab, Department of Electrical and Computer Engineering, National University of Singapore, 117576, Singapore
