
Touch detection for planar interactive displays based on lateral depth views


Abstract

This work addresses the detection and localization of fingertip contact on planar surfaces, for the purpose of providing interactivity in augmented, interactive displays implemented upon such surfaces. The proposed approach differs from the widely employed approach in which user hands are observed from above: here, user hands are imaged laterally. An algorithmic approach for the treatment of the corresponding visual input is proposed, extensively evaluated, and compared against the top-view approach. Advantages of the proposed approach include increased sensitivity, localization accuracy, and scalability, as well as practicality and cost efficiency of installation.
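To make the core idea concrete, the following Python sketch illustrates the generic plane-proximity test on which depth-based touch detection commonly rests: back-project each depth pixel to a 3D point and threshold its distance to the calibrated display plane. This is a minimal sketch under stated assumptions, not the authors' algorithm; the function name touch_points, the pinhole intrinsics, and the millimeter thresholds are illustrative, and the plane parameters are assumed to have been calibrated beforehand.

    import numpy as np

    # Illustrative only: the function name, intrinsics, and thresholds are
    # assumptions, not taken from the paper; the display plane is assumed
    # to have been calibrated beforehand in camera coordinates.
    def touch_points(depth_m, fx, fy, cx, cy, plane,
                     touch_mm=10.0, hover_mm=40.0):
        """Classify depth pixels as touching or hovering near a plane.

        depth_m        -- HxW depth image in meters (0 marks invalid pixels)
        fx, fy, cx, cy -- pinhole intrinsics of the depth camera
        plane          -- (a, b, c, d) with unit normal (a, b, c); the surface
                          satisfies a*x + b*y + c*z + d = 0 in camera coords
        Returns two N x 2 arrays of (row, col) pixel coordinates:
        contacts within touch_mm of the plane, and hovering points
        within hover_mm but not touching.
        """
        h, w = depth_m.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth_m
        valid = z > 0
        # Back-project every valid pixel to a 3D point in camera coordinates.
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        # Signed point-to-plane distance, converted to millimeters.
        a, b, c, d = plane
        dist_mm = 1000.0 * (a * x + b * y + c * z + d)
        # Points hovering near the plane are candidates; the closest touch.
        near = valid & (np.abs(dist_mm) < hover_mm)
        touching = near & (np.abs(dist_mm) < touch_mm)
        return (np.column_stack(np.nonzero(touching)),
                np.column_stack(np.nonzero(near & ~touching)))

In practice, the flagged pixels would be clustered into fingertip blobs and tracked over time, with the thresholds tuned to sensor noise. Intuitively, this is where the viewing direction matters: a lateral camera observes the fingertip-to-surface gap roughly across its viewing rays, at image resolution, whereas a top-view camera must resolve it along them, through depth noise, which is consistent with the increased sensitivity reported here.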




Acknowledgments

This work has been supported by the FORTH-ICS internal RTD Programme “Ambient Intelligence and Smart Environments”.

Author information

Correspondence to Antonios Ntelidakis.

Electronic supplementary material

Below are the links to the electronic supplementary material.

(MP4 51.6 MB)

(MP4 3.15 MB)

(MP4 16.4 MB)


About this article


Cite this article

Ntelidakis, A., Zabulis, X., Grammenos, D. et al. Touch detection for planar interactive displays based on lateral depth views. Multimed Tools Appl 76, 12683–12707 (2017). https://doi.org/10.1007/s11042-016-3695-5


Keywords

  • Human-computer interaction
  • Spatial augmented reality
  • Interactive surface
  • Touch detection
  • Depth camera