Interacting with Augmented Floor Surfaces

  • Yon Visell
  • Severin Smith
  • Jeremy R. Cooperstock


This chapter reviews techniques and technologies for interaction via the feet with touch-sensitive floor surfaces augmented with multimodal (visual, auditory, and/or haptic) feedback. We discuss aspects of human-computer interaction with such interfaces, including potential applications in virtual and augmented reality, floor-based user interfaces, and immersive walking simulations. Several realizations of augmented floor surfaces are surveyed, and we examine in detail one case example that has been extensively investigated by the authors, together with evaluations reported in the prior literature.


Augmented reality · Virtual reality · Human-computer interaction · Locomotion · Touch screen displays



Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Yon Visell (1)
  • Severin Smith (2)
  • Jeremy R. Cooperstock (2)

  1. Department of Electrical Engineering and Computer Engineering, Drexel University, Philadelphia, USA
  2. McGill University, Montreal, Canada
