Defining Gestural Interactions for Large Vertical Touch Displays

  • Robin Andersson
  • Jonas Berglund
  • Aykut Coşkun
  • Morten Fjeld
  • Mohammad Obaid
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10513)

Abstract

As new technologies emerge, so do new ways of interacting with the digital domain. In this paper, the touch interaction paradigm is challenged for use on large touch displays, 65 in. in size. We present a gesture elicitation study with 26 participants, carried out on twelve actions commonly used on touch displays. The results and analysis of 312 touch gestures revealed agreement rates for each action. We report several findings, including a set of ten unique (and a few secondary) gestures, a taxonomy classifying the defined gestures, a pilot study of the defined gestures, and explicit design implications. We discuss the results and highlight several important factors for future consideration. We aim to help future designers and engineers design interactions for large touch displays.
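For context, the agreement rates mentioned above are the standard elicitation-study measure of consensus among participants' gesture proposals for an action (referent). The following is a minimal sketch in Python, assuming the formulation of Vatavu and Wobbrock (CHI 2015) commonly used in such studies; the gesture labels and counts in the example are illustrative, not the paper's data:

    from collections import Counter

    def agreement_rate(proposals):
        # Agreement rate AR(r) for one referent (Vatavu & Wobbrock, CHI 2015):
        # AR(r) = sum over groups of identical proposals P_i of
        #         |P_i| * (|P_i| - 1) / (|P| * (|P| - 1))
        n = len(proposals)
        if n < 2:
            return 0.0  # undefined below two proposals; treat as no agreement
        groups = Counter(proposals)
        return sum(k * (k - 1) for k in groups.values()) / (n * (n - 1))

    # Illustrative only: 26 participants propose gestures for a "zoom in" action.
    zoom_in = ["pinch-out"] * 18 + ["double-tap"] * 5 + ["two-hand spread"] * 3
    print(f"AR(zoom in) = {agreement_rate(zoom_in):.3f}")  # 332 / 650 -> 0.511

A rate near 1.0 means nearly all participants proposed the same gesture for the action; values near 0 indicate little consensus.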

Keywords

Large touch display · User-defined · Gestural interaction

Notes

Acknowledgements

The authors would like to thank Oğuzhan Özcan, Asım Evren Yantac, Ayça Adviye Ünlüer and Doğa Çorlu for assisting in the user study. We thank Arçelik’s support in providing us with equipment to run the study. In addition, we thank Yolean for supplying the pilot study equipment and development expertise. Finally, we thank our participants both in Turkey and Sweden for their time and input.

Copyright information

© IFIP International Federation for Information Processing 2017

Authors and Affiliations

  • Robin Andersson (1)
  • Jonas Berglund (1)
  • Aykut Coşkun (2)
  • Morten Fjeld (1)
  • Mohammad Obaid (3)

  1. Computer Science and Engineering, Chalmers University of Technology, Gothenburg, Sweden
  2. KUAR, Media and Visual Arts Department, Koç University, Istanbul, Turkey
  3. Department of Information Technology, Uppsala University, Uppsala, Sweden