A Methodology to Introduce Gesture-Based Interaction into Existing Consumer Product

  • Lorenzo Cavalieri
  • Maura Mengoni
  • Silvia Ceccacci
  • Michele Germani
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9732)

Abstract

The continuous progress of interaction technologies reveals that we are witnessing a revolution that is leading to a redefinition of the concept of "user interface" and to the development of new ways to interact with electronic devices of all sizes and capabilities. Current research trends in Human-Machine Interaction (HMI) show considerable interest in gesture-based, motion-based, and full-body interactions. In this context, a User-Centered Design (UCD) methodology to implement these novel interaction paradigms in consumer products is proposed, with the aim of improving their usability, intuitiveness, and user experience. A case study is used to validate the methodology and measure the achieved improvements in user performance.

Keywords

Gesture interaction · Design methods · User interfaces · User-Centered Design

Notes

Acknowledgements

This work was carried out in collaboration with Korg Italy SPA, which provided the actual product and contributed its skills and knowledge to support the analysis phase.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Lorenzo Cavalieri (1)
  • Maura Mengoni (1)
  • Silvia Ceccacci (1)
  • Michele Germani (1)

  1. Department of Industrial Engineering and Mathematical Sciences, Università Politecnica delle Marche, Ancona, Italy