Multimedia Systems, Volume 18, Issue 1, pp 15–31

User-centred process for the definition of free-hand gestures applied to controlling music playback

  • Andreas Löcken
  • Tobias Hesselmann
  • Martin Pielot
  • Niels Henze
  • Susanne Boll
Regular Paper

Abstract

Music is a fundamental part of most cultures. Controlling music playback has commonly been used to demonstrate new interaction techniques and algorithms, and in particular to demonstrate and evaluate gesture recognition algorithms. Previous work, however, used gestures defined on the basis of intuition, the developers’ preferences, and the capabilities of the respective algorithm. In this paper we propose a refined process for deriving gestures from continuous user feedback. With this process, every result and design decision is validated in the subsequent step, so comprehensive feedback can be collected from each of the conducted user studies. Following the process, we develop a set of free-hand gestures for controlling music playback. The situational context is analysed to shape the usage scenario and derive an initial set of necessary functions. In a subsequent user study, this set of functions is validated and gesture proposals are collected from participants for each function. Two gesture sets, containing static and dynamic gestures, are derived and analysed in a comparative evaluation. The comparative evaluation shows the suitability of the identified gestures and allows further refinement. Our results indicate that the proposed process, which validates each design decision, improves the final results. By using the process to identify gestures for controlling music playback, we not only show that the refined process can be applied successfully, but also provide a consistent gesture set that can serve as a realistic benchmark for gesture recognition algorithms.
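As a minimal illustration of the elicitation step summarised above, the sketch below tallies gesture proposals per playback function and keeps the most frequently proposed candidate for each. The example data, function names, and the simple majority rule are hypothetical assumptions for illustration only, not the study material or selection procedure used in the paper.

```python
from collections import Counter, defaultdict

# Hypothetical elicitation data: one proposed gesture label per
# participant for each playback function. Labels and functions are
# illustrative, not taken from the paper's study.
proposals = [
    ("play/pause", "push palm forward"),
    ("play/pause", "push palm forward"),
    ("play/pause", "clap"),
    ("next track", "swipe right"),
    ("next track", "swipe right"),
    ("next track", "point right"),
    ("volume up", "raise flat hand"),
    ("volume up", "raise flat hand"),
    ("volume up", "thumb up"),
]

# Tally how often each gesture was proposed for each function.
tally = defaultdict(Counter)
for function, gesture in proposals:
    tally[function][gesture] += 1

# For every function, keep the most frequently proposed gesture,
# i.e. the candidate with the highest participant consensus.
gesture_set = {
    function: counts.most_common(1)[0][0]
    for function, counts in tally.items()
}

for function, gesture in gesture_set.items():
    print(f"{function}: {gesture}")
```

In practice, a tie-break rule and a measure of agreement across participants would be needed before fixing the final set; the sketch only shows the basic tallying idea.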

Keywords

Music · Gestures · Camera · Gesture recognition · CD · Process · User centred


Copyright information

© Springer-Verlag 2011

Authors and Affiliations

  • Andreas Löcken (1)
  • Tobias Hesselmann (2)
  • Martin Pielot (2)
  • Niels Henze (1)
  • Susanne Boll (1)

  1. University of Oldenburg, Oldenburg, Germany
  2. OFFIS, Institute for Information Technology, Oldenburg, Germany
