Abstract
Mobile devices have become an integral part of the twenty-first-century lifestyle. From social networking and business to day-to-day scheduling and multimedia applications, smartphones and other portable handsets are now the go-to devices for interaction in the digital world. Mobile devices currently rely on direct user interfaces, such as touch screens or keyboards, where interactions are performed by manipulating graphical elements or controls on the interface. This project aims to bring device interaction out of the virtual world and into the physical world. By augmenting existing mobile technologies with custom electronic hardware, it is possible to create a system that incorporates free gestures within a portable context. With this approach, portable applications can break away from the virtual world, enabling the mobile platform to be harnessed as a physical augmented interface. This concept can be exploited in a wide range of contexts, including musical performance, games, learning and teaching, and beyond.
This chapter is an updated and extended version of the following paper, published here with kind permission of the Chartered Institute for IT (BCS) and of EVA London Conferences: M. Benatan, I. Symonds and K.Ng, “Mobile motion: multimodal device augmentation for musical applications.” In S. Dunn, J. P. Bowen, and K. Ng (eds.). EVA London 2011 Conference Proceedings. Electronic Workshops in Computing (eWiC), British Computer Society, 2011. http://www.bcs.org/ewic/eva2011 (accessed 26 May 2013).
Copyright information
© 2013 Springer-Verlag London
About this chapter
Cite this chapter
Benatan, M., Ng, K. (2013). Mobile Motion: Multimodal Device Augmentation for Musical Applications. In: Bowen, J., Keene, S., Ng, K. (eds) Electronic Visualisation in Arts and Culture. Springer Series on Cultural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-5406-8_13
Publisher Name: Springer, London
Print ISBN: 978-1-4471-5405-1
Online ISBN: 978-1-4471-5406-8
eBook Packages: Computer Science, Computer Science (R0)