Abstract
This paper presents the gesture desk, a new platform for human-computer interaction at a regular computer workplace. It extends classical input devices such as keyboard and mouse with arm and hand gestures, without requiring inconvenient accessories like data gloves or markers. A central element is a “gesture box” containing two infrared cameras and one color camera, positioned under a glass desk. Arm and hand motions are tracked in three dimensions. A synchronizer board was developed to provide active, glare-free IR illumination for robust body and hand tracking. As a first application, we demonstrate interactive real-time browsing and querying of auditory self-organizing maps (AuSOMs). An AuSOM is a combined visual and auditory presentation of high-dimensional data sets. Moving the hand above the desk surface allows the user to select neurons on the map and to manipulate how they contribute to the data sonification. Each neuron is associated with a prototype vector in the high-dimensional space, so that a set of 2D-topologically ordered feature maps is queried simultaneously. The level of detail is selected by hand altitude above the table surface, allowing the user to emphasize or deemphasize neurons on the map.
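The interaction described above can be sketched in code: the tracked hand position selects a neighborhood of SOM neurons, and the hand's altitude controls how broad that neighborhood is. The following Python sketch is purely illustrative; the function name, grid dimensions, and the Gaussian weighting are assumptions of this sketch, not the authors' implementation.

```python
import numpy as np

def select_neurons(hand_xy, altitude, grid_shape=(10, 10),
                   desk_size=(1.0, 1.0), max_altitude=0.5):
    """Return a (rows, cols) array of per-neuron contribution weights.

    hand_xy      -- (x, y) hand position over the desk, in metres
    altitude     -- hand height above the desk surface, in metres
    grid_shape   -- number of SOM neurons per axis (assumed here)
    desk_size    -- physical desk extent mapped onto the SOM
    max_altitude -- height at which the selection is broadest
    """
    rows, cols = grid_shape
    # Map the physical hand position onto continuous grid coordinates.
    gx = hand_xy[0] / desk_size[0] * (cols - 1)
    gy = hand_xy[1] / desk_size[1] * (rows - 1)

    # Higher hand -> wider Gaussian -> more neurons contribute
    # (coarser level of detail); hand near the surface -> sharp focus.
    sigma = 0.5 + 4.0 * min(altitude, max_altitude) / max_altitude

    ys, xs = np.mgrid[0:rows, 0:cols]
    d2 = (xs - gx) ** 2 + (ys - gy) ** 2
    weights = np.exp(-d2 / (2.0 * sigma ** 2))
    return weights / weights.sum()  # normalize to unit total energy

# Hand over the desk centre, close to the surface: sharply peaked selection.
w_low = select_neurons(hand_xy=(0.5, 0.5), altitude=0.05)
# Same position, hand raised: flatter, broader selection of neurons.
w_high = select_neurons(hand_xy=(0.5, 0.5), altitude=0.5)
```

The normalized weights could then scale each selected neuron's contribution to the sonification, so that raising the hand blends more prototype vectors into the auditory display while lowering it isolates individual neurons.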
Keywords
- Joint Angle
- Exploratory Data Analysis
- Auditory Display
- Gestural Interaction
- Gestural Control
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Hermann, T., Henning, T., Ritter, H. (2004). Gesture Desk – An Integrated Multi-modal Gestural Workplace for Sonification. In: Camurri, A., Volpe, G. (eds) Gesture-Based Communication in Human-Computer Interaction. GW 2003. Lecture Notes in Computer Science, vol 2915. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-24598-8_34
DOI: https://doi.org/10.1007/978-3-540-24598-8_34
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-21072-6
Online ISBN: 978-3-540-24598-8