A Gesture-Based System for Context-Sensitive Interaction with Smart Homes
This paper introduces a system for gesture-based interaction with smart environments. The presented framework maps gesture recognition results to control commands for appliances in a smart home, which are accessed through a middleware based on the ISO 24752 standard URC (Universal Remote Console). Gesture recognition relies on the three-dimensional acceleration data of the Nintendo WiiMote. This data is used to train a gesture recognition toolkit that implements machine learning algorithms well known from speech recognition. Our study focuses on two interaction concepts that aim to exploit context and specific home scenarios, reducing the number of required gestures while retaining a high level of control complexity. A user test, which also included older persons, compares both concepts and evaluates their efficiency based on response times and the subjective impressions of the test participants.
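The keyword "Dynamic Time Warp" suggests that the gesture classifier aligns recorded acceleration sequences against trained templates. As an illustration only (the paper's own toolkit, TaKG, is not publicly specified here), a minimal multi-dimensional DTW classifier over 3D acceleration samples could look like the following sketch; the function names `dtw_distance` and `classify` and the template dictionary are assumptions, not the authors' API:

```python
import math

def dtw_distance(seq_a, seq_b):
    """Multi-dimensional dynamic time warping distance between two
    sequences of 3-D acceleration samples given as (x, y, z) tuples."""
    n, m = len(seq_a), len(seq_b)
    # cost[i][j]: minimal accumulated cost of aligning seq_a[:i] with seq_b[:j]
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(seq_a[i - 1], seq_b[j - 1])  # Euclidean frame distance
            cost[i][j] = d + min(cost[i - 1][j],       # insertion
                                 cost[i][j - 1],       # deletion
                                 cost[i - 1][j - 1])   # match
    return cost[n][m]

def classify(sample, templates):
    """Return the label of the trained template closest to the sample
    under the DTW distance (nearest-template classification)."""
    return min(templates, key=lambda label: dtw_distance(sample, templates[label]))
```

A caller would record one template sequence per gesture, e.g. `templates = {"circle": [...], "shake": [...]}`, and then invoke `classify(recorded_sequence, templates)`; DTW's time alignment makes the match tolerant to gestures being performed faster or slower than the template.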
Keywords: Gesture Recognition, Hand Gesture, Dynamic Time Warp, Smart Home, Test Person