Abstract
This paper describes a visual interface that recognizes a person’s command request by inferring, from head movements, the intention to travel in a desired direction at a certain speed. A head rotation indicates the intent to change direction, and a vertical head motion the intent to change speed. The intended context for this solution is that of wheelchair-bound individuals. This paper describes work in progress that provides a proof of concept tested on static images. Results show that the symmetry property of the head can be used to detect a change in its position and can therefore serve as a visual indicator of intent. The solution described in this paper, which focuses on the specific task of head pose estimation, is intended to contribute to the realisation of an enabled environment that allows people with severe disabilities and the elderly to be more independent and active in society.
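The symmetry cue mentioned above can be illustrated with a minimal sketch (not the authors’ algorithm): a frontal head image is roughly left–right symmetric, so comparing the left half against the mirrored right half yields a score that drops when the head rotates. The function name, the scoring formula, and the synthetic blob images below are assumptions introduced purely for demonstration.

```python
import numpy as np

def symmetry_score(img):
    """Higher score = more left-right symmetric (suggesting a frontal pose).

    Computes the negative mean absolute difference between the left half
    of the image and the horizontally mirrored right half.
    """
    h, w = img.shape
    half = w // 2
    left = img[:, :half]
    right_mirrored = img[:, w - half:][:, ::-1]  # flip right half horizontally
    return -float(np.mean(np.abs(left - right_mirrored)))

# Synthetic stand-ins for head images: a centered bright blob ("frontal")
# and a laterally shifted blob (as if the head had rotated).
y, x = np.mgrid[0:64, 0:64]
frontal = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 200.0)
rotated = np.exp(-((x - 44) ** 2 + (y - 32) ** 2) / 200.0)

print(symmetry_score(frontal) > symmetry_score(rotated))  # frontal scores higher
```

In a real system this score would be computed on a detected and cropped head region, and the rotation direction would be inferred from where the asymmetry concentrates rather than from a single scalar.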
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Luhandjula, T., Monacelli, E., Hamam, Y., van Wyk, B.J., Williams, Q. (2009). Visual Intention Detection for Wheelchair Motion. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2009. Lecture Notes in Computer Science, vol 5876. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-10520-3_38
DOI: https://doi.org/10.1007/978-3-642-10520-3_38
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-10519-7
Online ISBN: 978-3-642-10520-3