A Visual Hand Motion Detection Algorithm for Wheelchair Motion
This paper describes an algorithm for a visual human-machine interface that infers a person's intention from the motion of the hand. The intended context is that of wheelchair-bound individuals, whose intentions of interest are the direction and speed variation of the wheelchair, indicated by a video sequence of the hand in rotation and in vertical motion respectively. For speed-variation recognition, a symmetry-based approach is used in which the center of gravity of the resulting symmetry curve indicates the progressive position of the hand. For direction recognition, non-linear classification methods are applied to the statistics of the symmetry curve. Results show that the symmetry property of the hand in both motions can serve as an intent indicator when a sequence of fifteen consecutive frames is used for recognition. The paper also reports less satisfactory results when fewer frames are used in an attempt to achieve faster recognition, and proposes a brute-force extrapolation algorithm to improve them.
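The core idea of tracking the hand's vertical position via the center of gravity of a symmetry curve can be sketched as follows. This is a minimal illustration, not the paper's implementation: the exact symmetry measure is not given in the abstract, so the per-row mirror-difference score used here is an assumption, as are the function names.

```python
import numpy as np

def symmetry_curve(frame):
    """Hypothetical per-row symmetry score of a grayscale hand image.
    Each row is compared with its horizontal mirror: identical halves
    score 1.0, maximally dissimilar halves score near 0.0."""
    frame = frame.astype(float)
    mirrored = frame[:, ::-1]
    row_diff = np.abs(frame - mirrored).mean(axis=1)
    span = frame.max() - frame.min() + 1e-9  # avoid division by zero
    return 1.0 - row_diff / span

def center_of_gravity(curve):
    """Index-weighted mean of the symmetry curve; shifts in this value
    from frame to frame track the progressive vertical position of
    the hand across the sequence."""
    idx = np.arange(len(curve))
    return float((idx * curve).sum() / curve.sum())
```

For direction recognition, statistics of this curve (e.g. its mean and variance per frame) could serve as features for a non-linear classifier such as an SVM, as the abstract suggests.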
Keywords: Support Vector Machine · Vertical Motion · Dorsal View · Plan Recognition · Intention Recognition