A Tuned Eigenspace Technique for Articulated Motion Recognition
In this paper, we introduce a tuned eigenspace technique for classifying human motion. The method overcomes problems arising from articulated motion and clothing-texture effects by learning various human motions as sequences of postures in an eigenspace. To cope with the variability inherent in articulated motion, we propose a method for tuning the set of sequential eigenspaces. Once the tuned eigenspaces have been learned, recognition reduces to a nearest-neighbor search over the eigenspaces. We show how the tuned eigenspace method can be applied to both real-world and synthetic pose recognition. We also address the clothing-texture problem that arises in real-world data and propose a background subtraction method that allows the technique to be used in outdoor environments. We provide results on synthetic imagery for a number of human poses and illustrate the utility of the method for human motion recognition.
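The eigenspace-plus-nearest-neighbor pipeline described above can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes postures are given as flattened image vectors, builds an eigenspace with a plain PCA (via SVD), and classifies a query posture by its nearest training projection. The function names and the dimension parameter `k` are illustrative.

```python
# Hypothetical sketch of eigenspace-based posture recognition:
# learn a low-dimensional eigenspace from training postures, then
# classify a query by nearest-neighbor search in that space.
import numpy as np

def build_eigenspace(posture_vectors, k):
    """Learn a k-dimensional eigenspace from flattened posture images (one per row)."""
    X = np.asarray(posture_vectors, dtype=float)
    mean = X.mean(axis=0)
    # Principal axes of the centered data via SVD (rows of vt are eigenvectors).
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    basis = vt[:k]                      # top-k principal directions
    coords = (X - mean) @ basis.T       # projections of the training postures
    return mean, basis, coords

def recognize(query, mean, basis, coords, labels):
    """Project a query posture into the eigenspace and return the nearest label."""
    q = (np.asarray(query, dtype=float) - mean) @ basis.T
    dists = np.linalg.norm(coords - q, axis=1)
    return labels[int(np.argmin(dists))]
```

In the paper's setting the training rows would be sequential postures of each learned motion, so the nearest-neighbor match identifies both the motion class and the phase within it.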
Keywords: Recognition Rate · Human Motion · Motion Trajectory · Motion Line · Posture Matrix