Estimation of Multiple Periodic Motions from Video
The analysis of periodic or repetitive motions is useful in many applications, in both the natural and the man-made world. An important example is the recognition of human and animal activities. Existing methods for the analysis of periodic motions first extract motion trajectories, e.g., via correlation or feature-point matching. We present a new approach that takes advantage of both the frequency and spatial information of the video. The 2D spatial Fourier transform is applied to each frame, and time-frequency distributions are then used to estimate the time-varying object motions. Thus, multiple periodic trajectories are extracted and their periods are estimated. The period information is finally used to segment the periodically moving objects. Unlike existing methods, our approach estimates multiple periodicities simultaneously, is robust to deviations from strictly periodic motion, and estimates periodicities superposed on translations. Experiments with synthetic and real sequences demonstrate the capabilities and limitations of this approach. Supplementary material is provided, showing the video sequences used in the experiments.
Keywords: Video Sequence · Motion Estimation · Periodic Motion · Real Sequence · Period Estimate
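The pipeline outlined in the abstract can be illustrated with a minimal sketch: take the 2D spatial FFT of each frame, track the phase of one spatial-frequency coefficient over time (its phase encodes displacement along that frequency), detrend to remove a superposed translation, and recover the dominant period from the temporal spectrum of the residual. This is an illustrative simplification under assumed choices (a single hand-picked coefficient `(u, v)`, linear detrending, a plain FFT instead of the paper's time-frequency distributions), not the authors' algorithm.

```python
import numpy as np

def estimate_motion_period(frames, fps=30.0, u=1, v=2):
    """Rough period estimate (in seconds) for a periodically moving pattern.

    Sketch only: the coefficient (u, v) is an assumption about where the
    object's spatial energy lies, and a simple FFT replaces the paper's
    time-frequency analysis.
    """
    # 2D spatial Fourier transform of each frame -> (T, H, W) array
    spectra = np.stack([np.fft.fft2(f) for f in frames])
    # Complex time series of one spatial-frequency coefficient
    coeff = spectra[:, u, v]
    # Its unwrapped phase tracks displacement along (u, v) over time
    phase = np.unwrap(np.angle(coeff))
    # Remove the linear trend (a periodicity superposed on translation)
    t = np.arange(len(phase))
    phase = phase - np.polyval(np.polyfit(t, phase, 1), t)
    # Dominant temporal frequency of the residual phase (skip the DC bin)
    spec = np.abs(np.fft.rfft(phase - phase.mean()))
    k = 1 + np.argmax(spec[1:])
    freq_hz = k * fps / len(phase)
    return 1.0 / freq_hz
```

For example, on a synthetic sequence whose pattern oscillates sinusoidally at 1 Hz, the function recovers a period close to 1 s; extending it to multiple simultaneous periodicities would require examining several coefficients and a genuine time-frequency distribution, as the paper does.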