Manifold Learning for Multi-dimensional Auto-regressive Dynamical Models

  • Fabio Cuzzolin
Part of the Advances in Pattern Recognition book series (ACVPR)


We present a general differential-geometric framework for learning distance functions between dynamical models. Given a training set of models, the optimal metric is selected from a family of pullback metrics induced by the Fisher information tensor through a parameterized automorphism. The problem of classifying motions, encoded as dynamical models of a given class, can then be posed on the learnt manifold. In particular, we consider the class of multidimensional autoregressive models of order 2. Experimental results on identity recognition show that such optimal pullback Fisher metrics significantly improve classification performance.
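The idea of selecting a metric from a pullback family can be illustrated with a minimal sketch. The map `F_lam`, the Euclidean base metric, and the leave-one-out selection criterion below are illustrative assumptions, not the chapter's exact construction (which uses the Fisher information tensor and a specific parameterized automorphism):

```python
import numpy as np

def pullback_dist(x, y, lam):
    """Distance between F_lam(x) and F_lam(y), where F_lam is a toy
    coordinate-rescaling automorphism and the base metric is Euclidean."""
    return np.linalg.norm(lam * x - lam * y)

def loo_1nn_error(models, labels, lam):
    """Leave-one-out 1-nearest-neighbour classification error under the
    pullback distance parameterized by lam."""
    n = len(models)
    errs = 0
    for i in range(n):
        idx = [j for j in range(n) if j != i]
        dists = [pullback_dist(models[i], models[j], lam) for j in idx]
        nearest = idx[int(np.argmin(dists))]
        errs += labels[nearest] != labels[i]
    return errs / n

# Toy training set: two classes of AR(2) coefficient vectors.
rng = np.random.default_rng(0)
class0 = rng.normal([0.5, -0.2], 0.05, size=(10, 2))
class1 = rng.normal([0.8, -0.6], 0.05, size=(10, 2))
models = np.vstack([class0, class1])
labels = np.array([0] * 10 + [1] * 10)

# Select the automorphism parameter from a small grid by minimizing
# leave-one-out classification error on the training set.
grid = [np.array([a, b]) for a in (0.5, 1.0, 2.0) for b in (0.5, 1.0, 2.0)]
best = min(grid, key=lambda lam: loo_1nn_error(models, labels, lam))
print(best, loo_1nn_error(models, labels, best))
```

The grid search over `lam` stands in for the chapter's optimization over the automorphism parameters; any classifier-driven criterion could replace the 1-NN error.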


Keywords: Riemannian manifold · hidden Markov model · classification performance · autoregressive model · geodesic distance



Copyright information

© Springer-Verlag London Limited 2011
