Approximate Kalman Filters for Embedding Author-Word Co-occurrence Data over Time

  • Purnamrita Sarkar
  • Sajid M. Siddiqi
  • Geoffrey J. Gordon
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4503)

Abstract

We address the problem of embedding entities into Euclidean space over time based on co-occurrence data. We extend the CODE model of [1] to a dynamic setting. This leads to a non-standard factored state space model with real-valued hidden parent nodes and discrete observation nodes. We investigate the use of variational approximations applied to the observation model that allow us to formulate the entire dynamic model as a Kalman filter. Applying this model to temporal co-occurrence data yields posterior distributions of entity coordinates in Euclidean space that are updated over time. Initial results on per-year co-occurrences of authors and words in the NIPS corpus and on synthetic data, including videos of dynamic embeddings, seem to indicate that the model results in embeddings of co-occurrence data that are meaningful both temporally and contextually.
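The core machinery the abstract refers to is the Kalman filter recursion: a Gaussian belief over hidden coordinates is propagated through linear dynamics and corrected by each new observation. The paper's contribution is a variational approximation that puts the discrete co-occurrence observation model into this Gaussian form; the sketch below shows only the generic predict/update cycle under an assumed linear-Gaussian observation model, with illustrative matrices, not the paper's actual model.

```python
import numpy as np

def kalman_step(mu, Sigma, y, A, Q, C, R):
    """One predict/update cycle of a linear-Gaussian Kalman filter.

    mu, Sigma : posterior mean and covariance of the hidden state at t-1
    y         : observation at time t
    A, Q      : transition matrix and process-noise covariance
    C, R      : observation matrix and observation-noise covariance
    """
    # Predict: propagate the belief through the linear dynamics.
    mu_pred = A @ mu
    Sigma_pred = A @ Sigma @ A.T + Q

    # Update: correct the prediction with the new observation.
    S = C @ Sigma_pred @ C.T + R              # innovation covariance
    K = Sigma_pred @ C.T @ np.linalg.inv(S)   # Kalman gain
    mu_new = mu_pred + K @ (y - C @ mu_pred)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_pred
    return mu_new, Sigma_new

# Toy example: track a 2-D "embedding coordinate" over two time steps.
A, Q = np.eye(2), 0.01 * np.eye(2)
C, R = np.eye(2), 0.1 * np.eye(2)
mu, Sigma = np.zeros(2), np.eye(2)
for y in [np.array([0.1, 0.2]), np.array([0.15, 0.25])]:
    mu, Sigma = kalman_step(mu, Sigma, y, A, Q, C, R)
```

In the paper's setting, `mu` and `Sigma` play the role of the posterior over an entity's Euclidean coordinates, updated once per time slice of co-occurrence data.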

Keywords

Kalman Filter · Social Network Analysis · State Space Model · Belief State · Observation Model

References

  1. Globerson, A., Chechik, G., Pereira, F., Tishby, N.: Euclidean embedding of co-occurrence data. In: Proc. Eighteenth Annual Conf. on Neural Info. Proc. Systems, NIPS (2004)
  2. Sarkar, P., Moore, A.: Dynamic social network analysis using latent space models. In: Proc. Nineteenth Annual Conf. on Neural Info. Proc. Systems, NIPS (2005)
  3. Berry, M., Dumais, S., Letsche, T.: Computational methods for intelligent information access. In: Proceedings of Supercomputing (1995)
  4. Breiger, R.L., Boorman, S.A., Arabie, P.: An algorithm for clustering relational data with applications to social network analysis and comparison with multidimensional scaling. J. of Math. Psych. 12, 328–383 (1975)
  5. Hoff, P.D., Raftery, A.E., Handcock, M.S.: Latent space approaches to social network analysis. J. Amer. Stat. Assoc. 97(460), 1090–1098 (2002)
  6. Ghahramani, Z., Jordan, M.I.: Factorial hidden Markov models. In: Touretzky, D.S., Mozer, M.C., Hasselmo, M.E. (eds.) Proc. Conf. Advances in Neural Information Processing Systems, NIPS, vol. 8, pp. 472–478. MIT Press, Cambridge (1995)
  7. Ghahramani, Z., Hinton, G.E.: Switching state-space models. Technical report, University of Toronto, Toronto, Canada (1998)
  8. Kalman, R.E.: A new approach to linear filtering and prediction problems. Transactions of the ASME–Journal of Basic Engineering 82, 35–45 (1960)
  9. Jordan, M.I., Ghahramani, Z., Jaakkola, T.S., Saul, L.K.: An introduction to variational methods for graphical models. Machine Learning 37(2), 183–233 (1999)

Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Purnamrita Sarkar (1)
  • Sajid M. Siddiqi (2)
  • Geoffrey J. Gordon (1)
  1. Machine Learning Department, Carnegie Mellon University, Pittsburgh, PA 15213
  2. Robotics Institute, Carnegie Mellon University, Pittsburgh, PA 15213