Heterogeneous Data Fusion via a Probabilistic Latent-Variable Model
In a pervasive computing environment, one faces the problem of handling heterogeneous data from different sources, transmitted over heterogeneous channels and presented on heterogeneous user interfaces. This calls for adaptive data representations that retain as much relevant information as possible while keeping the representation as compact as possible. Typically, the gathered data are high-dimensional vectors with attributes of different types, e.g. continuous, binary and categorical. In this paper we present, as a first step, a probabilistic latent-variable model that fuses high-dimensional heterogeneous data into a unified low-dimensional continuous space, and thus brings great benefits for multivariate data analysis, visualization and dimensionality reduction. We adopt a variational approximation to the likelihood of the observed data and describe an EM algorithm to fit the model. The advantages of the proposed model are illustrated on toy data and demonstrated on real-world painting images for both visualization and recommendation.
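To make the latent-variable approach concrete, the following is a minimal sketch of EM for the continuous-data special case, i.e. probabilistic PCA, where each observation is generated as x = Wz + mu + noise with a standard Gaussian prior on the latent z. This is only an illustration of the EM machinery the abstract refers to; the paper's actual model additionally handles binary and categorical attributes via variational bounds on their likelihoods, which this sketch omits. All function and variable names here are hypothetical.

```python
import numpy as np

def ppca_em(X, q=2, n_iter=50, seed=0):
    """EM for probabilistic PCA: x ~ N(W z + mu, sigma^2 I), z ~ N(0, I_q).

    A sketch of the continuous-attribute special case only; the paper's
    model extends this with variational approximations for binary and
    categorical attributes (not shown here).
    """
    rng = np.random.default_rng(seed)
    N, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu
    W = rng.standard_normal((d, q))
    sigma2 = 1.0
    for _ in range(n_iter):
        # E-step: posterior moments of the latent z under current W, sigma2
        M = W.T @ W + sigma2 * np.eye(q)
        Minv = np.linalg.inv(M)
        Ez = Xc @ W @ Minv                      # (N, q) posterior means
        Ezz = N * sigma2 * Minv + Ez.T @ Ez     # sum_n E[z_n z_n^T]
        # M-step: re-estimate the loading matrix and noise variance
        W = Xc.T @ Ez @ np.linalg.inv(Ezz)
        sigma2 = (np.sum(Xc ** 2)
                  - 2.0 * np.sum((Xc @ W) * Ez)
                  + np.trace(Ezz @ W.T @ W)) / (N * d)
    return W, mu, sigma2, Ez
```

The returned posterior means `Ez` give the low-dimensional embedding used for visualization; new points can be projected the same way via the E-step equations.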