Learning Conditional Latent Structures from Multiple Data Sources

  • Viet Huynh
  • Dinh Phung
  • Long Nguyen
  • Svetha Venkatesh
  • Hung H. Bui
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9077)


Data often arise from multiple heterogeneous sources. When dealing with multiple data sources, existing models typically treat them independently and thus cannot explicitly model the correlation structures among them. To address this problem, we propose a fully Bayesian nonparametric approach to modelling correlation structures among multiple, heterogeneous datasets. The proposed framework first induces a mixture distribution over the primary data source using hierarchical Dirichlet processes (HDP). Conditioned on each atom (group) discovered in this step, the context data sources are mutually independent, and each is generated from a hierarchical Dirichlet process. In each specific application, the nature of the data determines which covariates constitute the content and which constitute the context(s). We also derive efficient inference and exploit the conditional independence structure to propose a (conditional) parallel Gibbs sampling scheme. We demonstrate our model on the problem of discovering latent activities in pervasive computing using mobile data, and show the advantage of utilizing multiple data sources in terms of both exploratory analysis and quantitative clustering performance.
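The two-level generative structure sketched in the abstract (a mixture over the primary "content" source, with context sources generated independently once conditioned on the content atom) can be illustrated with a toy simulation. This is only a hedged sketch, not the paper's actual model: it uses a single context source, a truncated stick-breaking construction in place of a full HDP, and all function names and parameters (`stick_breaking`, `generate`, `alpha`, `K`, `V`) are our own illustrative choices.

```python
import numpy as np

def stick_breaking(alpha, K, rng):
    """Truncated stick-breaking weights for a Dirichlet process (Sethuraman)."""
    betas = rng.beta(1.0, alpha, size=K)
    # Length of stick remaining before each break: 1, (1-b1), (1-b1)(1-b2), ...
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    w = betas * remaining
    return w / w.sum()  # renormalise to absorb the truncation error

def generate(n_groups, n_obs, alpha=1.0, K=10, V=20, seed=0):
    """Toy generative sketch: draw a mixture over the primary (content)
    source; conditioned on each group's content atom, context observations
    are drawn from an atom-specific distribution, independently of the
    content observations."""
    rng = np.random.default_rng(seed)
    weights = stick_breaking(alpha, K, rng)
    content_atoms = rng.dirichlet(np.ones(V), size=K)  # content emissions
    context_atoms = rng.dirichlet(np.ones(V), size=K)  # context emissions
    data = []
    for _ in range(n_groups):
        z = rng.choice(K, p=weights)  # latent content atom for this group
        content = rng.choice(V, size=n_obs, p=content_atoms[z])
        context = rng.choice(V, size=n_obs, p=context_atoms[z])
        data.append((z, content, context))
    return data
```

The key conditional-independence property the paper exploits for parallel Gibbs sampling is visible here: given `z`, the `content` and `context` draws factorise, so each could be resampled independently.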




Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Viet Huynh (1)
  • Dinh Phung (1)
  • Long Nguyen (2)
  • Svetha Venkatesh (1)
  • Hung H. Bui (3)
  1. Pattern Recognition and Data Analytics Centre, Deakin University, Geelong, Australia
  2. Department of Statistics, University of Michigan, Ann Arbor, USA
  3. Adobe Research, San Francisco, USA
