The Role of Operation Granularity in Search-Based Learning of Latent Tree Models

  • Tao Chen
  • Nevin L. Zhang
  • Yi Wang
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6797)

Abstract

Latent tree (LT) models are a special class of Bayesian networks that can be used for cluster analysis, latent structure discovery and density estimation. A number of search-based algorithms for learning LT models have been developed. In particular, the HSHC algorithm of [1] and the EAST algorithm of [2] can handle data sets with dozens to around 100 variables. Both HSHC and EAST aim to find the LT model with the highest BIC score. However, they use another criterion, called the cost-effectiveness principle, when selecting among some of the candidate models during search. In this paper, we investigate whether and why this is necessary.
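For readers unfamiliar with the two selection criteria mentioned above, the following is a minimal sketch, not the authors' implementation; the function names and the numbers are illustrative, and the cost-effectiveness formula shown (BIC gain per unit increase in model complexity) is one common formulation of the principle:

```python
import math

def bic_score(log_likelihood, num_free_params, sample_size):
    """BIC score of a candidate model: maximized log-likelihood
    minus a penalty that grows with the number of free parameters."""
    return log_likelihood - (num_free_params / 2.0) * math.log(sample_size)

def cost_effectiveness(cand_loglik, cur_loglik,
                       cand_params, cur_params, sample_size):
    """Improvement in BIC score per unit increase in complexity,
    used to rank candidate models that add parameters to the
    current model (illustrative formulation)."""
    gain = (bic_score(cand_loglik, cand_params, sample_size)
            - bic_score(cur_loglik, cur_params, sample_size))
    extra = cand_params - cur_params
    return gain / extra if extra > 0 else float("inf")

# Illustrative comparison: a candidate that fits better but is larger.
current = bic_score(-1200.0, 30, 500)
candidate = bic_score(-1180.0, 40, 500)
ratio = cost_effectiveness(-1180.0, -1200.0, 40, 30, 500)
```

A plain BIC search would accept the candidate whenever `candidate > current`; the cost-effectiveness principle instead prefers, among several such candidates, the one with the highest per-parameter gain.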

Keywords

Bayesian network · Candidate model · Penalty term · Search operator · Latent node

References

  1. Zhang, N., Kočka, T.: Efficient learning of hierarchical latent class models. In: Proc. of the 16th IEEE International Conference on Tools with Artificial Intelligence (2004)
  2. Chen, T., Zhang, N., Wang, Y.: Efficient model evaluation in the search-based approach to latent structure discovery. In: Proc. of the 4th European Workshop on Probabilistic Graphical Models (2008)
  3. Zhang, N.: Hierarchical latent class models for cluster analysis. J. Mach. Learn. Res. 5 (2004)
  4. Lazarsfeld, P., Henry, N.: Latent Structure Analysis. Houghton Mifflin, Boston (1968)
  5. Elidan, G., Friedman, N.: Learning hidden variable networks: the information bottleneck approach. J. Mach. Learn. Res. 6 (2005)
  6. Zhang, N.: Discovery of latent structures: Experience with the CoIL Challenge 2000 data set. In: Shi, Y., van Albada, G.D., Dongarra, J., Sloot, P.M.A. (eds.) ICCS 2007. LNCS, vol. 4490, pp. 26–34. Springer, Heidelberg (2007)
  7. Zhang, N., Yuan, S., Chen, T., Wang, Y.: Latent tree models and diagnosis in traditional Chinese medicine. Artificial Intelligence in Medicine 42 (2008)
  8. Pearl, J.: Probabilistic Reasoning in Intelligent Systems. Morgan Kaufmann, San Mateo (1988)
  9. Wang, Y., Zhang, N.L., Chen, T.: Latent tree models and approximate inference in Bayesian networks. Journal of Artificial Intelligence Research 32, 879–900 (2008)
  10. Geiger, D., Heckerman, D., Meek, C.: Asymptotic model selection for directed networks with hidden variables. In: Proc. of the 12th Conference on Uncertainty in Artificial Intelligence (1996)
  11. Chickering, D.M.: Optimal structure identification with greedy search. J. Mach. Learn. Res. 3 (2002)
  12. Chen, T.: Search-based learning of latent tree models. PhD dissertation, Department of Computer Science and Engineering, The Hong Kong University of Science and Technology (2009)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Tao Chen (1)
  • Nevin L. Zhang (2)
  • Yi Wang (3)
  1. Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
  2. Department of Computer Science & Engineering, The Hong Kong University of Science & Technology, Kowloon, Hong Kong
  3. Department of Computer Science, National University of Singapore, Singapore