
A Comparative Study on Lagrange Ying-Yang Alternation Method in Gaussian Mixture-Based Clustering

  • Conference paper
Intelligent Data Engineering and Automated Learning – IDEAL 2017 (IDEAL 2017)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 10585)


Abstract

The Gaussian Mixture Model (GMM) has been applied to clustering with wide applications in image segmentation, object detection, and so on. Many algorithms have been proposed to learn a GMM while automatically determining an appropriate number of Gaussian components. The Lagrange Ying-Yang alternation (LYYA) method is one of them; it has the advantages of requiring no priors and of keeping the posterior probabilities bounded within the conventional probability space. This paper investigates the performance of LYYA in comparison with other methods, including Bayesian Ying-Yang (BYY) learning, rival penalized competitive learning (RPCL), the hard-cut Expectation-Maximization (EM) method, and classic EM with the Bayesian Information Criterion (BIC). Systematic simulations show that LYYA is generally more robust than the others on data generated with varying sample sizes, data dimensionalities, and true numbers of components. Unsupervised image segmentation results on the Berkeley dataset also confirm the advantages of LYYA compared with the mean shift and multiscale graph decomposition algorithms.
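
For orientation, the sketch below illustrates only the classic EM-with-BIC baseline named in the abstract: fit a GMM by EM for each candidate component number and keep the model with the lowest BIC. It is a minimal sketch using scikit-learn's GaussianMixture; the toy data and the candidate range are illustrative assumptions, and it does not implement LYYA or the other BYY-related methods, which determine the component number during learning rather than by a separate fit-then-select pass.

```python
# Classic EM + BIC baseline for choosing the number of Gaussian components.
# A minimal sketch: the toy data and the candidate range 1..10 are
# illustrative assumptions, not the experimental setup of the paper.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Toy data: three well-separated 2-D Gaussian clusters.
X = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=0.5, size=(200, 2)),
    rng.normal(loc=[4.0, 0.0], scale=0.5, size=(200, 2)),
    rng.normal(loc=[2.0, 3.0], scale=0.5, size=(200, 2)),
])

# Fit EM for every candidate k and keep the model with the lowest BIC.
candidates = list(range(1, 11))
models = [GaussianMixture(n_components=k, covariance_type="full",
                          random_state=0).fit(X) for k in candidates]
bics = [m.bic(X) for m in models]
best = int(np.argmin(bics))
print(f"BIC selects k = {candidates[best]}")
labels = models[best].predict(X)   # hard cluster assignments
```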

This work was supported by the Zhi-Yuan chair professorship start-up grant (WF220103010) from Shanghai Jiao Tong University.

Shikui Tu was supported by the Tenure-track associate professorship start-up grant from Shanghai Jiao Tong University.


References

  1. Constantinopoulos, C., Titsias, M.K., Likas, A.: Bayesian feature and model selection for Gaussian mixture models. IEEE Trans. Pattern Anal. Mach. Intell. 28(6), 1013–1018 (2006)

  2. Xu, L., Krzyzak, A., Oja, E.: Rival penalized competitive learning for clustering analysis, RBF net and curve detection. IEEE Trans. Neural Netw. 4(4), 636–649 (1993)

  3. Xu, L.: Bayesian-Kullback coupled Ying-Yang machines: unified learnings and new results on vector quantization. In: Proceedings of the International Conference on Neural Information Processing, pp. 977–988 (1995)

  4. Figueiredo, M.A.T., Jain, A.K.: Unsupervised learning of finite mixture models. IEEE Trans. Pattern Anal. Mach. Intell. 24(3), 381–396 (2002)

  5. Jaakkola, T.S., Jordan, M.I.: Bayesian parameter estimation via variational methods. Stat. Comput. 10(1), 25–37 (2000)

  6. Shi, L., Tu, S., Xu, L.: Learning Gaussian mixture with automatic model selection: a comparative study on three Bayesian related approaches. Front. Electr. Electron. Eng. China 6(2), 215–244 (2011)

  7. Chen, G., Heng, P.A., Xu, L.: Projection-embedded BYY learning algorithm for Gaussian mixture-based clustering. Appl. Inform. 1(2) (2014)

  8. Xu, L.: Further advances on Bayesian Ying-Yang harmony learning. Appl. Inform. 2(5) (2015)

  9. Comaniciu, D., Meer, P.: Mean shift: a robust approach toward feature space analysis. IEEE Trans. Pattern Anal. Mach. Intell. 24(5), 603–619 (2002)

  10. Carson, C., Belongie, S., Greenspan, H., Malik, J.: Blobworld: image segmentation using expectation-maximization and its application to image querying. IEEE Trans. Pattern Anal. Mach. Intell. 24(8), 1026–1038 (2002)

  11. Nikou, C., Likas, A.C., Galatsanos, N.P.: A Bayesian framework for image segmentation with spatially varying mixtures. IEEE Trans. Image Process. 19(9), 2278–2289 (2010)

  12. Cour, T., Bénézit, F., Shi, J.: Spectral segmentation with multiscale graph decomposition. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), vol. 2, pp. 1124–1131 (2005)

  13. Carpineto, C., Romano, G.: Consensus clustering based on a new probabilistic rand index with application to subtopic retrieval. IEEE Trans. Pattern Anal. Mach. Intell. 34(12), 2315–2326 (2012)


Author information


Correspondence to Shikui Tu or Lei Xu.



Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Long, W., Tu, S., Xu, L. (2017). A Comparative Study on Lagrange Ying-Yang Alternation Method in Gaussian Mixture-Based Clustering. In: Yin, H., et al. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2017. IDEAL 2017. Lecture Notes in Computer Science, vol. 10585. Springer, Cham. https://doi.org/10.1007/978-3-319-68935-7_53


  • DOI: https://doi.org/10.1007/978-3-319-68935-7_53


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-68934-0

  • Online ISBN: 978-3-319-68935-7

  • eBook Packages: Computer Science, Computer Science (R0)
