Abstract
The Gaussian Mixture Model (GMM) is widely used for clustering, with applications in image segmentation, object detection, and beyond. Many algorithms have been proposed to learn a GMM while automatically determining the appropriate number of Gaussian components. The Lagrange Ying-Yang alternation method (LYYA) is one of them; it requires no priors and keeps the posterior probabilities bounded within the conventional probability space. This paper investigates the performance of LYYA in comparison with other methods, including Bayesian Ying-Yang (BYY) learning, rival penalized competitive learning (RPCL), the hard-cut Expectation-Maximization (EM) method, and classic EM with the Bayesian Information Criterion (BIC). Systematic simulations show that LYYA is generally more robust than the others as the sample size, data dimensionality, and true number of components vary. Unsupervised image segmentation results on the Berkeley dataset also confirm the advantages of LYYA over the Mean shift and Multiscale graph decomposition algorithms.
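As a point of reference for one of the baselines compared above, classic EM with BIC fits a GMM for each candidate number of components and keeps the model with the lowest BIC score. A minimal sketch, assuming scikit-learn's `GaussianMixture` (the paper's own implementations are not shown here) and synthetic data with three well-separated clusters:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic data: three well-separated 2-D Gaussian clusters (true k = 3).
X = np.vstack([
    rng.normal(loc=[0, 0], scale=0.5, size=(200, 2)),
    rng.normal(loc=[5, 0], scale=0.5, size=(200, 2)),
    rng.normal(loc=[0, 5], scale=0.5, size=(200, 2)),
])

# Fit GMMs with k = 1..6 components via EM; keep the lowest-BIC model.
best_k, best_bic, best_model = None, np.inf, None
for k in range(1, 7):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(X)
    bic = gmm.bic(X)
    if bic < best_bic:
        best_k, best_bic, best_model = k, bic, gmm

print(best_k)
```

Unlike LYYA, which adapts the number of components during learning, this baseline must run EM once per candidate k, which is part of why the paper compares their robustness and cost.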
This work was supported by the Zhi-Yuan chair professorship start-up grant (WF220103010) from Shanghai Jiao Tong University.
Shikui Tu was supported by the Tenure-track associate professorship start-up grant from Shanghai Jiao Tong University.
References
Constantinopoulos, C., Titsias, M.K., Likas, A.: Bayesian feature and model selection for Gaussian mixture models. IEEE Trans. Pattern Anal. Mach. Intell. 28(6), 1013–1018 (2006)
Xu, L., Krzyzak, A., Oja, E.: Rival penalized competitive learning for clustering analysis, RBF net and curve detection. IEEE Trans. Neural Netw. 4(4), 636–649 (1993)
Xu, L.: Bayesian-Kullback coupled Ying-Yang machines: unified learnings and new results on vector quantization. In: Proceedings of International Conference on Neural Information Processing, pp. 977–988 (1995)
Figueiredo, M.A.T., Jain, A.K.: Unsupervised learning of finite mixture models. IEEE Trans. Pattern Anal. Mach. Intell. 24(3), 381–396 (2002)
Jaakkola, T.S., Jordan, M.I.: Bayesian parameter estimation via variational methods. Stat. Comput. 10(1), 25–37 (2000)
Shi, L., Tu, S., Xu, L.: Learning Gaussian mixture with automatic model selection: a comparative study on three Bayesian related approaches. Front. Electr. Electron. Eng. China 6(2), 215–244 (2011)
Chen, G., Heng, P.A., Xu, L.: Projection-embedded BYY learning algorithm for Gaussian mixture-based clustering. SpringerOpen J. Appl. Inform. 1(2) (2014)
Xu, L.: Further advances on Bayesian Ying-Yang harmony learning. SpringerOpen J. Appl. Inform. 2(5) (2015)
Comaniciu, D., Meer, P.: Mean shift: a robust approach toward feature space analysis. IEEE Trans. Pattern Anal. Mach. Intell. 24(5), 603–619 (2002)
Carson, C., Belongie, S., Greenspan, H., Malik, J.: Blobworld: image segmentation using expectation-maximization and its application to image querying. IEEE Trans. Pattern Anal. Mach. Intell. 24(8), 1026–1038 (2002)
Nikou, C., Likas, A.C., Galatsanos, N.P.: A Bayesian framework for image segmentation with spatially varying mixtures. IEEE Trans. Image Process. 19(9), 2278–2289 (2010)
Cour, T., Bénézit, F., Shi, J.: Spectral segmentation with multiscale graph decomposition. In: Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), vol. 2, pp. 1124–1131 (2005)
Carpineto, C., Romano, G.: Consensus clustering based on a new probabilistic rand index with application to subtopic retrieval. IEEE Trans. Pattern Anal. Mach. Intell. 34(12), 2315–2326 (2012)
© 2017 Springer International Publishing AG
Cite this paper
Long, W., Tu, S., Xu, L. (2017). A Comparative Study on Lagrange Ying-Yang Alternation Method in Gaussian Mixture-Based Clustering. In: Yin, H., et al. Intelligent Data Engineering and Automated Learning – IDEAL 2017. IDEAL 2017. Lecture Notes in Computer Science(), vol 10585. Springer, Cham. https://doi.org/10.1007/978-3-319-68935-7_53
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-68934-0
Online ISBN: 978-3-319-68935-7