Abstract
The Self-Organizing Map (SOM) is a popular unsupervised neural network that provides effective clustering and data visualization for multidimensional input datasets. In this paper, we present an application of the simulated annealing procedure to the SOM learning algorithm, with the aim of obtaining faster learning and better performance in terms of quantization error. The proposed learning algorithm, called Fast Learning Self-Organized Map, does not compromise the simplicity of the standard SOM learning algorithm. It also improves the quality of the resulting maps, providing better clustering quality and topology preservation of the input multidimensional data. Several experiments compare the proposed approach with the original algorithm and with some of its variants and speed-up techniques.
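The core idea the abstract describes is letting a simulated-annealing-style temperature govern the SOM's learning parameters, so that training starts with large, exploratory updates and cools toward small, refining ones. The following is a minimal illustrative sketch of that idea, not the authors' actual Fast Learning SOM algorithm: the exponential cooling schedule, the parameter names (`t0`, `t_min`), and the choice of tying both learning rate and neighborhood radius to the temperature are assumptions made here for demonstration only.

```python
import numpy as np

def train_som(data, grid_w=5, grid_h=5, n_iter=300, t0=1.0, t_min=0.01, seed=0):
    """Minimal SOM trainer where an exponentially annealed temperature
    controls both the learning rate and the Gaussian neighborhood radius.
    Illustrative sketch only; not the FLSOM algorithm from the paper."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    weights = rng.random((grid_h, grid_w, dim))
    # grid coordinates, used to compute lattice distance to the BMU
    ys, xs = np.mgrid[0:grid_h, 0:grid_w]
    coords = np.stack([ys, xs], axis=-1).astype(float)
    decay = (t_min / t0) ** (1.0 / n_iter)  # assumed exponential cooling schedule
    t = t0
    for _ in range(n_iter):
        x = data[rng.integers(len(data))]
        # best-matching unit (BMU): the prototype closest to the sample
        d2 = ((weights - x) ** 2).sum(axis=2)
        bmu = np.unravel_index(np.argmin(d2), d2.shape)
        # neighborhood radius shrinks as the temperature cools
        sigma = 1.0 + t * max(grid_w, grid_h)
        g2 = ((coords - np.array(bmu, dtype=float)) ** 2).sum(axis=2)
        h = np.exp(-g2 / (2.0 * sigma ** 2))
        # the temperature also plays the role of the learning rate
        weights += t * h[..., None] * (x - weights)
        t *= decay
    return weights

def quantization_error(data, weights):
    """Mean distance from each sample to its nearest prototype."""
    flat = weights.reshape(-1, weights.shape[-1])
    d = np.sqrt(((data[:, None, :] - flat[None, :, :]) ** 2).sum(-1))
    return d.min(axis=1).mean()
```

The quantization error computed at the end is the same quality measure the abstract refers to: a lower value means the map's prototypes sit closer to the data.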
Cite this article
Fiannaca, A., Di Fatta, G., Rizzo, R. et al. Simulated annealing technique for fast learning of SOM networks. Neural Comput & Applic 22, 889–899 (2013). https://doi.org/10.1007/s00521-011-0780-6