Combining Multidimensional Scaling with Artificial Neural Networks

  • Gintautas Dzemyda
  • Olga Kurasova
  • Julius Žilinskas
Chapter
Part of the Springer Optimization and Its Applications book series (SOIA, volume 75)

Abstract

The combination and integrated use of data visualization methods of different natures are developing rapidly. Different methods can be combined in data analysis so that the shortcomings of the individual methods are minimized. This chapter is devoted to visualization methods based on artificial neural networks. The fundamentals of artificial neural networks that are essential for investigating their potential to visualize multidimensional data are presented: the biological neuron is introduced, the model of an artificial neuron is described, the structures of one-layer and multilayer feed-forward neural networks are examined, and learning algorithms are outlined. Artificial neural networks widely used for the visualization of multidimensional data are overviewed: the self-organizing map, neural gas, curvilinear component analysis, the auto-associative neural network, and NeuroScale. Much attention is paid to two strategies for combining multidimensional scaling with artificial neural networks. The first strategy integrates a self-organizing map or neural gas with multidimensional scaling. The second one minimizes the Stress function using the feed-forward neural network SAMANN. The possibility of training an artificial neural network with multidimensional scaling results is also discussed.
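
To make the first strategy more concrete, the following sketch (not taken from the chapter) quantizes a toy data set with a minimal neural gas and then projects only the resulting reference vectors to two dimensions with multidimensional scaling. The toy data, the number of reference vectors, the decay schedules, and the use of scikit-learn's MDS implementation are illustrative assumptions.

```python
import numpy as np
from sklearn.manifold import MDS

def neural_gas(X, n_units=20, n_epochs=40, lr0=0.5, lam0=10.0, seed=0):
    """A very small neural-gas quantizer; returns the trained reference vectors."""
    rng = np.random.default_rng(seed)
    # initialize the reference vectors from randomly chosen data points
    W = X[rng.choice(len(X), n_units, replace=False)].astype(float)
    n_steps = n_epochs * len(X)
    t = 0
    for _ in range(n_epochs):
        for x in X[rng.permutation(len(X))]:
            # rank every unit by its distance to the current sample (0 = closest)
            ranks = np.argsort(np.argsort(np.linalg.norm(W - x, axis=1)))
            # exponentially decaying learning rate and neighbourhood range
            lr = lr0 * (0.01 / lr0) ** (t / n_steps)
            lam = lam0 * (0.1 / lam0) ** (t / n_steps)
            W += lr * np.exp(-ranks / lam)[:, None] * (x - W)
            t += 1
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # toy multidimensional data: two Gaussian clusters in a 5-dimensional space
    X = np.vstack([rng.normal(0.0, 1.0, (100, 5)), rng.normal(4.0, 1.0, (100, 5))])
    W = neural_gas(X, n_units=20)
    # MDS is applied to the 20 reference vectors only, not to all 200 points
    Y = MDS(n_components=2, dissimilarity="euclidean", random_state=0).fit_transform(W)
    print(Y.shape)  # (20, 2): planar coordinates of the quantized representatives
```

Projecting the reference vectors instead of all original points is what makes such a combination attractive for large data sets, since the cost of multidimensional scaling grows quickly with the number of projected points.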


Copyright information

© Springer Science+Business Media, LLC 2013

Authors and Affiliations

  • Gintautas Dzemyda (1)
  • Olga Kurasova (1)
  • Julius Žilinskas (1)

  1. Institute of Mathematics and Informatics, Vilnius University, Vilnius, Lithuania