
Optimization of the Local Search in the Training for SAMANN Neural Network

Journal of Global Optimization

Abstract

In this paper, we discuss the visualization of multidimensional data. A well-known procedure for mapping data from a high-dimensional space onto a lower-dimensional one is Sammon’s mapping, an algorithm that preserves all interpattern distances as well as possible. We investigate an unsupervised backpropagation algorithm for training a multilayer feed-forward neural network (SAMANN) to perform Sammon’s nonlinear projection. Sammon’s mapping has a disadvantage: it lacks generalization, so new points cannot be added to the obtained map without recalculating the whole projection. The SAMANN network overcomes this drawback, offering the ability to project new data that the original Sammon’s algorithm lacks. To save computation time without losing mapping quality, optimal values of the control parameters must be selected; in our research the emphasis is put on the optimization of the learning rate. The experiments are carried out on both artificial and real data. Two cases have been analyzed: (1) training of the SAMANN network with the full data set, and (2) retraining of the network when new data points appear.
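For context, Sammon’s mapping (Sammon, 1969) places the points in the low-dimensional space so as to minimize the stress function below. The notation is ours: d*_ij denotes the distance between points i and j in the original high-dimensional space, and d_ij the distance between their images in the projection space.

$$
E \;=\; \frac{1}{\sum_{i<j} d^{*}_{ij}} \, \sum_{i<j} \frac{\left(d^{*}_{ij} - d_{ij}\right)^{2}}{d^{*}_{ij}}
$$

SAMANN trains a feed-forward network to produce the projected coordinates directly, backpropagating the gradient of this stress instead of a supervised error signal.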
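The idea can be made concrete with a small sketch. The following is a minimal, illustrative implementation roughly in the spirit of the SAMANN scheme of Mao and Jain (1995): a single sigmoid hidden layer, stochastic updates on random pairs of points, and gradient descent on the pairwise stress term. All concrete choices here (hidden size d_hid, learning rate eta, number of epochs, the pair-sampling scheme) are our illustrative assumptions, not the configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init(d_in, d_hid, d_out=2, scale=0.1):
    # Weight matrices with the bias folded in as the last column.
    return (rng.normal(0, scale, (d_hid, d_in + 1)),
            rng.normal(0, scale, (d_out, d_hid + 1)))

def forward(x, W1, W2):
    # Sigmoid hidden and output units; y is the 2-D image of x.
    h = sigmoid(W1 @ np.append(x, 1.0))
    y = sigmoid(W2 @ np.append(h, 1.0))
    return h, y

def backward(x, h, y, dE_dy, W2):
    # Standard backpropagation through one sigmoid hidden layer.
    delta2 = dE_dy * y * (1.0 - y)
    gW2 = np.outer(delta2, np.append(h, 1.0))
    delta1 = (W2[:, :-1].T @ delta2) * h * (1.0 - h)
    gW1 = np.outer(delta1, np.append(x, 1.0))
    return gW1, gW2

def train_samann(X, d_hid=20, eta=0.5, epochs=20000, eps=1e-9):
    n, d_in = X.shape
    W1, W2 = init(d_in, d_hid)
    # Input-space distances, scaled so targets fit the sigmoid output range.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    D = D / D.max()
    for _ in range(epochs):
        a, b = rng.choice(n, size=2, replace=False)
        ha, ya = forward(X[a], W1, W2)
        hb, yb = forward(X[b], W1, W2)
        d_star = D[a, b]
        d = np.linalg.norm(ya - yb) + eps
        # Pairwise Sammon stress term (d* - d)^2 / d*, constant factor dropped;
        # its derivative with respect to the output distance d:
        dE_dd = -2.0 * (d_star - d) / (d_star + eps)
        g = dE_dd * (ya - yb) / d           # chain rule: dd/dy_a = (y_a - y_b)/d
        gW1a, gW2a = backward(X[a], ha, ya, g, W2)
        gW1b, gW2b = backward(X[b], hb, yb, -g, W2)
        W1 -= eta * (gW1a + gW1b)
        W2 -= eta * (gW2a + gW2b)
    return W1, W2
```

Usage, including the generalization step the abstract emphasizes:

```python
X = rng.normal(size=(150, 4))                      # stand-in for a real data set
W1, W2 = train_samann(X)
Y = np.array([forward(x, W1, W2)[1] for x in X])   # 2-D projection of the data
x_new = rng.normal(size=4)
y_new = forward(x_new, W1, W2)[1]                  # new point: a single forward pass
```

Because the mapping lives in the network weights rather than in the point coordinates themselves, a new point is projected by one forward pass, with no recalculation of the map, and retraining for case (2) amounts to resuming gradient descent from the current weights. The learning rate eta governs the trade-off the paper studies: too small and training is needlessly slow, too large and the stress oscillates or diverges.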



Author information


Corresponding author

Correspondence to Viktor Medvedev.



Cite this article

Medvedev, V., Dzemyda, G. Optimization of the Local Search in the Training for SAMANN Neural Network. J Glob Optim 35, 607–623 (2006). https://doi.org/10.1007/s10898-005-5368-1

