PRICAI 2000: Topics in Artificial Intelligence, pp. 510–520
Optimal Design of Neural Nets Using Hybrid Algorithms
Abstract
Selecting the topology of a network and the correct parameters for the learning algorithm is a tedious task when designing an optimal Artificial Neural Network (ANN), i.e., one that is smaller, faster, and has better generalization performance. A genetic algorithm (GA) is an adaptive search technique based on the principles and mechanisms of natural selection and survival of the fittest. Simulated annealing (SA) is a global optimization algorithm that can handle cost functions with quite arbitrary degrees of nonlinearity, discontinuity and stochasticity, while statistically assuring an optimal solution. In this paper we explain how a hybrid algorithm integrating the desirable aspects of GA and SA can be applied to the optimal design of an ANN. The paper is chiefly concerned with current theoretical developments in Evolutionary Artificial Neural Networks (EANNs) using GAs and other heuristic procedures, and with how the proposed hybrid and other heuristic procedures can be combined to produce an optimal ANN.
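The hybrid idea described above can be sketched in code. The following is a minimal illustration, not the paper's actual algorithm: a GA-style population search over a toy ANN "design" (hidden-unit count and learning rate), in which each offspring replaces its parent slot only under an SA-style Metropolis acceptance test with a cooling temperature. The `fitness` function is a hypothetical stand-in for trained-network error plus a size penalty; all names and parameter values here are illustrative assumptions.

```python
import math
import random

def fitness(design):
    """Hypothetical cost: lower is better. Stands in for validation
    error of a trained network plus a penalty for larger topologies."""
    hidden, lr = design
    error = (hidden - 12) ** 2 / 50.0 + (lr - 0.1) ** 2 * 100.0
    size_penalty = 0.05 * hidden
    return error + size_penalty

def mutate(design):
    """Perturb hidden-unit count and learning rate, keeping them valid."""
    hidden, lr = design
    hidden = max(1, hidden + random.choice([-2, -1, 1, 2]))
    lr = min(1.0, max(0.001, lr + random.gauss(0, 0.02)))
    return (hidden, lr)

def crossover(a, b):
    """Exchange one design component between two parents."""
    return (a[0], b[1]) if random.random() < 0.5 else (b[0], a[1])

def hybrid_search(pop_size=20, generations=60, t0=1.0, cooling=0.95):
    random.seed(0)  # fixed seed so the sketch is reproducible
    pop = [(random.randint(1, 40), random.uniform(0.001, 1.0))
           for _ in range(pop_size)]
    temp = t0
    best = min(pop, key=fitness)
    for _ in range(generations):
        # GA step: binary-tournament selection, crossover, mutation.
        children = []
        for _ in range(pop_size):
            p1 = min(random.sample(pop, 2), key=fitness)
            p2 = min(random.sample(pop, 2), key=fitness)
            children.append(mutate(crossover(p1, p2)))
        # SA step: accept each child against the design in its slot
        # via the Metropolis criterion at the current temperature.
        new_pop = []
        for old, new in zip(pop, children):
            delta = fitness(new) - fitness(old)
            if delta < 0 or random.random() < math.exp(-delta / temp):
                new_pop.append(new)
            else:
                new_pop.append(old)
        pop = new_pop
        temp *= cooling  # annealing schedule
        cand = min(pop, key=fitness)
        if fitness(cand) < fitness(best):
            best = cand
    return best

best = hybrid_search()
```

Early on, the high temperature lets poor offspring survive (GA-like exploration); as the temperature decays, acceptance becomes nearly greedy (SA-like convergence), which is the complementary behavior the hybrid aims to exploit.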
Keywords
Hybrid Algorithm · Global Search · Learning Rule · Connection Weight · Global Optimization Algorithm
References
- 1. Yao X.: Evolving Artificial Neural Networks, Proceedings of the IEEE, 87(9):1423–1447, (1999).
- 2. Hart W.E.: A Theoretical Comparison of Evolutionary Algorithms and Simulated Annealing, Proceedings of the Fifth Annual Conference on Evolutionary Programming, MIT Press, (1996).
- 3. Frean M.: The Upstart Algorithm: A Method for Constructing and Training Feedforward Neural Networks, Neural Computation, Vol. 2, pp. 198–209, (1990).
- 4. Mezard M., Nadal J.P.: Learning in Feedforward Layered Networks: The Tiling Algorithm, Journal of Physics A, Vol. 22, pp. 2191–2204, (1989).
- 5. Yao X.: A New Simulated Annealing Algorithm, International Journal of Computer Mathematics, 56:161–168, (1995).
- 6. Boers E.J.W., Kuiper H., Happel B.L.M., Sprinkhuizen-Kuyper I.G.: Designing Modular Artificial Neural Networks, In: H.A. Wijshoff (ed.), Proceedings of Computing Science in The Netherlands, pp. 87–96, (1993).
- 7. Gutjahr S., Ragg T.: Automatic Determination of Optimal Network Topologies Based on Information Theory and Evolution, IEEE Proceedings of the 23rd EUROMICRO Conference, (1997).
- 8. Schiffmann W., Joost M., Werner R.: Comparison of Optimized Backpropagation Algorithms, Proceedings of the European Symposium on Artificial Neural Networks, Brussels, pp. 97–104, (1993).
- 9. Mascioli F., Martinelli G.: A Constructive Algorithm for Binary Neural Networks: The Oil Spot Algorithm, IEEE Transactions on Neural Networks, 6(3), pp. 794–797, (1995).
- 10. Porto V.W., Fogel D.B., Fogel L.J.: Alternative Neural Network Training Methods, IEEE Expert, Vol. 10, No. 4, pp. 16–22, (1995).
- 11. Topchy A.P., Lebedko O.A.: Neural Network Training by Means of Cooperative Evolutionary Search, Nuclear Instruments & Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, Vol. 389, No. 1–2, pp. 240–241, (1997).
- 12. Polani D., Miikkulainen R.: Fast Reinforcement Learning Through Eugenic Neuro-Evolution, Technical Report AI99-277, Department of Computer Sciences, University of Texas at Austin, (1999).
- 13. Kitano H.: Designing Neural Networks Using Genetic Algorithms with Graph Generation System, Complex Systems, Vol. 4, No. 4, pp. 461–476, (1990).
- 14. Price K.V.: Genetic Annealing, Dr. Dobb's Journal, Vol. 220, pp. 127–132, (1994).
- 15. Stepniewski S.W., Keane A.J.: Pruning Back-propagation Neural Networks Using Modern Stochastic Optimization Techniques, Neural Computing & Applications, Vol. 5, pp. 76–98, (1997).
- 16. Fullmer B., Miikkulainen R.: Using Marker-Based Genetic Encoding of Neural Networks to Evolve Finite-State Behavior, Proceedings of the First European Conference on Artificial Life (France), pp. 255–262, (1992).
- 17. Gruau F.: Genetic Synthesis of Modular Neural Networks, In: S. Forrest (ed.), Genetic Algorithms: Proceedings of the 5th International Conference, Morgan Kaufmann, (1993).
- 18. Merrill J.W.L., Port R.F.: Fractally Configured Neural Networks, Neural Networks, Vol. 4, No. 1, pp. 53–60, (1991).
- 19. Kim H.B., Jung S.H., Kim T.G., Park K.H.: Fast Learning Method for Back-Propagation Neural Network by Evolutionary Adaptation of Learning Rates, Neurocomputing, Vol. 11, No. 1, pp. 101–106, (1996).
- 20. Goldberg D.E.: Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley Publishing Company, Inc., (1989).