Extending and benchmarking the CasPer algorithm

  • N. K. Treadgold
  • T. D. Gedeon
Neural Networks
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1342)

Abstract

The CasPer algorithm is a constructive neural network algorithm. CasPer creates cascade network architectures in a manner similar to Cascade Correlation (CasCor). CasPer, however, uses a modified form of the RPROP algorithm, termed Progressive RPROP, to train the whole network after the addition of each new hidden neuron. Previous work with CasPer has shown that it builds networks which generalise better than CasCor, often using fewer hidden neurons. This work adds two extensions to CasPer. First, an enhancement to the RPROP algorithm, SARPROP, is used to train newly installed hidden neurons. The second extension involves the use of a pool of hidden neurons, each trained using SARPROP, with the best performer selected for insertion into the network. These extensions are shown to result in CasPer producing more compact networks which often generalise better than those produced by the original CasPer algorithm.
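
The two ideas in the abstract, sign-based RPROP training and a candidate pool trained against the current residual, lend themselves to a short illustration. The sketch below is not the paper's implementation: it assumes a regression task, a plain RPROP update (omitting SARPROP's simulated-annealing term and Progressive RPROP's region-based step sizes), and illustrative names such as rprop_step and install_best_candidate.

import numpy as np

rng = np.random.default_rng(0)

def rprop_step(w, grad, prev_grad, step, eta_plus=1.2, eta_minus=0.5,
               step_min=1e-6, step_max=50.0):
    # One RPROP update: grow the per-weight step while the gradient keeps
    # its sign, shrink it (and skip the update) when the sign flips.
    sign_change = grad * prev_grad
    step = np.where(sign_change > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(sign_change < 0, np.maximum(step * eta_minus, step_min), step)
    grad = np.where(sign_change < 0, 0.0, grad)
    return w - np.sign(grad) * step, grad, step

def train_candidate(X, residual, epochs=200):
    # Train one candidate neuron's input weights against the current
    # network residual; the output weight is re-solved in closed form.
    n_in = X.shape[1]
    w = rng.normal(scale=0.5, size=n_in)
    prev_grad, step = np.zeros(n_in), np.full(n_in, 0.1)
    for _ in range(epochs):
        h = np.tanh(X @ w)                       # candidate activation
        a = (h @ residual) / (h @ h + 1e-12)     # least-squares output weight
        err = a * h - residual
        grad = X.T @ (err * a * (1.0 - h * h))   # sign is all RPROP needs
        w, prev_grad, step = rprop_step(w, grad, prev_grad, step)
    h = np.tanh(X @ w)
    a = (h @ residual) / (h @ h + 1e-12)
    return float(np.mean((a * h - residual) ** 2)), w, a

def install_best_candidate(X, residual, pool_size=8):
    # The pool extension: train several candidates independently and
    # install only the one that best reduces the residual error.
    return min((train_candidate(X, residual) for _ in range(pool_size)),
               key=lambda result: result[0])

# Toy usage: fit one cascade neuron to the residual of a constant baseline.
X = np.hstack([rng.normal(size=(200, 3)), np.ones((200, 1))])  # bias input
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
mse, w, a = install_best_candidate(X, y - y.mean())
print(f"best candidate residual MSE: {mse:.4f}")

Because RPROP adapts a per-weight step from the gradient's sign alone, constant factors in the gradient are irrelevant, which is why the candidate training above drops them; in a full cascade, each installed neuron's output would be added to the inputs seen by later candidates.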

Keywords

Neural Network, Constructive Cascade, RPROP

References

  [1] Treadgold, N.K. and Gedeon, T.D., “A Cascade Network Employing Progressive RPROP,” Int. Work. Conf. on Artificial and Natural Neural Networks, pp. 733–742, 1997.
  [2] Fahlman, S.E. and Lebiere, C., “The cascade-correlation learning architecture,” Advances in Neural Information Processing, vol. 2, D.S. Touretzky (Ed.), San Mateo, CA: Morgan Kaufmann, pp. 524–532, 1990.
  [3] Hwang, J., You, S., Lay, S. and Jou, I., “The Cascade-Correlation Learning: A Projection Pursuit Learning Perspective,” IEEE Trans. Neural Networks 7(2), pp. 278–289, 1996.
  [4] Kwok, T. and Yeung, D., “Experimental Analysis of Input Weight Freezing in Constructive Neural Networks,” Proc. IEEE Int. Conf. on Neural Networks, pp. 511–516, 1993.
  [5] Treadgold, N.K. and Gedeon, T.D., “A Simulated Annealing Enhancement to Resilient Backpropagation,” Proc. Int. Panel Conf. Soft and Intelligent Computing, Budapest, pp. 293–298, 1996.
  [6] Riedmiller, M. and Braun, H., “A Direct Adaptive Method for Faster Backpropagation Learning: The RPROP Algorithm,” Proc. IEEE Int. Conf. on Neural Networks, pp. 586–591, 1993.
  [7] Fahlman, S.E., “An empirical study of learning speed in back-propagation networks,” Technical Report CMU-CS-88-162, Carnegie Mellon University, Pittsburgh, PA, 1988.
  [8] Murphy, P.M. and Aha, D.W., “UCI Repository of machine learning databases,” [http://www.ics.uci.edu/~mlearn/MLRepository.html], Irvine, CA: University of California, Department of Information and Computer Science, 1994.
  [9] Treadgold, N.K. and Gedeon, T.D., “Extending CasPer: A Regression Survey,” Int. Conf. on Neural Information Processing, to appear, 1997.

Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • N. K. Treadgold (1)
  • T. D. Gedeon (1)

  1. Department of Information Engineering, School of Computer Science & Engineering, The University of New South Wales, Sydney, Australia
