Abstract
One of the most powerful aspects of neural networks is their ability to adapt to a problem by changing their interconnection strengths according to a predetermined learning rule. One of their main drawbacks, however, is the lack of a principled way to determine the topology of the network, that is, the number of layers and the number of neurons per layer. A special class of neural networks tries to overcome this problem by letting the network automatically adapt its topology to the problem as well. These are the so-called ontogenic neural networks. Other potential advantages of ontogenic neural networks are improved generalization, implementation optimization (size and/or execution speed), and the avoidance of local minima.
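The topology-growing idea can be illustrated with a minimal sketch in the spirit of constructive methods such as dynamic node creation [2]: start with a small hidden layer and add units while the training error remains too high. This is not the method of any particular cited paper; the task, thresholds, and helper names are illustrative, and output weights are fit by least squares purely to keep the sketch short.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression task: approximate y = sin(x) on [-3, 3].
X = np.linspace(-3, 3, 64).reshape(-1, 1)
y = np.sin(X)

def init_unit(n_in):
    """Random input weights (plus bias) for one new tanh hidden unit."""
    return rng.normal(scale=0.5, size=(n_in + 1,))

def hidden(W_hid, X):
    """Hidden-layer activations, with a bias column appended."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    H = np.tanh(Xb @ W_hid.T)
    return np.hstack([H, np.ones((len(X), 1))])

def fit_output(W_hid, X, y):
    """Fit the output weights by least squares for fixed hidden weights."""
    w_out, *_ = np.linalg.lstsq(hidden(W_hid, X), y, rcond=None)
    return w_out

# Ontogenic loop: grow the hidden layer until the error is small enough
# (or a size budget is exhausted) instead of fixing the topology a priori.
W_hid = np.array([init_unit(1)])       # start with a single hidden unit
errors = []
for step in range(20):
    w_out = fit_output(W_hid, X, y)
    pred = hidden(W_hid, X) @ w_out
    mse = float(np.mean((pred - y) ** 2))
    errors.append(mse)
    if mse < 1e-4:                     # topology is adequate: stop growing
        break
    W_hid = np.vstack([W_hid, init_unit(1)])   # otherwise add one unit

print(f"hidden units: {len(W_hid)}, final MSE: {errors[-1]:.2e}")
```

Because each step only adds a hidden unit (a new column for the least-squares fit), the training error is non-increasing; the surveyed algorithms differ mainly in when a unit is added, how its incoming weights are trained, and whether units can also be pruned.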
References
E. Alpaydin. Grow-and-learn: An incremental method for category learning. In Proc. INNC ’90, volume 2, pages 761–764, Dordrecht, The Netherlands, 1990. Kluwer.
T. Ash. Dynamic node creation in backpropagation networks. Connection Science, 1(4):365–375, 1989.
R. L. Barron. Learning networks improve computer-aided prediction and control. Computer Design, pages 65–70, August 1975.
E. B. Baum and K. J. Lang. Constructing hidden units using examples and queries. In R. P. Lippmann et al., editor, NIPS 3. Morgan Kaufmann, 1991.
I. Bellido and G. Fernández. Backpropagation growing networks: Towards local minima elimination. In A. Prieto, editor, Artificial Neural Networks; Proc. IWANN ’91, pages 130–135, Heidelberg, Germany, 1991. Springer.
B. Bonnlander and M. C. Mozer. Latticed RBF networks: An alternative to constructive methods. In NIPS 5, 1993.
G. Deffuant. Neural units recruitment algorithm for generation of decision trees. In Proc. IJCNN’90 - San Diego, volume I, pages 637–642, Ann Arbor, MI, 1990. IEEE Neural Networks Council; Edward Brothers.
J. Diederich. Connectionist recruitment learning. In Proc. of the 8th European Conference on A.I., pages 351–356, 1988.
S. E. Fahlman and C. Lebiere. The cascade-correlation learning architecture. Technical Report CMU-CS-90-100, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA, 1990.
S. E. Fahlman. The recurrent cascade-correlation architecture. Technical Report CMU-CS-91-100, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA, May 17 1991.
E. Fiesler. Partially connected ontogenic high order neural networks. Technical Report 92–02, IDIAP, Martigny, Switzerland, August 1992.
M. Frean. The upstart algorithm: A method for constructing and training feedforward neural networks. Neural Computation, 2(2):198–209, 1990.
B. Fritzke. Let it grow - self-organizing feature maps with problem dependent cell structure. In T. Kohonen et al., editors, Artificial Neural Networks (ICANN-91), volume 1, pages 403–408, Amsterdam, The Netherlands, 1991. North-Holland; Elsevier Science Publishing Company B.V.
B. Fritzke. Unsupervised clustering with growing cell structures. In Proc. IJCNN ’91 Seattle, volume II, pages 531–536, 1991.
B. Fritzke and P. Wilke. Flexmap - a neural network with linear time and space complexity for the traveling salesman problem. In Proc. IJCNN-91, 1991.
M. Golea and M. Marchand. A growth algorithm for neural network decision trees. EuroPhysics Letters, 12(3):205–210, 1990.
M. Hagiwara. Novel back propagation algorithm for reduction of hidden units and acceleration of convergence using artificial selection. In Proc. IJCNN ’90 San Diego, volume I, pages 625–630, Ann Arbor, MI, 1990. IEEE Neural Networks Council; Edward Brothers.
S. J. Hanson. Meiosis networks. In D. S. Touretzky, editor, NIPS 2, pages 533–541, San Mateo, CA, 1990. Morgan Kaufmann.
B. Hassibi and D. G. Stork. Second order derivatives for network pruning: Optimal brain surgeon. In NIPS 5, 1993.
Y. Hirose, K. Yamashita, and S. Hijaya. Back-propagation algorithm which varies the number of hidden units. Neural Networks, 4:61–66, 1991.
V. Honavar and L. Uhr. A network of neuron-like units that learns to perceive by generation as well as reweighting of its links. In D. Touretzky et al., editors, Proc. of the 1988 Connectionist Models Summer School, pages 472–484, San Mateo, California, 1988. Morgan Kaufmann.
V. Honavar and L. Uhr. Generative learning structures and processes for generalized connectionist networks. Connection Science, 1:139–159, 1989.
A. G. Ivakhnenko. The group method of data handling - a rival of stochastic approximation. Soviet Automatic Control, 1:43–55, 1968.
V. Kadirkamanathan and M. Niranjan. Application of an architecturally dynamic network for speech pattern classification. In Proc. of the Institute of Acoustics, volume 14, 1992.
V. Kadirkamanathan and M. Niranjan. A function estimation approach to sequential learning with neural networks. Technical Report CUED/F-INFENG/TR.111, Cambridge University Engineering Department, Cambridge, UK, September 13 1992.
J. Klotz and E. Fiesler. Ontogenic neural networks. Expert Systems (to be submitted).
A. Lansner and Ö. Ekeberg. An associative network solving the “4-bit ADDER problem”. In M. Caudill and C. Butler, editors, Proc. ICNN, volume II, pages 549–556, San Diego, CA, 1987. SOS Printing.
Y. Le Cun, J. S. Denker, and S. A. Solla. Optimal brain damage. In D. S. Touretzky, editor, NIPS 2, pages 598–605, San Mateo, CA, 1990. Morgan Kaufmann.
M. Mézard and J.-P. Nadal. Learning in feedforward layered networks: The tiling algorithm. J. of Physics: A, 22(12):2191–2203, 1989.
J.-P. Nadal. Study of a growth algorithm for a feedforward neural network. Int. J. of Neural Systems, 1(1):55–59, 1989.
D. L. Reilly, L. N. Cooper, and C. Elbaum. A neural model for category learning. Biological Cybernetics, 45:35–41, 1982.
M. Ring. Sequence learning with incremental high-order neural networks. Technical Report AI 93-193, Dept. of Computer Sc., University of Texas at Austin, January 1993.
P. Ruján and M. Marchand. Learning by minimizing resources in neural networks. Complex Systems, 3:229–241, 1989.
J. A. Sirat and J. P. Nadal. Neural trees: A new tool for classification. Network: Computation in Neural Systems, 1:423–438, 1990.
M. F. Tenorio and W.-T. Lee. Self organizing neural networks for the identification problem. In D. S. Touretzky, editor, NIPS 1, pages 57–64, San Mateo, CA, 1989. Morgan Kaufmann.
H. H. Thodberg. Improving generalization of neural networks through pruning. Int. J. of Neural Systems, 1(4):317–326, 1991.
W. X. Wen, H. Liu, and A. Jennings. Self-generating neural networks. In Proc. IJCNN - Baltimore, 1992.
R. Zollner, H. J. Schmitz, F. Wünsch, and U. Krey. Fast generating algorithm for a general three-layer perceptron. Neural Networks, 5(5):771–777, September-October 1992.
© 1994 Springer-Verlag London Limited
Fiesler, E. (1994). Comparative Bibliography of Ontogenic Neural Networks. In: Marinaro, M., Morasso, P.G. (eds) ICANN ’94. Springer, London. https://doi.org/10.1007/978-1-4471-2097-1_188
Print ISBN: 978-3-540-19887-1
Online ISBN: 978-1-4471-2097-1