Abstract
Up to this point we have considered only the saturated linear activation function. In this chapter, we investigate the computational power of networks with sigmoid activation functions, such as those widely considered in the neural network literature. In Chapter 10 we will see that a large class of activation functions, which also includes the sigmoid, yields networks whose computational power is bounded from above by P/poly. In this chapter we obtain a lower bound on the computational power of sigmoidal networks: we prove that there exists a universal architecture of sigmoidal neurons that can compute any recursive function, albeit with exponential slowdown. Our proof techniques apply to a much more general class of "sigmoidal-like" activation functions, suggesting that Turing universality is a common property of recurrent neural network models. In conclusion, the computational capabilities of sigmoidal networks lie between those of Turing machines and advice Turing machines.
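To make the object of study concrete, the following is a minimal sketch of one synchronous update of a recurrent network of sigmoidal neurons, using the standard logistic sigmoid. The function names, the dense-weight representation, and the specific update rule shown here are illustrative assumptions, not the chapter's construction.

```python
import math

def sigmoid(z):
    # Standard logistic sigmoid, a representative member of the
    # "sigmoidal-like" class of activation functions discussed here.
    return 1.0 / (1.0 + math.exp(-z))

def step(state, inp, W, W_in, b):
    # One synchronous update of a recurrent sigmoidal network:
    #   x_i(t+1) = sigmoid( sum_j W[i][j]*x_j(t)
    #                     + sum_k W_in[i][k]*u_k(t) + b[i] )
    # 'state' is the neuron activation vector x(t), 'inp' the input u(t).
    n = len(state)
    new_state = []
    for i in range(n):
        z = b[i]
        z += sum(W[i][j] * state[j] for j in range(n))
        z += sum(W_in[i][k] * inp[k] for k in range(len(inp)))
        new_state.append(sigmoid(z))
    return new_state

# Tiny illustrative run: two neurons, one input line, hypothetical weights.
x = [0.0, 0.0]
for u in ([1.0], [0.0], [1.0]):
    x = step(x, u, W=[[0.5, -0.3], [0.2, 0.4]],
             W_in=[[1.0], [-1.0]], b=[0.0, 0.1])
```

Because the sigmoid never actually reaches its saturation values 0 and 1, simulating discrete Turing-machine steps with such continuous updates is delicate, which is one intuition for the exponential slowdown in the universality result.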
Copyright information
© 1999 Springer Science+Business Media New York
Cite this chapter
Siegelmann, H.T. (1999). Universality of Sigmoidal Networks. In: Neural Networks and Analog Computation. Progress in Theoretical Computer Science. Birkhäuser, Boston, MA. https://doi.org/10.1007/978-1-4612-0707-8_7
DOI: https://doi.org/10.1007/978-1-4612-0707-8_7
Publisher Name: Birkhäuser, Boston, MA
Print ISBN: 978-1-4612-6875-8
Online ISBN: 978-1-4612-0707-8
eBook Packages: Springer Book Archive