ALT 1993: Algorithmic Learning Theory pp 251-264

# On the VC-dimension of depth four threshold circuits and the complexity of Boolean-valued functions

Akito Sakurai

Part of the Lecture Notes in Computer Science book series (LNCS, volume 744)

## Abstract

We consider the problem of determining the VC-dimension $\partial_3(h)$ of depth-four, $n$-input, 1-output threshold circuits with $h$ elements. We prove the best known asymptotic upper and lower bounds: as $h \to \infty$, $\partial_3(h)$ is upper bounded by $\left(\frac{h^2}{3} + nh\right)(\log h)(1 + o(1))$ and lower bounded by $\frac{1}{2}\left(\frac{h^2}{4} + nh\right)(\log h)(1 - o(1))$. We also consider the problem of determining the complexity $c_3(N)$ of Boolean-valued functions defined on $N$-point sets in $\mathbf{R}^n$, measured by the number of threshold elements with which we can construct a depth-four circuit realizing the function. Here too we show the best known upper and lower bounds: as $N \to \infty$, the complexity is upper bounded by $\sqrt{16\,(N/\log N)(1 + o(1)) + 4n^2} - 2n$ and lower bounded by $\sqrt{6\,(N/\log N)(1 + o(1)) + (9/4)\,n^2} - (3/2)\,n$.
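To make the gap between the stated bounds concrete, here is a minimal numeric sketch. It is not from the paper: the function names are invented, logarithms are taken base 2, and the $1 \pm o(1)$ correction factors are dropped, so the values are only leading-order approximations of the bound expressions.

```python
import math

def vc_upper(h, n):
    # Leading term of the upper bound on the VC-dimension of a
    # depth-four circuit with h elements: ((h^2/3) + n*h) * log h
    return (h**2 / 3 + n * h) * math.log2(h)

def vc_lower(h, n):
    # Leading term of the lower bound: (1/2) * ((h^2/4) + n*h) * log h
    return 0.5 * (h**2 / 4 + n * h) * math.log2(h)

def c_upper(N, n):
    # Leading term of the upper bound on the circuit complexity of a
    # Boolean-valued function on an N-point set in R^n:
    # sqrt(16*(N/log N) + 4*n^2) - 2*n
    return math.sqrt(16 * N / math.log2(N) + 4 * n**2) - 2 * n

def c_lower(N, n):
    # Leading term of the lower bound:
    # sqrt(6*(N/log N) + (9/4)*n^2) - (3/2)*n
    return math.sqrt(6 * N / math.log2(N) + 2.25 * n**2) - 1.5 * n

if __name__ == "__main__":
    n = 10
    for h in (10, 100, 1000):
        print(f"h={h}: {vc_lower(h, n):.0f} <= VC-dim <= {vc_upper(h, n):.0f}")
    for N in (10**4, 10**6):
        print(f"N={N}: {c_lower(N, n):.1f} <= c_3(N) <= {c_upper(N, n):.1f}")
```

For fixed $n$, the ratio of the VC-dimension bounds tends to $8/3$ as $h \to \infty$, so the two bounds match up to a constant factor; similarly, both complexity bounds grow like $\Theta(\sqrt{N/\log N})$.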
