
PNS modules for the synthesis of parallel self-organizing hierarchical neural networks

Published in: Circuits, Systems and Signal Processing

Abstract

The PNS module is discussed as the building block for the synthesis of parallel, self-organizing, hierarchical neural networks (PSHNNs). The PNS module contains three submodules (units): the first two are simple neural network constructs, and the third is a statistical unit. The first two units are fractile in nature, meaning that each such unit may itself consist of a number of parallel PNS modules. Through a mechanism of statistical acceptance or rejection of input vectors for classification, the sample space is divided into a number of regions, and the input vectors belonging to each region are classified by a dedicated set of PNS modules. This strategy yields considerably higher classification accuracy and better generalization than previous neural network models. If the delta rule network is used to generate the first two units, each region approximates a linearly separable region, so the total system becomes similar to a piecewise linear model. The various regions are determined nonlinearly by the first and third units of the PNS modules.
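The accept/reject routing described in the abstract can be sketched in code. The sketch below is an illustrative reconstruction, not the paper's implementation: the class names (`DeltaRuleUnit`, `PNSStage`), the margin-based acceptance test, and the threshold `tau` are all assumptions standing in for the paper's statistical unit. It shows the core idea: a delta-rule (LMS) linear unit classifies each input, a simple statistical test accepts only inputs it classifies confidently, and rejected inputs are routed to the next stage, so each stage handles an approximately linearly separable region.

```python
import numpy as np

class DeltaRuleUnit:
    """Single-layer linear classifier trained with the delta (LMS) rule."""
    def __init__(self, n_in, n_out, lr=0.05, epochs=200, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(n_in, n_out))
        self.lr, self.epochs = lr, epochs

    def fit(self, X, Y):
        for _ in range(self.epochs):
            err = Y - X @ self.W              # delta rule: W += lr * X^T (Y - XW)
            self.W += self.lr * X.T @ err / len(X)
        return self

    def scores(self, X):
        return X @ self.W

class PNSStage:
    """One stage: a delta-rule unit plus a simple statistical accept test.

    Inputs whose output margin (top score minus runner-up) exceeds `tau`
    are accepted and classified here; the rest are rejected and passed on.
    The margin test is a stand-in for the paper's statistical unit.
    """
    def __init__(self, n_in, n_out, tau=0.3):
        self.net = DeltaRuleUnit(n_in, n_out)
        self.tau = tau

    def fit(self, X, Y):
        self.net.fit(X, Y)
        return self

    def route(self, X):
        s = self.net.scores(X)
        top2 = np.sort(s, axis=1)[:, -2:]
        margin = top2[:, 1] - top2[:, 0]      # confidence proxy
        return margin >= self.tau, s.argmax(axis=1)

def pshnn_predict(stages, X):
    """Route each input through successive stages until some stage accepts it;
    the final stage classifies everything that remains."""
    pred = np.full(len(X), -1)
    pending = np.arange(len(X))
    for i, stage in enumerate(stages):
        accept, labels = stage.route(X[pending])
        if i == len(stages) - 1:
            accept[:] = True                  # last stage takes the remainder
        pred[pending[accept]] = labels[accept]
        pending = pending[~accept]
        if len(pending) == 0:
            break
    return pred
```

In the paper's scheme each later stage is trained only on the vectors rejected by earlier stages, which is what carves the sample space into regions; in this sketch the training loop is omitted for brevity, and each stage would be fit on the residual (rejected) subset of the training data.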




Cite this article

Valafar, F., Ersoy, O.K. PNS modules for the synthesis of parallel self-organizing hierarchical neural networks. Circuits, Systems and Signal Processing 15, 23–50 (1996). https://doi.org/10.1007/BF01187692
