Journal of Mathematical Imaging and Vision, Volume 4, Issue 1, pp 81–110

Computational concepts in classification: Neural networks, statistical pattern recognition, and model-based vision

  • Leonid I. Perlovsky

Abstract

A large number of algorithms have been developed for classification and recognition. These algorithms can be divided into three major paradigms: statistical pattern recognition, neural networks, and model-based vision. Neural networks embody an especially rich field of approaches based on a variety of architectures, learning mechanisms, biological and algorithmic motivations, and application areas. Mathematical analysis of these approaches and paradigms reveals that there are only a few computational concepts permeating all the diverse approaches and serving as a basis for all paradigms and algorithms for classification and recognition.
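
To make the idea of a shared computational concept concrete, here is a minimal sketch in Python (an illustrative assumption, not an algorithm taken from the paper): the nearest-prototype rule appears, in different guises, as template matching in statistical pattern recognition, as prototype or exemplar layers in neural networks, and as model matching in model-based vision.

    import numpy as np

    def nearest_prototype(x, prototypes, labels):
        """Classify x by the label of its closest stored prototype."""
        # One computational concept in many guises: compare an input
        # against stored references and pick the best match.
        dists = np.linalg.norm(prototypes - x, axis=1)  # Euclidean distances
        return labels[np.argmin(dists)]

    # Usage: two labeled prototypes in a 2-D feature space
    prototypes = np.array([[0.0, 0.0], [1.0, 1.0]])
    labels = np.array([0, 1])
    print(nearest_prototype(np.array([0.2, 0.1]), prototypes, labels))  # -> 0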

These basic computational concepts are reviewed in this paper with the purposes of (i) providing a mathematical continuity to seemingly disparate techniques, (ii) establishing basic mathematical limitations on applicability of existing techniques, (iii) discerning fundamental questions facing the classification field, and (iv) searching for directions in which answers to these questions may be found.
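
As an example of the kind of mathematical limitation meant in (ii), consider a standard parameter-counting argument (assumed here for illustration, not quoted from the paper): a maximum-likelihood Gaussian classifier estimates, per class, a d-dimensional mean and a symmetric d-by-d covariance matrix, so its free-parameter count, and with it the training-sample requirement, grows quadratically with feature dimension.

    # Parameter count of a Gaussian maximum-likelihood classifier
    # (illustrative sketch): per class, a mean (d parameters) plus a
    # symmetric covariance matrix (d * (d + 1) / 2 parameters).
    def gaussian_param_count(d, n_classes):
        per_class = d + d * (d + 1) // 2
        return n_classes * per_class

    for d in (4, 16, 64):
        print(d, gaussian_param_count(d, n_classes=2))
    # prints: 4 28 / 16 304 / 64 4288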

Key words

neural networks, model-based vision, training requirements, cognition, vision



Copyright information

© Kluwer Academic Publishers 1994

Authors and Affiliations

  • Leonid I. Perlovsky, Nichols Research Corporation, Wakefield, USA
