Associative structures for vision

  • Davide Anguita
  • Giancarlo Parodi
  • Rodolfo Zunino
Article

Abstract

The paper describes how associative techniques can contribute to reducing data dimensionality for image understanding. Associative memories enhance a vision system's robustness through their pattern-completion capability; neural networks compensate for the memory's limited storage capacity and provide adaptive nonlinear filtering to remove crosstalk noise. Two associative schemata are defined: one for visual classification and one for stimulus response. The major concern is to preserve general applicability while ensuring practical effectiveness. Both a theoretical discussion and experimental evidence support the satisfactory results obtained.
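The pattern-completion behavior described above can be illustrated with a toy example. The sketch below is hypothetical and is not the authors' scheme: it uses a standard Hopfield-style correlation memory (Hebbian outer-product storage, threshold recall) with illustrative sizes, simply to show a corrupted cue being completed back toward a stored pattern.

```python
import numpy as np

# Minimal pattern-completion sketch: store a few bipolar (+1/-1) patterns
# in a Hopfield-style associative memory and recall one from a noisy cue.
# All dimensions and counts here are illustrative assumptions.
rng = np.random.default_rng(0)
N = 64                                         # pattern dimensionality
patterns = rng.choice([-1, 1], size=(3, N))    # hypothetical stored "images"

# Hebbian storage: W = (1/N) * sum of outer products, zero self-connections.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)
W /= N

def recall(cue, steps=20):
    """Iterate synchronous updates until the state stabilizes."""
    s = cue.copy()
    for _ in range(steps):
        s_new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Corrupt ~14% of one stored pattern; recall should restore most bits.
cue = patterns[0].copy()
flipped = rng.choice(N, size=9, replace=False)
cue[flipped] *= -1
restored = recall(cue)
print("overlap with stored pattern:", float(np.mean(restored == patterns[0])))
```

At low loading (few patterns relative to N) recall is nearly perfect; as more patterns are stored, crosstalk between them corrupts recall, which is exactly the limitation the paper's neural filtering is said to address.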

Key Words

Robust computer vision · associative memories · neural networks · integrative associative structures · stimulus response


References

  1. A.V. Oppenheim and R.W. Schafer, Digital Signal Processing, Englewood Cliffs, NJ: Prentice-Hall, 1975.
  2. C. Braccini and G. Gambardella, “Form-Invariant Linear Filtering: Theory and Applications,” IEEE Trans. Acoust., Speech, and Sig. Proc., vol. ASSP-34, no. 6, 1986, pp. 1612–1628.
  3. H. Wechsler, Computational Vision, New York: Academic Press, 1990.
  4. R.A. Brooks, “Model-Based Three-Dimensional Interpretation of Two-Dimensional Images,” IEEE Trans. Pattern Anal. Mach. Intell., vol. PAMI-5, no. 2, March 1983, pp. 140–150.
  5. G. Vernazza, S. Serpico, and S. Dellepiane, “A Knowledge-Based System for Biomedical Image Processing and Recognition,” IEEE Trans. on Circuits and Systems, vol. CAS-34, no. 1, 1987, pp. 1399–1416.
  6. W. Forstner and S. Ruwiedel (Eds.), Robust Computer Vision, Bonn: Wichmann Press, 1992.
  7. C.G.Y. Lau and B. Widrow, “Special Issue on Neural Networks I: Theory and Modeling,” Proc. IEEE, vol. 78, no. 9, September 1990.
  8. C.G.Y. Lau and B. Widrow, “Special Issue on Neural Networks II: Analysis, Techniques, and Applications,” Proc. IEEE, vol. 78, no. 10, October 1990.
  9. D.E. Rumelhart and J.L. McClelland (Eds.), Parallel Distributed Processing—Explorations in the Microstructure of Cognition, vols. 1–2, Cambridge, MA: MIT Press, 1986.
  10. R.P. Lippmann, “An Introduction to Computing with Neural Nets,” IEEE ASSP Mag., April 1987, pp. 4–22.
  11. S. Grossberg (Ed.), Neural Networks and Natural Intelligence, Cambridge, MA: MIT Press, 1988.
  12. G.A. Carpenter and S. Grossberg, “A Massively Parallel Architecture for a Self-Organizing Neural Pattern Recognition Machine,” Comp. Vision, Graphics, and Image Processing, vol. 37, 1987, pp. 54–115.
  13. G.A. Carpenter and S. Grossberg, “The ART of Adaptive Pattern Recognition by a Self-Organizing Neural Network,” IEEE Computer, vol. 21, no. 3, March 1988, pp. 77–88.
  14. E.B. Baum and D. Haussler, “What Size Net Gives Valid Generalization?” Neural Computation, no. 1, 1989, pp. 151–160.
  15. V.N. Vapnik and A.Ya. Chervonenkis, “On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities,” Th. Prob. its Appl., vol. 17, no. 2, 1971, pp. 264–280.
  16. S. Knerr, L. Personnaz, and G. Dreyfus, “A New Approach to the Design of Neural Network Classifiers and Its Application to the Automatic Recognition of Handwritten Digits,” Proc. Int. Joint Conf. on Neural Networks IJCNN'91, Seattle, WA, 1991.
  17. Y. Le Cun, B. Boser, J.S. Denker, et al., “Handwritten Digit Recognition with a Backpropagation Network,” in D. Touretsky (Ed.), Neural Information Processing Systems, vol. 2, Morgan Kaufmann, 1990.
  18. M. Meng and A.C. Kak, “Fast Vision-Guided Mobile Robot Navigation Using Neural Networks,” Proc. IEEE Int. Conf. on Sys., Man, and Cybern., Chicago, IL: IEEE Press, October 1992, pp. 111–116.
  19. S.J. Perantonis and P.J.G. Lisboa, “Translation, Rotation, and Scale Invariant Pattern Recognition by High-Order Neural Networks and Moment Classifiers,” IEEE Trans. on Neural Networks, vol. 3, no. 2, March 1992, pp. 241–251.
  20. K. Fukushima, S. Miyake, and T. Ito, “Neocognitron: A Neural Network Model for a Mechanism of Visual Pattern Recognition,” IEEE Trans. on Sys., Man, and Cybern., vol. SMC-13, no. 5, September/October 1983, pp. 826–834.
  21. K. Fukushima and N. Wake, “Improved Neocognitron with Bend-Detecting Cells,” Proc. Int. Joint Conf. on Neural Networks IJCNN'92, Baltimore, MD: IEEE Press, June 1992.
  22. D.A. Pomerleau, “Efficient Training of Artificial Neural Networks for Autonomous Navigation,” Neural Computation, no. 3, 1991, pp. 88–97.
  23. G.E. Hinton and J.A. Anderson (Eds.), Parallel Models of Associative Memory (updated edition), New York: Lawrence Erlbaum, 1989.
  24. T. Kohonen, “Correlation Matrix Memories,” IEEE Trans. Comput., vol. C-21, 1972, pp. 353–359.
  25. T. Kohonen, E. Oja, and P. Lehtio, “Storage and Processing of Information in Distributed Associative Memory Systems,” in G.E. Hinton and J.A. Anderson (Eds.), Parallel Models of Associative Memory (updated edition), New York: Lawrence Erlbaum, 1989.
  26. T. Kohonen, Self-Organization and Associative Memory, 3rd ed., Berlin and New York: Springer-Verlag, 1990.
  27. J.J. Hopfield, “Neural Networks and Physical Systems with Emergent Collective Computational Abilities,” Proc. Natl. Acad. Science (USA), vol. 79, 1982, pp. 2554–2558.
  28. J.J. Hopfield, “Neurons with Graded Response Have Collective Computational Properties Like Those of Two-State Neurons,” Proc. Natl. Acad. Science (USA), vol. 81, 1984, pp. 3088–3092.
  29. P. Kanerva, Sparse Distributed Memory, Cambridge, MA: MIT Press, 1988.
  30. S.S. Yau, “Pattern Recognition by Using an Associative Memory,” IEEE Trans. Comp., 1966, pp. 944–947.
  31. N.M. Nasrabadi and W. Li, “Object Recognition by a Hopfield Neural Net,” IEEE Trans. on Sys., Man, and Cybern., vol. 21, no. 6, November/December 1991, pp. 1523–1535.
  32. W. Polzleitner and H. Wechsler, “Selective and Focused Invariant Recognition Using Distributed Associative Memories (DAM),” IEEE Trans. Pattern Anal. Mach. Intell., vol. PAMI-12, no. 8, 1990, pp. 809–814.
  33. H. Wechsler and J.L. Zimmermann, “2-D Invariant Object Recognition Using Distributed Associative Memory,” IEEE Trans. Pattern Anal. Mach. Intell., vol. PAMI-10, no. 6, 1988, pp. 811–821.
  34. H. Wechsler and J.L. Zimmermann, “Distributed Associative Memory (DAM) for Bin-Picking,” IEEE Trans. Pattern Anal. Mach. Intell., vol. PAMI-11, no. 8, August 1989, pp. 814–822.
  35. R.C. Nelson, “Visual Homing Using an Associative Memory,” Proc. Image Understanding Workshop, Los Altos, CA: Morgan Kaufmann, 1989, pp. 245–262.
  36. H.S. Hong and S.S. Chen, “Character Recognition in a Sparse Distributed Memory,” IEEE Trans. on Sys., Man, and Cybern., vol. 21, no. 3, 1992, pp. 674–678.
  37. S. Bottini, “An Algebraic Model of an Associative Noise-like Coding Memory,” Biological Cybernetics, no. 36, 1980, pp. 221–228.
  38. A. Diaspro, G.C. Parodi, and R. Zunino, “Classification of Optically-Sectioned Images of Biopolymers by Means of Associative Noise-like Coding Memories,” Studia Biophysica, vol. 139, no. 2, 1990, pp. 69–76.
  39. V. Murino, C. Regazzoni, R. Zunino, and G. Foresti, “Associative and Symbolic Algorithms for Viewpoint-Independent Object Recognition,” Proc. IEEE Int. Conf. on Sys., Man, and Cybern. SMC-91, Charlottesville, VA, October 1991, pp. 117–122.
  40. G.C. Parodi, C. Regazzoni, G. Vernazza, and R. Zunino, “Speeding up Scene Recognition by Using an Associative Noise-like Coding Memory,” Proc. IEEE Int. Phoenix Conf. on Comp. and Comm. IPCCC'91, Phoenix, AZ: IEEE Press, March 1991, pp. 10–16.
  41. C. Moneta, G. Vernazza, and R. Zunino, “On the Need for Integrated Approaches to Image Understanding,” European Trans. on Telecomm., vol. 3, no. 5, September/October 1992, pp. 41–54.
  42. D. Gabor, “Associative Holographic Memories,” IBM J. Res. Dev., vol. 13, 1969, pp. 156–159.
  43. A. Borsellino and T. Poggio, “Convolution and Correlation Algebras,” Kybernetik, vol. 10, 1973, pp. 113–122.
  44. S. Bottini, “An After-Shannon Measure of the Storage Capacity of an Associative Noise-like Coding Memory,” Biological Cybernetics, no. 59, 1988, pp. 151–159.
  45. G.C. Parodi, S. Ridella, and R. Zunino, “Using Chaos to Generate Keys for Associative Noise-like Memories,” Neural Networks, vol. 6, 1993, in press.
  46. G.C. Parodi, S. Ridella, and R. Zunino, “Distributed Key-Generation Structures for Associative Image Classification,” Proc. IEEE Symp. on Circuits and Systems ISCAS'92, San Diego, CA: IEEE Press, June 1992, pp. 1549–1552.
  47. D. Anguita, G.C. Parodi, D. Ponta, and R. Zunino, “Transputer-Based Architectures for Associative Image Classification,” Proc. IEEE Symp. on Parall. and Distr. Proc. SPDP'91, Dallas, TX: IEEE Press, December 1991, pp. 241–248.
  48. G. Cybenko, “Approximation by Superpositions of a Sigmoidal Function,” Math. Control Signals and Sys., vol. 2, 1989, p. 303.
  49. D.E. Rumelhart, G.E. Hinton, and R.J. Williams, “Learning Internal Representations by Error Propagation,” in D.E. Rumelhart and J.L. McClelland (Eds.), Parallel Distributed Processing—Explorations in the Microstructure of Cognition, vols. 1–2, Cambridge, MA: MIT Press, 1986.
  50. T. Tollenaere, “SuperSAB: Fast Adaptive Back Propagation with Good Scaling Properties,” Neural Networks, vol. 3, 1990, pp. 561–573.
  51. B. Widrow and M.A. Lehr, “30 Years of Adaptive Neural Networks: Perceptron, Madaline, and Backpropagation,” Proc. of the IEEE, vol. 78, no. 9, September 1990, pp. 1415–1442.
  52. L. Holmstrom and P. Koistinen, “Using Additive Noise in Back-propagation Training,” IEEE Trans. on Neural Networks, vol. 3, no. 1, January 1992, pp. 24–38.
  53. K. Matsuoka, “Noise Injection into Inputs in Back-propagation Learning,” IEEE Trans. on Sys., Man, and Cybern., vol. 22, no. 3, May/June 1992, pp. 436–440.

Copyright information

© Kluwer Academic Publishers 1994

Authors and Affiliations

  • Davide Anguita
  • Giancarlo Parodi
  • Rodolfo Zunino
  1. DIBE—Department of Biophysical and Electronic Engineering, University of Genoa, Italy