Learning translation invariant recognition in massively parallel networks

  • Geoffrey E. Hinton
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 258)


One major goal of research on massively parallel networks of neuron-like processing elements is to discover efficient methods for recognizing patterns. Another goal is to discover general learning procedures that allow networks to construct the internal representations that are required for complex tasks. This paper describes a recently developed procedure that can learn to perform a recognition task. The network is trained on examples in which the input vector represents an instance of a pattern in a particular position and the required output vector represents its name. After prolonged training, the network develops canonical internal representations of the patterns and it uses these canonical representations to identify familiar patterns in novel positions.
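The training setup described above — input vectors encoding a pattern at a particular position, output vectors encoding its name, weights adjusted over many examples — can be sketched with a small back-propagation network. This is a minimal illustrative sketch, not the paper's actual architecture: the two 3-pixel templates, the 8-pixel input, the hidden-layer size, and the learning rate are all assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical 3-pixel "patterns", each presented at 6 positions
# within an 8-pixel input vector (illustrative data, not from the paper).
templates = {0: np.array([1.0, 0.0, 1.0]), 1: np.array([1.0, 1.0, 0.0])}
X, Y = [], []
for label, tpl in templates.items():
    for pos in range(6):
        x = np.zeros(8)
        x[pos:pos + 3] = tpl          # place the pattern at this position
        y = np.zeros(2)
        y[label] = 1.0                # the required output names the pattern
        X.append(x)
        Y.append(y)
X, Y = np.array(X), np.array(Y)

# One layer of hidden sigmoid units between input and output units,
# trained by back-propagating the squared error.
W1 = rng.normal(0, 0.5, (8, 12)); b1 = np.zeros(12)
W2 = rng.normal(0, 0.5, (12, 2)); b2 = np.zeros(2)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(3000):
    H = sig(X @ W1 + b1)              # hidden-unit activations
    O = sig(H @ W2 + b2)              # output-unit activations
    dO = (O - Y) * O * (1 - O)        # error signal at the output units
    dH = (dO @ W2.T) * H * (1 - H)    # error propagated back to hidden units
    W2 -= 0.5 * H.T @ dO; b2 -= 0.5 * dO.sum(0)
    W1 -= 0.5 * X.T @ dH; b1 -= 0.5 * dH.sum(0)

pred = sig(sig(X @ W1 + b1) @ W2 + b2).argmax(1)
acc = (pred == Y.argmax(1)).mean()
print(f"training accuracy: {acc:.2f}")
```

After prolonged training the hidden units come to encode the patterns in a position-dependent way on this toy task; the paper's contribution is the harder step of developing canonical, position-independent internal representations that transfer to novel positions.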


Keywords: Input Vector · Least Mean Square · Learning Procedure · Hidden Unit · Output Unit
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.





Copyright information

© Springer-Verlag Berlin Heidelberg 1987

Authors and Affiliations

  • Geoffrey E. Hinton
  1. Computer Science Department, Carnegie-Mellon University, Pittsburgh, USA
