IAPR Workshop on Artificial Neural Networks in Pattern Recognition

ANNPR 2012: Artificial Neural Networks in Pattern Recognition, pp. 24–35

Incremental Learning by Message Passing in Hierarchical Temporal Memory


  • Davide Maltoni &
  • Erik M. Rehn
  • Conference paper

Part of the Lecture Notes in Computer Science book series (LNAI, volume 7477)

Abstract

Hierarchical Temporal Memory (HTM) is a biologically-inspired framework that can be used to learn invariant representations of patterns. Classical HTM learning is mainly unsupervised, and once training is completed the network structure is frozen, making further training quite critical. In this paper we develop a novel technique for incremental, supervised HTM learning based on error minimization. We prove that error backpropagation can be naturally and elegantly implemented through native HTM message passing based on Belief Propagation. Our experimental results show that a two-stage training procedure, unsupervised pre-training followed by supervised refinement, is very effective. This is in line with recent findings on other deep architectures.
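The two-stage scheme the abstract describes (unsupervised pre-training followed by supervised refinement by error minimization) can be illustrated with a minimal sketch. The code below is not the authors' HTM/Belief Propagation implementation; it is a generic toy example, assuming a tiny linear autoencoder for the unsupervised stage and a logistic head refined by gradient descent for the supervised stage, with synthetic data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs in 2-D with labels 0/1 (synthetic).
X = np.vstack([rng.normal(-1.0, 0.3, (50, 2)), rng.normal(1.0, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Stage 1 -- unsupervised pre-training: a linear autoencoder learns a
# hidden representation of the inputs without looking at the labels.
W_enc = rng.normal(0.0, 0.1, (2, 2))
W_dec = rng.normal(0.0, 0.1, (2, 2))
for _ in range(300):
    H = X @ W_enc                        # encode
    E = H @ W_dec - X                    # reconstruction error
    g_dec = H.T @ E / len(X)             # squared-error gradient w.r.t. decoder
    g_enc = X.T @ (E @ W_dec.T) / len(X) # squared-error gradient w.r.t. encoder
    W_dec -= 0.1 * g_dec
    W_enc -= 0.1 * g_enc

# Stage 2 -- supervised refinement: a logistic head on the pre-trained
# features is tuned by error (cross-entropy) minimization.
w, b = np.zeros(2), 0.0
H = X @ W_enc
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(H @ w + b)))   # predicted P(y = 1)
    w -= 0.1 * H.T @ (p - y) / len(X)
    b -= 0.1 * float(np.mean(p - y))

accuracy = float(np.mean((p > 0.5) == y))
```

In the paper itself the supervised stage reuses the network's own message-passing machinery to propagate the error signal, rather than a separate classifier head as in this sketch.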

Keywords

  • HTM
  • Deep architectures
  • Backpropagation
  • Incremental learning



Author information

Authors and Affiliations

  1. Biometric System Laboratory, DEIS, University of Bologna, Italy

    Davide Maltoni

  2. Bernstein Center for Computational Neuroscience, Berlin, Germany

    Erik M. Rehn


Editor information

Editors and Affiliations

  1. Fondazione Bruno Kessler (FBK), 38123, Trento, Italy

    Nadia Mana

  2. Institute of Neural Information Processing, University of Ulm, 89069, Ulm, Germany

    Friedhelm Schwenker

  3. Dipartimento di Ingegneria dell’Informazione, Università di Siena, 53100, Siena, Italy

    Edmondo Trentin


Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Maltoni, D., Rehn, E.M. (2012). Incremental Learning by Message Passing in Hierarchical Temporal Memory. In: Mana, N., Schwenker, F., Trentin, E. (eds) Artificial Neural Networks in Pattern Recognition. ANNPR 2012. Lecture Notes in Computer Science, vol 7477. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33212-8_3

  • DOI: https://doi.org/10.1007/978-3-642-33212-8_3

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33211-1

  • Online ISBN: 978-3-642-33212-8

  • eBook Packages: Computer Science, Computer Science (R0)


Published in cooperation with The International Association for Pattern Recognition (http://www.iapr.org/)
