Natural Computing, Volume 5, Issue 1, pp 15–42

Finite Memory Loading in Hairy Neurons

Abstract

This paper presents a method to expand the basins of attraction of stable patterns in associative memory. It examines fully-connected associative memory geometrically and translates the learning process into an algebraic optimization procedure. It finds that locating all the patterns at certain stable corners of the neurons' hypercube, as far from the decision hyperplanes as possible, produces excellent error tolerance. It then devises a method based on this finding to develop the hyperplanes. The paper further shows that this method leads to the hairy model, the deterministic analogue of the Gibbs free energy model. Simulations show that the method gives better error tolerance than both the Hopfield model and the error-correction rule, in both synchronous and asynchronous modes.
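As a point of reference for the Hopfield baseline the abstract compares against, a minimal sketch of a Hebbian associative memory is shown below. This is not the paper's hairy model or its optimization procedure; the patterns and function names are illustrative, and the example only demonstrates the basin-of-attraction idea: a pattern corrupted by a bit flip is pulled back to the stored corner of the hypercube.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian outer-product rule -- the classic Hopfield baseline.
    `patterns` is an array whose rows are +/-1 vectors."""
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns) / n
    np.fill_diagonal(W, 0.0)  # no self-coupling
    return W

def recall(W, x, steps=20):
    """Synchronous sign updates until a fixed point or the step budget."""
    x = x.copy()
    for _ in range(steps):
        nxt = np.where(W @ x >= 0, 1, -1)
        if np.array_equal(nxt, x):
            break
        x = nxt
    return x

# Two orthogonal 8-neuron patterns; corrupting one bit of the first
# still leaves the probe inside its basin of attraction.
p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train_hopfield(np.vstack([p1, p2]))

probe = p1.copy()
probe[0] = -1                                 # flip one bit
print(np.array_equal(recall(W, probe), p1))   # → True
```

The paper's contribution is, roughly, to choose the weights so that the stored corners sit as far from the decision hyperplanes as possible, enlarging these basins beyond what the plain Hebbian rule above achieves.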

Keywords

associative memory · error-correction rule · Gibbs free energy · hairy model · Hopfield network · Little model · music perception · neural network · spin glass model

Abbreviations

AM: associative memory
EAM: expanded associative memory
ECR: error-correction rule
LM: Little model
RK: Runge–Kutta method


Copyright information

© Springer 2006

Authors and Affiliations

  1. Department of Computer Science and Information Engineering, National Taiwan University, Taipei, ROC
  2. M.D., Health Center, National Taiwan Normal University, Taipei, ROC
