Long-Term Memory Neural Circuits, Fast and Precise

  • John Robert Burger
Chapter
Part of the Springer Series in Cognitive and Neural Systems book series (SSCNS, volume 6)

Abstract

This chapter derives a standard memory cell, that is, a cell designed to interface with other such cells to form a word of long-term memory. Each standard memory cell has its own individual memory element, together with the gates needed to control signals to and from that element.
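To make the cell's role concrete, here is a minimal sketch in Python rather than the chapter's circuit diagrams: one stored bit plus gated write, read, and match operations. The class and method names are illustrative assumptions, not the chapter's notation.

```python
# A hedged sketch of a "standard memory cell": a single memory element plus
# the gates that control signals to and from it.

class StandardMemoryCell:
    def __init__(self):
        self.stored_bit = False              # the cell's own memory element

    def write(self, enable: bool, value: bool) -> None:
        # The write gate passes 'value' into the memory element only when enabled.
        if enable:
            self.stored_bit = value

    def read(self, enable: bool) -> bool:
        # The read gate returns the element's content only when enabled.
        return self.stored_bit if enable else False

    def match(self, cue_applied: bool, cue_value: bool) -> bool:
        # Associative comparison: the cell matches when no cue is applied to it
        # ("don't care") or when the applied cue equals the stored bit.
        return (not cue_applied) or (cue_value == self.stored_bit)
```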

A word of long-term memory is assumed to be enabled by Rin and Lin (Right in and Left in) signals; an associative match is signaled by signals emerging at Rout and Lout (Right out and Left out). Cues or attributes are applied from the direction of conscious short-term memory (STM) via a cue editor, and if there are matches, the contents of the matching words are returned toward conscious STM via a recall editor.
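As a rough behavioral sketch of this word-level search, built on the StandardMemoryCell above: the word compares its cells against the applied cues only while both enable signals are high, and raises Rout and Lout on a full match. MemoryWord and its search signature are assumptions for illustration.

```python
class MemoryWord:
    def __init__(self, cells):
        self.cells = cells                   # list of StandardMemoryCell

    def search(self, r_in: bool, l_in: bool, cues):
        # 'cues' maps cell index -> cue bit; unlisted cells are "don't care".
        if not (r_in and l_in):              # the word is enabled only by Rin and Lin
            return False, False, None        # (Rout, Lout, recalled contents)
        hit = all(cell.match(i in cues, cues.get(i, False))
                  for i, cell in enumerate(self.cells))
        if hit:
            # A recall editor would route the full word back toward conscious STM.
            return True, True, [cell.read(True) for cell in self.cells]
        return False, False, None
```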

Multiple matches are common when cues are few in number, necessitating a system of multiple-match resolution. This is accomplished in part by using gates to block additional matches once a particular match is located. Memory search is then resumed with the same cues; to support multiple-match resolution, a simulated qubit in the form of a toggle is reset to zero in each word already returned, blocking those words from matching again.
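A minimal sketch of that resolution scheme, assuming each word carries a one-bit toggle that starts true and is cleared once the word has been returned (the function name and argument layout are illustrative):

```python
def resolve_one_match(words, cues, toggles):
    # toggles[i] is True while word i is still eligible; clearing it to False
    # plays the role of the toggle that blocks already-returned words.
    for i, word in enumerate(words):
        if not toggles[i]:
            continue                         # returned on an earlier pass; skip
        r_out, l_out, contents = word.search(True, True, cues)
        if r_out and l_out:
            toggles[i] = False               # block this word on later passes
            return contents                  # one match per pass
    return None                              # no further matches for these cues
```

Repeating the call with the same cues then steps through the remaining matching words until nothing is left to return.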

Neural circuits for reading long-term memory are also used to write, that is, to memorize images taken from conscious STM. Memorization is modeled as occurring automatically once certain conditions are met, for example, that the image is not already memorized and that it has appeared in consciousness a given number of times. A simplified filter circuit is suggested that triggers after an image appears only a couple of times, although in practice several rehearsals would be more realistic for memorization.
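A simplified version of such a filter can be sketched as a rehearsal counter; the threshold of two and the data structures are assumptions (the image is taken to be any hashable value, such as a tuple of bits):

```python
REHEARSAL_THRESHOLD = 2                      # illustrative "couple of times"

def memorization_filter(image, appearance_counts, already_memorized):
    # Count another conscious appearance of this image.
    appearance_counts[image] = appearance_counts.get(image, 0) + 1
    if image in already_memorized:
        return False                         # condition: not already memorized
    if appearance_counts[image] < REHEARSAL_THRESHOLD:
        return False                         # condition: not enough rehearsals yet
    already_memorized.add(image)
    return True                              # trigger an automatic write
```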

A multiwrite system is necessary to ensure that each newly memorized image goes into exactly one available blank word of long-term memory. This is accomplished below with a stack of long-term memory elements, all cleared to false except one, where the new memory will go. Once the memory is in place, that memory element is set to true and stays true indefinitely, just like the memory itself.
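One way to read this mechanism, sketched under the assumption that each word carries an occupancy flag that starts false and is permanently set true once the word is written:

```python
def multiwrite(words, occupied, new_image):
    # 'occupied' is the stack of long-term memory elements: False means blank.
    for i, taken in enumerate(occupied):
        if taken:
            continue
        for cell, bit in zip(words[i].cells, new_image):
            cell.write(True, bit)            # gate the image into the blank word
        occupied[i] = True                   # set to true; stays true indefinitely
        return i                             # the single word that received the image
    raise RuntimeError("no blank word of long-term memory is available")
```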

This chapter compares memorization to learning, with an eye to explaining the remarkable abilities of savants who apparently memorize huge amounts of information. Learning filters are explained for state-machine learning, in which sequences are recalled from subconscious long-term memory without having to pass through conscious STM. Savant memorization is proposed, in fact, to be a form of learning, accomplished with the aid of special learning filters. An example filter is given that permits subconscious learning with a very low count of rehearsals.
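A minimal sketch of such a learning filter, assuming sequences are stored as item-to-item state transitions once a very low rehearsal threshold is reached; the threshold of two and all names are illustrative, not the chapter's filter:

```python
LEARNING_THRESHOLD = 2                       # illustrative "very low count of rehearsals"

def learn_sequence(sequence, rehearsal_counts, transitions):
    # Count a rehearsal of the whole sequence; after the threshold, store each
    # item -> next-item link as a state transition.
    key = tuple(sequence)
    rehearsal_counts[key] = rehearsal_counts.get(key, 0) + 1
    if rehearsal_counts[key] >= LEARNING_THRESHOLD:
        for current, nxt in zip(sequence, sequence[1:]):
            transitions[current] = nxt

def replay(start, transitions):
    # Replay the learned sequence directly from the transition table,
    # without routing the items through conscious STM.
    item = start
    while item is not None:
        yield item
        item = transitions.get(item)
```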

Keywords

Associative Memory · Memory Search · Memory Element · False Recall · Magnetic Resonance Signal

Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • John Robert Burger
  1. Veneta, USA
