Long-Term Memory Neural Circuits, Fast and Precise
This chapter derives a standard memory cell, that is, a cell designed to interface with other such cells to form a word of long-term memory. A standard memory cell contains the gates needed to control signals to and from a memory element; each standard memory cell has its own individual memory element.
A word of long-term memory is assumed to be enabled by Rin and Lin (Right in and Left in) signals; an associative match is signaled by the emergence of a signal at Rout and Lout (Right out and Left out). Cues, or attributes, are applied from the direction of conscious short-term memory (STM) via a cue editor, and if there are matches, the contents of the matching words are returned toward conscious STM via a recall editor.
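The associative read just described can be sketched in software. The following Python stand-in is illustrative only (the class, attribute sets, and search function are assumptions, not the chapter's circuit): each memory word holds an image as a set of attributes, and every word whose attributes cover the applied cues returns its contents.

```python
# Hypothetical software stand-in for associative recall in a
# word-organized long-term memory: cues arriving from conscious STM
# are compared against every word; matching words return their contents.

class MemoryWord:
    def __init__(self, attributes):
        self.attributes = set(attributes)  # the stored image as a set of attributes

    def matches(self, cues):
        # A word matches when every applied cue is among its attributes.
        return set(cues) <= self.attributes

def associative_search(memory, cues):
    # Every word is interrogated with the same cues (in a circuit this
    # happens in parallel); contents of all matching words are returned.
    return [word.attributes for word in memory if word.matches(cues)]

memory = [MemoryWord({"red", "round", "apple"}),
          MemoryWord({"red", "square", "box"})]
result = associative_search(memory, {"red", "round"})  # matches the first word only
```

Note that fewer cues cover more words, which is why multiple matches become common and must be resolved, as discussed next.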
Multiple matches are common when cues are few in number, necessitating a system of multiple match resolution. This is accomplished in part by gates that block additional matches once a particular match has been located. The search is then resumed using the same cues; to support multiple match resolution, a simulated qubit in the form of a toggle is reset to zero in each word whose match has already been returned, blocking that word from matching again.
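One plausible software reading of this toggle scheme is sketched below (the names and the sequential scan are illustrative assumptions): each word carries a toggle that starts at one, and a word's toggle is reset to zero as soon as its match is returned, so a repeated search with the same cues skips it.

```python
# Hypothetical sketch of multiple match resolution: each word carries a
# toggle (the chapter's "simulated qubit"). When a word's match is
# returned, its toggle is reset to zero, so repeating the search with
# the same cues blocks words already returned.

class MatchWord:
    def __init__(self, attributes):
        self.attributes = set(attributes)
        self.toggle = 1  # 1 = eligible to match, 0 = blocked

def next_match(memory, cues):
    # Return the first eligible matching word and reset its toggle.
    for word in memory:
        if word.toggle == 1 and set(cues) <= word.attributes:
            word.toggle = 0  # block this word on subsequent searches
            return word.attributes
    return None  # no further matches
```

Repeated calls with the same cues then step through the multiple matches one at a time until none remain.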
Neural circuits for reading long-term memory are also used to write, that is, to memorize images taken from conscious STM. Memorization is modeled as occurring automatically once certain conditions are met, such as that the image is not already memorized and that the image has occurred in consciousness a given number of times. A simplified filter circuit is suggested for images that appear a couple of times, although in practice several rehearsals would be more realistic for memorization.
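These write conditions can be captured in a small filter sketch. The Python below is a hypothetical stand-in, not the chapter's circuit: the threshold of two mirrors the simplified filter, and the class and method names are assumptions.

```python
from collections import Counter

# Hypothetical memorization filter: an image from conscious STM is
# written automatically once it (a) is not already memorized and
# (b) has appeared in consciousness a given number of times. A
# threshold of two reflects the simplified filter; several rehearsals
# would be more realistic in practice.
REHEARSAL_THRESHOLD = 2

class MemorizationFilter:
    def __init__(self, threshold=REHEARSAL_THRESHOLD):
        self.threshold = threshold
        self.counts = Counter()  # appearances of each image in consciousness

    def observe(self, image, memory):
        # image: a frozenset of attributes; memory: list of memorized images.
        if image in memory:
            return False  # already memorized, nothing to do
        self.counts[image] += 1
        if self.counts[image] >= self.threshold:
            memory.append(image)  # conditions met: memorize automatically
            return True
        return False
```

Under these assumptions an image passes the filter on its second appearance, and further appearances of a memorized image are ignored.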
A multiwrite system is necessary to ensure that each newly memorized image goes into exactly one available blank word of long-term memory. This is accomplished below with a stack of long-term memory elements all cleared to false except one, where the new memory will go. Once the memory is in place, the memory element is set to true and stays true indefinitely, just like the memory itself.
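One plausible software reading of the multiwrite system is sketched below (the stack class and first-blank-wins policy are assumptions for illustration): each word carries a memory element, a flag that is false while the word is blank and is set true permanently once the word holds a memory.

```python
# Hypothetical multiwrite sketch: each word of long-term memory carries
# a memory element (a flag) that is initially false. A new image is
# written into exactly one blank word; its element is then set true and
# stays true indefinitely, blocking any rewrite of that word.

class MultiwriteStack:
    def __init__(self, size):
        self.used = [False] * size   # memory elements, all cleared to false
        self.words = [None] * size   # the memory words themselves

    def write(self, image):
        # The first word whose memory element is still false accepts the
        # image, so the write lands in only one word.
        for i, used in enumerate(self.used):
            if not used:
                self.words[i] = image
                self.used[i] = True  # set true, stays true indefinitely
                return i
        return None  # no blank word available
```

Each write thus claims exactly one blank word, and a full stack simply rejects further writes.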
This chapter compares memorization to learning, with an eye to explaining the amazing abilities of savants who apparently memorize huge amounts of information. Learning filters are explained for state machine learning, in which sequences are recalled from subconscious long-term memory without having to pass through conscious STM. Savant memorization is proposed to be, in fact, learning accomplished with the aid of special learning filters. An example filter is given that permits subconscious learning with a very low count of rehearsals.
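The idea of a learning filter for state machine learning can be sketched as follows. This Python stand-in is a hypothetical illustration (the threshold, class, and method names are assumptions): a state transition is committed to subconscious long-term memory after a very low count of rehearsals, after which the whole sequence can be replayed without passing through conscious STM.

```python
from collections import Counter

# Hypothetical learning-filter sketch for state machine learning:
# a transition between states is learned once it has been rehearsed
# a very low number of times; learned sequences then replay directly
# from (modeled) subconscious long-term memory.
LEARNING_THRESHOLD = 2

class SequenceLearner:
    def __init__(self):
        self.rehearsals = Counter()  # counts of each (state, next) pair
        self.transitions = {}        # learned transitions: state -> next state

    def rehearse(self, state, nxt):
        self.rehearsals[(state, nxt)] += 1
        if self.rehearsals[(state, nxt)] >= LEARNING_THRESHOLD:
            self.transitions[state] = nxt  # committed to long-term memory

    def recall(self, start, steps):
        # Replay the learned sequence from a starting state, stopping
        # where no transition has been learned.
        sequence = [start]
        for _ in range(steps):
            if sequence[-1] not in self.transitions:
                break
            sequence.append(self.transitions[sequence[-1]])
        return sequence
```

With a threshold this low, two rehearsals suffice to fix a transition, which is one way to read the savant proposal: memorization as unusually efficient sequence learning.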