Abstract
Neural oscillations can enhance feature recognition (Azouz and Gray Proceedings of the National Academy of Sciences of the United States of America, 97, 8110–8115 2000), modulate interactions between neurons (Womelsdorf et al. Science, 316, 1609–1612 2007), and improve learning and memory (Markowska et al. The Journal of Neuroscience, 15, 2063–2073 1995). Numerical studies have shown that coherent spiking can give rise to windows in time during which information transfer can be enhanced in neuronal networks (Abeles Israel Journal of Medical Sciences, 18, 83–92 1982; Lisman and Idiart Science, 267, 1512–1515 1995; Salinas and Sejnowski Nature Reviews. Neuroscience, 2, 539–550 2001). Unanswered questions are: 1) What is the transfer mechanism? And 2) how well can a transfer be executed? Here, we present a pulse-based mechanism by which a graded current amplitude may be exactly propagated from one neuronal population to another. The mechanism relies on the downstream gating of mean synaptic current amplitude from one population of neurons to another via a pulse. Because transfer is pulse-based, information may be dynamically routed through a neural circuit with fixed connectivity. We demonstrate the transfer mechanism in a realistic network of spiking neurons and show that it is robust to noise in the form of pulse timing inaccuracies, random synaptic strengths and finite size effects. We also show that the mechanism is structurally robust in that it may be implemented using biologically realistic pulses. The transfer mechanism may be used as a building block for fast, complex information processing in neural circuits. We show that the mechanism naturally leads to a framework wherein neural information coding and processing can be considered as a product of linear maps under the active control of a pulse generator.
Distinct control and processing components combine to form the basis for the binding, propagation, and processing of dynamically routed information within neural pathways. Using our framework, we construct example neural circuits to 1) maintain a short-term memory, 2) compute time-windowed Fourier transforms, and 3) perform spatial rotations. We postulate that such circuits, with automatic and stereotyped control and processing of information, are the neural correlates of Crick and Koch’s zombie modes.
1 Introduction
Accumulating experimental evidence implicates coherent activity as an important element of cognition. Since their discovery (Gray et al. 1989), gamma band oscillations have been demonstrated to exist in hippocampus (Bragin et al. 1995; Csicsvari et al. 2003; Colgin et al. 2009), visual cortex (Gray et al. 1989; Livingstone 1996; Womelsdorf et al. 2007), auditory cortex (Brosch et al. 2002), somatosensory cortex (Bauer et al. 2006), parietal cortex (Pesaran et al. 2002; Buschman and Miller 2007; Medendorp et al. 2007), various nodes of the frontal cortex (Buschman and Miller 2007; Gregoriou et al. 2009; Sohal et al. 2009), amygdala and striatum (Popescu et al. 2009). Gamma oscillations sharpen orientation (Azouz and Gray 2000) and contrast (Henrie and Shapley 2005) tuning in V1, and speed and direction tuning in MT (Liu and Newsome 2006). Attention has been shown to enhance gamma oscillation synchronization in V4, while decreasing low-frequency synchronization (Fries et al. 2001; Fries et al. 2008) and to increase synchronization between V4 and FEF (Gregoriou et al. 2009), LIP and FEF (Buschman and Miller 2007), V1 and V4 (Bosman et al. 2012), and MT and LIP (Saalmann et al. 2007). Interactions between sender and receiver neurons are improved when consistent gamma-phase relationships exist between two communicating sites (Womelsdorf et al. 2007).
Theta-band oscillations have been shown to be associated with visual spatial memory (O’Keefe 1993; Buzsáki 2002), where neurons encoding the locations of visual stimuli and an animal’s own position have been identified (O’Keefe 1993; Skaggs et al. 1996). Additionally, loss of theta gives rise to spatial memory deficits (Winson 1978) and pharmacologically enhanced theta improves learning and memory (Markowska et al. 1995).
These experimental investigations of coherence in and between distinct brain regions have informed the modern understanding of information coding in neural systems (Quian Quiroga and Panzeri 2013; Kumar et al. 2010). Understanding information coding is crucial to understanding how neural circuits and systems bind sensory signals into internal mental representations of the environment, process internal representations to make decisions, and translate decisions into motor activity.
Classically, coding mechanisms have been shown to be related to neural firing rate (Adrian and Zotterman 1926), population activity (Hubel and Wiesel 1965, 1968; Kaissling and Priesner 1970), and spike timing (Bair and Koch 1996). Firing rate (Adrian and Zotterman 1926) and population codes (Knight 1972, 2000; Sirovich et al. 1999; Gerstner 1995; Brunel and Hakim 1999) are two different ways for a neural system to average spike number to represent graded stimulus information, with population codes capable of faster and more accurate processing since averaging is performed across many fast-responding neurons. Population and temporal codes can additionally make use of the sometimes millisecond accuracy (Bair and Koch 1996; Butts et al. 2007; Varga et al. 2012) of spike timing to represent signal dynamics.
Although classical mechanisms serve as their underpinnings, new mechanisms have been proposed for short-term memory (Lisman and Idiart 1995; Jensen and Lisman 2005; Goldman 2008), information transfer via spike coincidence (Abeles 1982; König et al. 1996; Fries 2005) and information gating (Salinas and Sejnowski 2001; Fries 2005; Rubin and Terman 2004; Jahnke et al. 2014a, 2014b) that rely on gamma- and theta-band oscillations. For example, the Lisman-Idiart interleaved-memory (IM) model (Lisman and Idiart 1995), and Fries’s communication-through-coherence (CTC) model (Fries 2005) both make use of the fact that synchronous input can provide windows in time during which spikes may be more easily transferred through a neural circuit. Thus, neurons firing coherently can transfer their activity quickly downstream. Additionally, synchronous firing has been used in Abeles’s synfire network (Abeles 1982; König et al. 1996; Diesmann et al. 1999; Kremkow et al. 2010; Kistler and Gerstner 2002; Bienenstock 1995) giving rise to volleys of propagating spikes.
The precise mechanism and the extent to which the brain can make use of coherent activity to transfer information have remained unclear. Previous theoretical and experimental studies have largely focused on feedforward synfire chains (Litvak et al. 2003; Diesmann et al. 1999; Reyes 2003; Kistler and Gerstner 2002; Jahnke et al. 2013; Feinerman and Moses 2006). These studies have shown that it is possible to transfer volleys of action potentials stably from layer to layer, but that the waveform tends to an attractor with fixed amplitude. Therefore, in these models, although a volley can propagate, graded information, in the form of a rate amplitude, cannot. Other numerical work has shown that it is possible to transfer firing rates through layered networks when background current noise is sufficient to effectively keep the network at threshold (van Rossum et al. 2002). The disadvantage of this method is that there is no mechanism to control the flow of firing rate information other than increasing or decreasing background noise. Recently, it has been shown that external gating, similar, in principle, to that used in the IM model and to the gating that we introduce below, can stabilize the propagation of fixed amplitude pulses and act as an external factor to control pulse propagation (Jahnke et al. 2014a, b).
In the Methods section, we show that information contained in the amplitude of a synaptic current may be exactly transferred from one neuronal population to another, as long as well-timed current pulses are injected into the populations. This mechanism is distinct from the synfire chains mentioned above that can only transfer action potential volleys of fixed amplitude, and in contrast to van Rossum et al. (2002), by using current pulses to gate information through a circuit, it provides a neuronal-population-based means of dynamically propagating graded information through a neural circuit.
We derive our pulse-based transfer mechanism using mean-field equations for a current-based neural circuit (see circuit diagram in Fig. 1a) and demonstrate it in an integrate-and-fire neuronal network. Graded current amplitudes are transferred between upstream and downstream populations: A gating pulse excites the upstream population into the firing regime thereby generating a synaptic current in the downstream population. For didactic purposes, we first present results that rely on a square gating pulse with an ongoing inhibition keeping the downstream population silent until the feedforward synaptic current is integrated. We then show how more biologically realistic pulses with shapes filtered on synaptic time-scales may be used for transfer. We argue that our mechanism represents crucial principles underlying what it means to transfer information. We then generalize the mechanism to the case of transfer from one vector of populations to a second vector of populations and show that this naturally leads to a framework for generating linear maps under the active control of a pulse generator.
In the Results section, we demonstrate pulse-gated transfer solutions in both mean-field and integrate-and-fire neuronal networks. We demonstrate the robustness of the mechanism to noise in pulse timing, synaptic strength and finite-size effects and show how biologically realistic pulses may be used for gating. We then go on to present three examples of circuits that make use of the framework for generating actively controlled linear maps.
In the Discussion section, we consider some of the implications of our mechanism and information coding framework, and future work.
2 Methods
What are the crucial principles underlying information transfer between populations of neurons? First, a carrier of information must be identified, such as synaptic current, firing rate, spike timing, etc. Once the carrier has been identified, we must determine the type of information, i.e. is the information analog or digital? Finally, we must identify what properties the information must exhibit for us to say that information has been transferred. In the mechanism that we present below, we use synaptic current as the information carrier. Information is graded and represented in a current amplitude and thus is best considered analog. The property that identifies information transfer is that the information exhibit a discrete, time-translational symmetry. That is, the waveform representing a graded current or firing rate amplitude in a downstream neuronal population must be the same as that in an upstream population, but shifted in time.
As noted in the Introduction, mechanisms exist for propagating constant activity that have demonstrated time-translational symmetries in both strong (Diesmann et al. 1999) and sparsely coupled (Jahnke et al. 2013) regimes. Here, we address a mechanism for propagation of graded activity.
An additional consideration for biologically realistic information transfer is that it be dynamically routable. That is, neural pathways may be switched on the millisecond time scale. This is achieved in our mechanism via pulse gating.
2.1 Circuit model
Our neuronal network model consists of a set of j=1,…,M populations, each with i=1,…,N current-based, integrate-and-fire (I&F) point neurons. Individual neurons have membrane potentials, \(v_{i,j}\), described by
$$\frac{d}{dt} v_{i,j} = -g_{Leak}\left(v_{i,j} - V_{Leak}\right) + I_{i,j}^{Total}(t),$$
and feedforward synaptic current
$$\tau \frac{d}{dt} I_{i,j+1} = -I_{i,j+1} + \frac{S}{pN} \sum_{l,k} p_{il}\, \delta\!\left(t - t_{l,j}^{k}\right),$$
where \(p_{il} \in \{0,1\}\) indicates a synapse from neuron l in population j to neuron i in population j+1,
with total currents
$$I_{i,j}^{Total}(t) = I_{i,j}(t) + I_{j}^{Exc}(t) - I_{j}^{Inh}(t),$$
and \(V_{Leak}\) is the leakage potential. The excitatory gating pulse on neurons in population j is
$$I_{j}^{Exc}(t) = \left(I_{0}^{Exc} + \epsilon\right)\left[\theta\!\left(t - (j-1)T\right) - \theta\!\left(t - jT\right)\right],$$
where 𝜃(t) is the Heaviside step function: 𝜃(t)=0, t<0 and 𝜃(t)=1, t>0. The ongoing inhibitory current is \(I_{j}^{Inh}(t) = I_{0}^{Inh}\).
Here, τ is a current relaxation timescale depending on the synaptic receptor type (typical time constants are \(\tau_{AMPA} \sim 3{-}11\) ms or \(\tau_{NMDA} \sim 60{-}150\) ms). Individual spike times, \(\left\{ t_{i,j}^{k} \right\}\), with k denoting spike number, are determined by the time when the voltage \(v_{i,j}\) reaches the threshold voltage, \(V_{Thres}\), at which time the voltage is reset to \(V_{Reset}\). We use units in which only time retains dimension (in seconds) (Shelley and McLaughlin 2002): the leakage conductance is \(g_{Leak} = 50/\sec\). We set \(V_{Reset} = V_{Leak} = 0\) and normalize the membrane potential by the difference between the threshold and reset potentials, \(V_{Thres} - V_{Reset} = 1\). For the simulations reported here, we use \(I_{0}^{Exc} = 180/\sec\) and \(I_{0}^{Inh} = 150/\sec\). Synaptic background activity is modeled by introducing noise in the excitatory pulse amplitude via 𝜖, where \(\epsilon \sim N(0, \sigma^{2})\), with \(\sigma = 1/\sec\). The probability that neuron i in population j synapses on neuron k in population j+1 is \(P_{ik} = p\). In our simulations, \(pN = 80\).
This network is effectively a synfire chain with prescribed pulse input (Abeles 1982; Diesmann et al. 1999; Vogels and Abbott 2005; Shinozaki et al. 2010; Kistler and Gerstner 2002).
2.2 Mean-field equations
Averaging (coarse-graining) spikes over time and over neurons in population j (see, e.g. Shelley and McLaughlin 2002) produces a mean firing rate equation given by
$$m_{j}(t) = g_{Total}\left[\ln\!\left(\frac{I_{j}^{Total}/g_{Total} - V_{Reset}}{I_{j}^{Total}/g_{Total} - V_{Thres}}\right)\right]^{-1},$$
where \(g_{Total} = g_{Leak}\), and
$$I_{j}^{Total}(t) = I_{j}(t) + I_{j}^{Exc}(t) - I_{j}^{Inh}(t).$$
The feedforward synaptic current, \(I_{j+1}\), is described by
$$\tau \frac{d}{dt} I_{j+1} = -I_{j+1} + S\, m_{j}(t).$$
The downstream population receives excitatory input, \(m_{j}\), with synaptic coupling, S, from the upstream population. As in the I&F simulation, we set \(V_{Reset} = 0\), and non-dimensionalize the voltage using \(V_{Thres} - V_{Reset} = 1\), so that
$$m_{j}(t) = g_{Total}\left[\ln\!\left(\frac{I_{j}^{Total}}{I_{j}^{Total} - g_{Total}}\right)\right]^{-1}.$$
This relation, the so-called f-I curve, can be approximated by
$$m(I) \approx m'(I_{0})\, I - g_{0}$$
near \(I \approx I_{0}\), where \(m'(I_{0}) \approx 1\) (here the prime denotes differentiation), and \(g_{0} = m'(I_{0}) I_{0} - m(I_{0})\) is the effective threshold in the linearized f-I curve.
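To make the linearization concrete, here is a small numerical sketch in Python. The specific smooth f-I curve used below is hypothetical, chosen only for illustration; the point is that near an operating point \(I_0\) with unit slope, the rate is well approximated by \(m(I) \approx I - g_0\) with effective threshold \(g_0 = m'(I_0) I_0 - m(I_0)\).

```python
import numpy as np

# Hypothetical smooth f-I curve with a soft threshold near I = 1
# (illustrative only; not the mean-field rate formula above).
def m(I):
    return np.log(1.0 + np.exp(4.0 * (I - 1.0))) / 4.0

I0 = 2.0                                        # operating point, well above threshold
dI = 1e-6
slope = (m(I0 + dI) - m(I0 - dI)) / (2 * dI)    # m'(I0), close to 1 here
g0 = slope * I0 - m(I0)                          # effective threshold of the linearization

I = 2.3
print(m(I), slope * I - g0)                      # nonlinear rate vs. linearized rate
```

For currents near \(I_0\) the two printed values agree to a few parts in a thousand, which is the regime in which the exact-transfer derivation below operates.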
2.3 Exact transfer
We consider transfer between an upstream population and a downstream population, denoted by j=u and j+1=d.
For the downstream population, for t<0, \(I_{d} = 0\). This may be arranged as an initial condition or by picking a sufficiently large \(I_{0}^{Inh}\), with
$$m_{d}(t) = 0, \quad t < 0.$$
At t=0, the excitatory gating pulse is turned on for the upstream population for a period T, so that for 0<t<T, the synaptic current of the downstream population obeys
$$\tau \frac{d}{dt} I_{d} = -I_{d} + S\, m_{u}(t) = -I_{d} + S\left(I_{u} + I_{0}^{Exc} - I_{0}^{Inh} - g_{0}\right).$$
Therefore, we set the amplitude of the excitatory gating pulse to be \(I_{0}^{Exc}=I_{0}^{Inh} + g_{0}\) to cancel the threshold. Making the ansatz \(I_{u}(t) = A e^{-t/\tau}\), we integrate
$$\tau \frac{d}{dt} I_{d} = -I_{d} + S A e^{-t/\tau}$$
to obtain the expression
$$I_{d}(t) = S A \frac{t}{\tau} e^{-t/\tau}.$$
During this time, ongoing inhibition is acting on the downstream population to keep it from spiking, i.e., we have
$$I_{d}(t) - I_{0}^{Inh} - g_{0} < 0,$$
so that \(m_{d}(t) = 0\).
For T<t<2T, the downstream population is gated by an excitatory pulse, while the upstream population is silenced by ongoing inhibition. The downstream synaptic current obeys
$$\tau \frac{d}{dt} I_{d} = -I_{d} + S\, m_{u}(t),$$
with
$$m_{u}(t) = 0,$$
so that we have
$$I_{d}(t) = I_{d}(T)\, e^{-(t-T)/\tau}$$
and
$$I_{d}(T) = S A \frac{T}{\tau} e^{-T/\tau}.$$
For exact transfer, we need \(I_{d}(t) = I_{u}(t-T)\), therefore we write
$$S A \frac{T}{\tau} e^{-T/\tau}\, e^{-(t-T)/\tau} = A e^{-(t-T)/\tau}.$$
So we have exact transfer with
$$S_{exact} = \frac{\tau}{T}\, e^{T/\tau}.$$
To recap, we have the solution, with \(S_{exact}\),
$$I_{d}(t) = S_{exact}\, A \frac{t}{\tau} e^{-t/\tau} = A \frac{t}{T} e^{(T-t)/\tau}, \quad 0 < t < T,$$
and
$$I_{d}(t) = A e^{-(t-T)/\tau} = I_{u}(t-T), \quad t > T.$$
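The exact-transfer solution can be checked numerically. The following sketch (Python; the parameter values are illustrative, not those of the I&F simulations reported here) integrates the downstream mean-field equation with \(S_{exact} = (\tau/T)\, e^{T/\tau}\) and compares the downstream current against the time-shifted upstream current:

```python
import numpy as np

# Mean-field pulse-gated transfer, forward-Euler sketch.
# Upstream current: I_u(t) = A exp(-t/tau) while the upstream population is
# gated (0 < t < T); downstream obeys tau dI_d/dt = -I_d + S I_u, then decays
# freely.  With S = (tau/T) exp(T/tau), I_d(t) = I_u(t - T) for t > T.

def transfer(A, tau=5e-3, T=5e-3, dt=1e-6, n_steps=20000):
    S = (tau / T) * np.exp(T / tau)          # exact-transfer synaptic strength
    I_d = 0.0
    trace = []
    for k in range(n_steps):
        t = k * dt
        I_u = A * np.exp(-t / tau) if t < T else 0.0  # upstream silenced for t > T
        I_d += dt / tau * (-I_d + S * I_u)            # Euler step of the ODE
        trace.append(I_d)
    return np.array(trace)

A = 0.7
tau = T = 5e-3
dt = 1e-6
trace = transfer(A, tau=tau, T=T, dt=dt)

# Downstream current at t = 2T should match upstream current at t = T:
k = int(2 * T / dt) - 1
print(trace[k], A * np.exp(-T / tau))        # nearly equal
```

The graded amplitude A is arbitrary; rerunning with a different A reproduces the same shifted waveform scaled accordingly, which is the sense in which the transfer is graded.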
2.4 A synfire-based gating mechanism
In our exact solution, gating pulses have biologically unrealistic instantaneous onset and offset. Therefore, it becomes important to understand how robust graded propagation can be for gating pulses of realistic shape, and whether there is a natural mechanism for their generation. To test the structural robustness of graded propagation with a known pulse-generating mechanism, we implemented an I&F neuronal network model with two sets of populations: one set had synaptic strengths such that it formed stereotypical pulses with fixed mean spiking profile and mean current waveform (Diesmann et al. 1999; Kistler and Gerstner 2002); the second set used these pulses, instead of square gating pulses, for current propagation. We call this neural circuit a Synfire-Gated Synfire Chain (SGSC).
Individual I&F neurons in the SGSC have membrane potentials described by
$$\frac{d}{dt} v_{i,j}^{\sigma} = -g_{Leak}\left(v_{i,j}^{\sigma} - V_{Leak}\right) + \sum_{\sigma'} I_{i,j}^{\sigma\sigma'}(t) + \delta_{\sigma 2}\, I_{i,j}^{2}(t),$$
where \(i = 1,\ldots,N_{\sigma}\), \(j = 1,\ldots,M\) and \(\sigma, \sigma' = 1, 2\), with 1 for the graded chain and 2 for the gating chain; individual spike times, \(\left\{ t_{i,j}^{\sigma,k} \right\}\), with k denoting spike number, are determined by the time when \(v_{i,j}^{\sigma}\) reaches \(V_{Thres}\). The gating chain receives a noise current, \(I_{i,j}^{2}\), generated from Poisson spike times, \(\{ s_{i,j}^{k} \}\), with strength \(f_{2} = 0.05\) and rate \(\nu_{2} = 400\) Hz, i.e. a noise current averaging 20/sec that is subthreshold (given by \(g_{Leak} = 50/\sec\)). The current \(I_{i,j}^{\sigma\sigma'}\) is the synaptic current of the σ population produced by spikes of the σ′ population. In the simulations reported in Results, τ=5 msec and the synaptic coupling strengths are \(\{S^{11}, S^{12}, S^{21}, S^{22}\} = \{2.28, 0.37, 0, 2.72\}\). The probabilities that a neuron in population σ′ synapses on a neuron in population σ are given by \(\{p^{11}, p^{12}, p^{21}, p^{22}\} = \{0.02, 0.01, 0, 0.8\}\). The two chains have population sizes \(\{N_{1}, N_{2}\} = \{1000, 100\}\). There was a synaptic delay of 4 ms between each successive layer in the gating chain.
2.5 Information processing using graded transfer mechanisms
Because current amplitude transfer in our mechanism operates in the linear regime, downstream computations may be considered as linear maps (matrix operations) on a vector of neuronal population amplitudes. For instance, consider an upstream vector of neuronal populations with currents, \(\mathbf{I}^{u}\), connected via a connectivity matrix K to a downstream vector of neuronal populations, \(\mathbf{I}^{d}\).
With feedforward connectivity, given by the matrix K, the current amplitude, \(\mathbf{I}^{d}\), from the mean-field model obeys
$$\tau \frac{d}{dt} \mathbf{I}^{d} = -\mathbf{I}^{d} + S_{exact}\, K \operatorname{diag}\!\left(\mathbf{p}^{u}(t)\right) \mathbf{I}^{u},$$
where \(\mathbf{p}^{u}(t)\) denotes a vector gating pulse on layer u. This results in the solution \(\mathbf{I}^{d}(t) = P K \mathbf{I}^{u}(t-T)\), where P is a diagonal matrix with the gating pulse vector, p, of 0s and 1s on the diagonal indicating which populations were pulsed during the transfer.
For instance, if the matrix of synaptic weights, K, were square and orthogonal, the transformation would represent an orthogonal change of basis in the vector space \(\mathbb {R}^{n}\), where n is the number of populations in the vector. Convergent and divergent connectivities would be represented by non-square matrices.
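The end result of one gating period is therefore just the matrix product \(P K \mathbf{I}^{u}\). A minimal sketch (Python; the connectivity and amplitudes below are hypothetical examples, not taken from the simulations):

```python
import numpy as np

# Pulse-gated linear map: the gating-pulse vector p selects which downstream
# populations participate in the transfer; K is the fixed feedforward
# connectivity.  One gating period maps I_u to diag(p) K I_u.
K = np.array([[0.0, 1.0],
              [1.0, 0.0]])          # example connectivity: swap two amplitudes
I_u = np.array([0.3, 0.8])          # upstream graded amplitudes
p = np.array([1, 0])                # pulse only the first downstream population

I_d = np.diag(p) @ K @ I_u          # amplitudes one gating period later
print(I_d)                          # -> [0.8, 0.0]
```

Changing p on the next gating period routes the same fixed connectivity K differently, which is the sense in which routing is dynamic while the wiring is fixed.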
This type of information processing is distinct from concatenated linear maps in the sense that information may be dynamically routed via suitable gating. Thus, we can envision information manipulation by sets of non-abelian operators, i.e., with non-commuting matrix generators, that may be flexibly coupled. We can also envision re-entrant circuits or introducing pulse-gated nonlinearities into our circuit to implement regulated feedback.
2.6 Information coding framework
Our discussion has identified three components of a unified framework for information coding:
-
1.
information content - graded current, I
-
2.
information processing - synaptic weights, K
-
3.
information control - pulses, p
Note that the pulsing control, p, serves as a gating mechanism for routing neural information into (or out of) a processing circuit. We, therefore, refer to amplitude packets, I, that are guided through a neural circuit by a set of stereotyped pulses as “bound” information.
Consider one population coupled to multiple downstream populations. Separate downstream processing circuits may be multiplexed by pulsing one of the set of downstream circuits. Similarly, copying circuit output to two (or more) distinct downstream populations may be performed by pulsing two populations that are identically coupled to one upstream population.
In order to make decisions, non-linear logic circuits would be required. Many of these are available in the literature (Cassidy et al. 2013; Vogels and Abbott 2005). Simple logic gates should be straightforward to construct within our framework by allowing interaction between information control and content circuits. For instance, to construct an AND gate, use gating pulses to feed two sub-threshold outputs into a third population: if the inputs are (0,0), (0,1) or (1,0), none of the combined pulses exceeds threshold and no output is produced; the input (1,1), however, would give rise to an output pulse. Other logic gates, including the NOT gate, may be constructed, giving a Turing complete set of logic gates. Thus, these logic elements could be used for plastic control of functional connectivity, i.e. the potential for rapidly turning circuit elements on or off, enabling information to be dynamically processed.
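The threshold AND gate described above can be sketched numerically (Python; the weight and threshold values are hypothetical, chosen only so that one input is subthreshold and two together are suprathreshold):

```python
# Threshold AND gate: two gated subthreshold inputs sum onto a third
# population; only the combined input (1,1) crosses the effective threshold.
g0 = 1.0          # effective threshold (hypothetical units)
w = 0.6           # single-input weight: w < g0, but 2*w > g0

def and_gate(a, b):
    total = w * a + w * b           # summed gated input to the third population
    return 1 if total > g0 else 0

print([and_gate(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # -> [0, 0, 0, 1]
```

A NOT gate follows the same pattern with a tonically active input that is cancelled by inhibition from the gate's input.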
3 Results
3.1 Exact transfer
In Fig. 1, we demonstrate our current amplitude transfer mechanism in both mean-field and spiking models. The neural circuit for one upstream and one downstream layer is shown in Fig. 1a. Figure 1b and e show the exact, mean-field transfer solution for T=τ=4 ms and T=2τ=8 ms. Figure 1c and f show corresponding transfer between populations of N=100 current-based, I&F neurons. Figure 1d and g show mean currents computed from simulations of I&F networks with N=100. Mean amplitude transfer for these populations is very nearly identical to the exact solution and, as may be seen, graded amplitudes are transferred across many synapses and are still very accurately preserved. Figure 1h shows the exact mean-field transfer solution between populations gated for T/τ=0.8 and T/τ=1.2 with τ=5 ms. Figure 1i shows the corresponding transfer between populations of N=100 I&F neurons. Figure 1j shows how integration period, T, may be changed within a sequence of successive transfers within an I&F network with a value of \(S_{exact}\) that supports two different timescales.
This mechanism has a number of features that are represented in the analytic solution, Eqs. (11)–(12a), and in Fig. 1: 1) Exact transfer is possible for any T and τ. This means that transfer may be enacted on a wide range of time scales. This range is set by the value of \(S_{exact}\): roughly, 0.1<T/τ<4 gives S small enough that firing rates are not excessive in the corresponding I&F simulations. 2) τ sets the “reoccupation time” of the upstream population. After one population has transferred its amplitude to another, the current amplitude must fall sufficiently close to zero for a subsequent exact transfer. Therefore, the timescale of the mediating synapses, AMPA (NMDA), determines how rapidly repeated exact transfers may occur. 3) Pulse-gating controls information flow, not information content. As an example, one upstream population may be synaptically connected to two (or more) downstream populations. A graded input current amplitude may then be selectively transferred downstream depending on whether one, the other, or both downstream populations are pulsed appropriately. This allows the functional connectivity of neural circuits to be plastic and rapidly controllable by pulse generators. 4) \(S_{exact}\) has an absolute minimum at T/τ=1, and, except at the minimum, there are always two values of T/τ that give the same value of S. This means, for instance, that an amplitude transferred via a short pulse may subsequently be transferred by a long pulse and vice versa (see Fig. 1h, i, j). Thus, not only may downstream information be multiplexed using pulse-based control, but the time scale of the mechanism may also be varied from transfer to transfer.
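Feature 4 can be checked directly from the closed form \(S_{exact} = e^{x}/x\) with \(x = T/\tau\): the curve has a single minimum at x = 1 (where \(S_{exact} = e\)), and any S above the minimum is attained at exactly two values of x. A grid-search sketch in Python:

```python
import numpy as np

# S_exact(T/tau) = exp(x)/x with x = T/tau: minimum at x = 1, and two gating
# periods share each synaptic strength above the minimum.
x = np.linspace(0.1, 4.0, 40000)
S = np.exp(x) / x

i_min = np.argmin(S)
print(x[i_min], S[i_min])                 # minimum near x = 1, S = e

# Two timescales sharing one S (cf. the short/long-pulse transfers in Fig. 1h-j):
target = 4.0
left = x[np.argmin(np.abs(S[x < 1] - target))]
right = x[x >= 1][np.argmin(np.abs(S[x >= 1] - target))]
print(left, right)                        # one short and one long gating period
```

The target value 4.0 is arbitrary; any S > e yields such a short/long pair.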
The means by which the mechanism can fail are also readily apparent: 1) The gating pulses might not be accurately timed. 2) Synaptic strengths might not be correct for exact transfer. 3) The amplitude of the excitatory pulse \(I_{0}^{Exc}\) might not precisely cancel the effective threshold \(I_{0}^{Inh} + g_{0}\). 4) The mean-field approximation might break down due to too few neurons in the neuronal populations.
3.2 Robustness to variability in pulse timing, synaptic strength and finite size effects
In Fig. 2, we investigate mean current variability for the transfer mechanism in the spiking model due to the modes of failure discussed above for T/τ=1 with τ=4 ms. Figure 2a shows the distribution of mean current amplitudes averaged over populations of N=1000 neurons, calculated from 1000 realizations. Figure 2b shows the distribution with just N=100. Clearly, more neurons per population gives less variability in the distribution. The signal-to-noise ratio (SNR) increases as the square root of the number of neurons per population, as would be expected. Thus, for circuits needing high accuracy, neuronal recruitment would increase the SNR. Figure 2c shows the distribution for N=100 with 10 % jitter in pulse start and end times. Figure 2d shows the distribution for N=100 with 2 % jitter in synaptic coupling, S. Note that near T/τ=1, \(S_{exact}\) varies slowly; thus the effect of both timing and synaptic coupling jitter on the stability of the transfer is minimal. Pulse timing, synaptic strengths, synaptic recruitment, and pulse amplitudes are all regulated by neural systems, so mechanisms are already known that could allow networks to be optimized for graded current amplitude transfer.
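The \(\sqrt{N}\) scaling of the SNR follows from averaging roughly independent per-neuron contributions. A sketch (Python; the Gaussian per-neuron currents are an idealization of the finite-size fluctuations, not the I&F model itself):

```python
import numpy as np

rng = np.random.default_rng(0)

# Standard deviation of a population-averaged current for populations of
# size N: it falls as 1/sqrt(N), so the SNR grows as sqrt(N), consistent
# with the N = 100 vs N = 1000 comparison in Fig. 2.
def population_sd(N, trials=4000):
    # trials independent populations of N idealized per-neuron currents
    samples = rng.normal(1.0, 0.5, size=(trials, N)).mean(axis=1)
    return samples.std()

sd_100 = population_sd(100)
sd_1000 = population_sd(1000)
print(sd_1000 / sd_100)          # roughly 1/sqrt(10), about 0.32
```

Increasing N by a factor of 10 thus buys roughly a factor of 3 in SNR, which is why neuronal recruitment is an effective accuracy mechanism.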
3.3 A synfire-gated synfire chain
We examine our pulse-gating mechanism in a biologically realistic circuit. Instead of using square gating pulses with unrealistic on- and offset times, we show that we can use the synaptic current generated by a well-studied synfire chain model (Diesmann et al. 1999) to gate the synaptic current transfer. In the graded transfer chain, the downstream (d) population receives excitatory synaptic currents from both the upstream (u) population and from the corresponding synfire (s) population (see Fig. 3a). Note that the synaptic currents generated by the synfire chain play the role of the gating pulse, allowing the upstream population to transfer its current to the downstream population in a graded fashion. In Fig. 3b, we show that our circuit can indeed transfer a stereotypical synaptic current pulse in a graded fashion (3 current amplitudes shown) in an I&F simulation with M=12 layers. By itself, the gating synfire population is dynamically an attractor, with firing rates of fixed waveform in each layer (Diesmann et al. 1999), producing a stereotypical gating current that is repeated across all layers (Fig. 3d). Figure 3c shows spike times in the populations transferring graded currents and Fig. 3e shows spike times in the gating populations.
These results (exact transfer in an analytically tractable mean-field model, the corresponding I&F neuronal network simulations, and the more biologically realistic SGSC model) demonstrate the structural robustness of our graded transfer mechanism. In these cases, a key essential theoretical mechanism was the time window of integration provided by a gating synaptic current (either put in by hand or generated intrinsically by a subpopulation of the neuronal circuit in the SGSC case). We note that, in neural circuits in vivo, time windows provided by gating pulses can be set and controlled by many mechanisms, for instance, time-scales of excitatory and inhibitory postsynaptic currents, absolute and relative refractoriness of individual neurons, time-scales in a high-conductance network state (Shelley et al. 2002; Destexhe et al. 2003) and coherence of the network dynamics. Indeed, different parts of the brain may use different combinations of neuronal and network mechanisms to implement graded current transfer.
3.4 A high-fidelity memory circuit
As a first complete example of how graded information may be processed in circuits using pulse-gating, we demonstrate a memory circuit using the mean-field model. Our circuit generalizes the IM model by allowing for graded memory and arbitrary multiplexing of memory to other neural circuits. Because it is a population model, it is more robust to perturbations than the IM model, which transfers spikes between individual neurons. It is different from the IM model in that our circuit retains only one graded amplitude, not many (although this could be arranged) (Lisman and Idiart 1995). However, our model retains the multiple timescales that generate theta and gamma oscillations from pulse gating inherent to the IM model (Lisman and Idiart 1995). Additionally, other graded memory models based on input integration (Seung et al. 2000; Goldman 2008) make use of relatively large time constants that are larger even than NMDA timescales, whereas ours makes use of an arbitrary synaptic timescale, τ, which may be modified to make use of any natural timescale in the underlying neuronal populations, including AMPA or NMDA. Our model is based on exact, analytical expressions, and because of this, the memory is infinitely long-lived at the mean-field level (until finite-size effects and other sources of variability are taken into account).
The circuit has four components, a population for binding a graded amplitude into the circuit (‘read in’), a cyclical memory, a ‘read out’ population meant to emulate the transfer of the graded amplitude to another circuit, and an input population. The memory is a set of n populations coupled one to the other in a circular chain with one of the populations (population 1) receiving gated input from the read in population. Memory populations receive coherent, phase shifted (by phase T) pulses that transfer the amplitude around the chain. In this circuit, n must be large enough that when population n transfers its amplitude back to population 1, population 1’s amplitude has relaxed back to (approximately) zero. The read out is a single population identically coupled to every other population in the circular chain. This population is repeatedly pulsed allowing the graded amplitude in the circular chain to be repeatedly read out.
In Fig. 4, we show an example of the memory circuit described here with n=6. The gating pulses sequentially propagate the graded current amplitude around the circuit. The read out population is coupled to every other population in the memory. Thus, in this example, the oscillation frequency of the read out population is three times that of the memory populations, i.e. theta-band frequencies in the memory populations would give rise to gamma-band frequencies in the read out.
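At the level of amplitudes, the memory circuit reduces to a cyclic shift plus a periodically pulsed read-out. A sketch (Python; the stored amplitude and step bookkeeping are illustrative, with one loop iteration standing in for one gating period):

```python
import numpy as np

# Cyclic memory: n populations in a ring; each gating period the graded
# amplitude moves one step around the ring.  A read-out population coupled
# to every other ring population is pulsed on alternate periods.
n = 6
ring = np.zeros(n)
ring[0] = 0.42                      # graded amplitude bound into the circuit

readout = []
for step in range(12):              # two full trips around the ring
    ring = np.roll(ring, 1)         # pulse-gated transfer to the next population
    if step % 2 == 0:               # read-out pulsed on alternate transfers
        readout.append(ring.max())  # exact transfer preserves the amplitude

print(readout)                      # the stored value, reported repeatedly
```

With exact transfer the amplitude is unchanged on every pass, so at the mean-field level the memory persists indefinitely; finite-size noise in the spiking network is what ultimately degrades it.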
This memory circuit, and other circuits that we present below, has the property that the binding of information is instantiated by the pulse sequence and is independent of the information carried in graded amplitudes and also independent of synaptic processing. Because of the independence of the control apparatus from information content and processing, this neural circuit is an automatic processing pathway whose functional connectivity (both internal and input/output) may be rapidly switched on or off and coupled to or decoupled from other circuits. We propose that such dynamically routable circuits, including both processing and control components, are the neural correlates of automatic cognitive processes that have been termed zombie modes (Crick and Koch 2003).
3.5 A moving-window Fourier transform
The memory circuit above used one-to-one coupling. It was simple in that information was copied, but not processed. Our second example demonstrates how more complex information processing may be accomplished within a zombie mode. With a simple circuit that performs a Hadamard transform (a Fourier transform using square-wave-shaped Walsh functions as a basis), we show how streaming information may be bound into a memory, then processed via synaptic couplings between populations in the circuit.
A set of read in populations are synaptically coupled to the input. A set of memory chains are coupled to the read in. The final population in each memory chain is coupled via a connectivity matrix that implements a Hadamard transform. Gating pulses cause successive read-in and storage of the input in memory; once the memory simultaneously contains all inputs from a given time window, the Hadamard transform is performed. Because the output of the Hadamard transform may be negative, two populations of Hadamard outputs are implemented, one containing positive coefficients, and another containing absolute values of negative coefficients.
In Fig. 5, we show a zombie mode where four samples are bound into the circuit from an input, which changes continuously in time. Memory populations hold the first sample over four transfers, the second sample over three transfers, etc. Once all samples have been bound within the circuit, the Hadamard transform is performed with a pulse on the entire set of Hadamard read out populations. While this process is occurring, a second sweep of the algorithm begins and a second Hadamard transform is computed.
The connectivity matrix for the positive coefficients of the Hadamard transform was given by the elementwise positive part of the 4 × 4 Hadamard matrix,
$$H^{+} = \begin{pmatrix} 1 & 1 & 1 & 1 \\ 1 & 0 & 1 & 0 \\ 1 & 1 & 0 & 0 \\ 1 & 0 & 0 & 1 \end{pmatrix}, \quad \text{where} \quad H = \begin{pmatrix} 1 & 1 & 1 & 1 \\ 1 & -1 & 1 & -1 \\ 1 & 1 & -1 & -1 \\ 1 & -1 & -1 & 1 \end{pmatrix},$$
and the absolute values of the negative coefficients used the positive part of the transform $-H$, so that the signed result is recovered as $Hx = H^{+}x - (-H)^{+}x$.
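The positive/negative splitting can be checked with a short sketch, assuming H is the standard 4 × 4 Hadamard matrix and that each connectivity matrix retains only the nonnegative entries of ±H (our reading of the construction; the sample values are illustrative):

```python
import numpy as np

# 4x4 Hadamard matrix (square-wave Walsh basis, natural order)
H = np.array([[ 1,  1,  1,  1],
              [ 1, -1,  1, -1],
              [ 1,  1, -1, -1],
              [ 1, -1, -1,  1]])

H_pos = np.clip(H, 0, None)    # connectivity for the positive coefficients
H_neg = np.clip(-H, 0, None)   # connectivity for |negative coefficients|

x = np.array([0.2, 0.5, 0.1, 0.4])   # four bound samples (nonnegative amplitudes)

y_pos = H_pos @ x    # held by the "positive" Hadamard read out populations
y_neg = H_neg @ x    # held by the "negative" read out populations

# Both population amplitudes are nonnegative, and the signed transform is
# recovered as their difference:
print(y_pos - y_neg)
```

Because current amplitudes cannot be negative, splitting the transform into two nonnegative linear maps is what allows a signed computation to be carried by population activity.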
3.6 A re-entrant spatial rotation circuit
Our final example makes use of flexible internal routing to perform an arbitrary set of rotations of a vector on the sphere. Three fixed-angle rotations about the x, y and z axes are arranged such that the output from each rotation may be copied to the input of any of the rotations. Because the destination is determined by the pattern of gating pulses, this circuit is more general than a zombie mode with a fixed gating pattern: it is not automatic, and manipulation of the rotations would be expected to come from a separate routing control circuit (here, implemented by hand).
In Fig. 6, initial spatial coordinates of (1,1,1) were input to the circuit. The pulse sequence rotated the input first about the x-axis, then sequentially about y, z, x, z, y, y, z, x axes. Views from two angles illustrate the rotations that were performed by the circuit.
This circuit demonstrates the flexibility of the information coding network that we have introduced. It shows a complex circuit capable of rapid computation with dynamic routing, but with a fixed connectivity matrix. Additionally, it is an example of how a set of non-commuting generators may be used to form elements of a non-abelian group within our framework.
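The routed composition of rotations can be sketched as repeated application of linear maps, with the pulse sequence choosing which map acts next. The fixed rotation angle (90°) below is an illustrative assumption; the axis sequence follows the Fig. 6 example:

```python
import numpy as np

def rot(axis, theta=np.pi / 2):
    """Fixed-angle rotation matrix about the x, y or z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return {
        'x': np.array([[1, 0, 0], [0, c, -s], [0, s, c]]),
        'y': np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]]),
        'z': np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]]),
    }[axis]

v = np.array([1.0, 1.0, 1.0])   # initial spatial coordinates, as in Fig. 6

# The gating-pulse sequence routes the output of each rotation to the next:
for axis in ['x', 'y', 'z', 'x', 'z', 'y', 'y', 'z', 'x']:
    v = rot(axis) @ v

# Rotations preserve length, so |v| = sqrt(3) throughout the sequence.
assert np.isclose(np.linalg.norm(v), np.sqrt(3))

# The generators do not commute, so routing order matters (non-abelian group):
assert not np.allclose(rot('x') @ rot('y'), rot('y') @ rot('x'))
```

The final assertion makes the group-theoretic point concrete: because the three generators do not commute, distinct pulse sequences generally produce distinct group elements, even though the synaptic connectivity is fixed.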
4 Discussion
The existence of graded transfer mechanisms, such as the one that we have found, points toward a natural modular organization wherein each neural circuit would be expected to have 1) sparsely coupled populations of neurons that encode information content, 2) pattern generators that provide accurately timed pulses to control information flow, and 3) regulatory mechanisms for maintaining optimal transfer.
A huge literature now exists implicating oscillations as an important mechanism for information coding. Our mechanism provides a fundamental building block with which graded information content may be encoded and transferred in current amplitudes, dynamically routed with coordinated pulses, and transformed and processed via synaptic weights. From this perspective, coherent oscillations may be an indication that a neural circuit is performing complex computations pulse by pulse.
Our mechanism for graded current transfer has allowed us to construct a conceptual framework for the active manipulation of information in neural circuits. The most important aspect of this type of information coding is that it separates control of the flow of information from information processing and the information itself. This type of segregation has been made use of previously (Lisman and Idiart 1995; Jensen and Lisman 2005; Jahnke et al. 2014a, 2014b) in mechanisms for gating the propagation of fixed amplitude waveforms. Here, by generalizing the mechanism to the propagation of graded information, active linear maps take prominence as a key processing structure in information coding.
A neural code must serve four functions (Perkel and Bullock 1968; Kumar et al. 2010): stimulus representation, interpretation, transformation and transmission. Our framework serves three of these functions and, we believe, is capable of being extended to the fourth. From last to first: the exact transfer mechanisms that we have identified serve the transmission function; synaptic couplings provide the capability of transforming information; and the pulse-dependent, selective read out of information in part serves the interpretation function. In the examples that we showed above, changes in pulse sequences were introduced by hand, but we argued in the Methods section that interaction of pulse chains should be able to achieve fully general decision making. Finally, read in populations, as we demonstrated in our examples, may be used to convert stimulus information into a bound information representation.
The current transfer mechanism is sufficiently flexible that the pulses used for gating may be of different durations, provided the pulse length, T, is suitably related to the time constant, τ, of the neuronal population involved.
The separation of control populations from those representing information content distinguishes our framework from mechanisms such as CTC, where communication between neuronal populations depends on the coincidence of integration windows in phase-coherent oscillations. In the CTC mechanism, information-containing spikes must coincide to be propagated. In our framework, information-containing spikes must coincide with gating pulses that enable communication. In this sense, it is ‘communication through coherence with a control mechanism’.
The separation of control and processing has further implications, one of which is that, as noted above, while a given zombie mode is processing incoming information, one does not expect the pulse sequence to change depending on the information content. This has been observed experimentally, and presented as an argument against CTC in the visual cortex (Thiele and Stoner 2003), but it is consistent with our framework.
The basic unit of computation in our framework is a pulse gated transfer. Given this, we suggest that each individual pulse within an oscillatory set of pulses represents the transfer and processing of a discrete information packet. For example, in a sensory circuit that needs to quickly and repeatedly process a streaming external stimulus, short pulses could be repeated in a stereotyped, oscillatory manner using high-frequency gamma oscillations to rapidly move bound sensory information through the processing pathway. Circuits that are used occasionally or asynchronously might not involve oscillations at all, just a precise sequence of pulses that gate a specific information set through a circuit. A possible example of such an asynchronous circuit is bat echo-location, an active sensing mechanism, where coherent oscillations have not been seen (Yartsev et al. 2011).
An important point to note is that, given a zombie mode that implements an algorithm for processing streaming input, one can straightforwardly predict the rhythms that the algorithm should produce (for instance, in our examples, by calculating power spectra of the current amplitudes). This feature of zombie modes can provide falsifiable hypotheses for putative computations that the brain uses to process information.
Since, with our transfer mechanism, information routing is enacted via precisely timed pulses, neural pattern generators would be expected to be information control centers. Cortical pattern generators, such as those proposed by Yuste et al. (2005) or hubs proposed by Jahnke et al. (2014a), because of their proximity to cortical circuits, could logically be the substrate for zombie mode control pulses. They would be expected to generate sequential, stereotyped pulses to dynamically route information flow through a neural circuit, as has been found in rat somatosensory cortex (Luczak et al. 2007). Global routing of information via attentional processes, on the other hand, would be expected to be performed from brain regions with broad access to the cortex, such as regions in the basal ganglia or thalamus.
Recent evidence shows that parvalbumin-positive basket cells (PVBCs) can gate the conversion of current to spikes in the amygdala (Wolff et al. 2014). Also, PVBCs and oriens lacunosum moleculare (OLM) cells have been implicated in precision spiking related to gamma- and theta-oscillations (Varga et al. 2012) and shown to be involved in memory-related structural plasticity (Klausberger et al. 2003). Therefore, zombie mode pattern generators would likely be based on a substrate of these neuron types.
References
Abeles, M. (1982). Role of the cortical neuron: integrator or coincidence detector? Israel Journal of Medical Sciences, 18, 83–92.
Adrian, E.D., & Zotterman, Y. (1926). The impulses produced by sensory nerve-endings: Part II. The response of a single-end organ. The Journal of Physiology, 61, 151–171.
Azouz, R., & Gray, C.M. (2000). Dynamic spike threshold reveals a mechanism for synaptic coincidence detection in cortical neurons in vivo. Proceedings of the National Academy of Sciences of the United States of America, 97, 8110–8115.
Bair, W., & Koch, C. (1996). Temporal precision of spike trains in extrastriate cortex of the behaving macaque monkey. Neural Computation, 8, 1185–1202.
Bauer, M., Oostenveld, R., Peeters, M., & Fries, P. (2006). Tactile spatial attention enhances gamma-band activity in somatosensory cortex and reduces low-frequency activity in parieto-occipital areas. The Journal of Neuroscience, 26, 490–501.
Bienenstock, E. (1995). A model of neocortex. Network: Computation in Neural Systems, 6, 179–224.
Bosman, C.A., Schoffelen, J.M., Brunet, N., Oostenveld, R., Bastos, A.M., Womelsdorf, T., Rubehn, B., Stieglitz, T., de Weerd, P., & Fries, P. (2012). Stimulus selection through selective synchronization between monkey visual areas. Neuron, 75, 875–888.
Bragin, A., Jandó, G., Nádasdy, Z., Hetke, J., Wise, K., & Buzsáki, G. (1995). Gamma (40−100 Hz) oscillation in the hippocampus of the behaving rat. The Journal of Neuroscience, 15, 47–60.
Brosch, M., Budinger, E., & Scheich, H. (2002). Stimulus-related gamma oscillations in primate auditory cortex. Journal of Neurophysiology, 87, 2715–2725.
Brunel, N., & Hakim, V. (1999). Fast global oscillations in networks of integrate-and-fire neurons with low firing rates. Neural Computation, 11, 1621–1671.
Buschman, T.J., & Miller, E.K. (2007). Top-down versus bottom-up control of attention in the prefrontal and posterior parietal cortices. Science, 315, 1860–1862.
Butts, D.A., Weng, C., Jin, J., Yeh, C.I., Lesica, N.A., Alonso, J.M., & Stanley, G.B. (2007). Temporal precision in the neural code and the timescales of natural vision. Nature, 449, 92–95.
Buzsáki, G. (2002). Theta oscillations in the hippocampus. Neuron, 33, 325–340.
Cassidy, A.S., Merolla, P., Arthur, J.V., Esser, S.K., Jackson, B., Alvarez-Icaza, R., Datta, P., Sawada, J., Wong, T.M., Feldman, V., Amir, A., Rubin, D.B.-D., Akopyan, F., McQuinn, E., Risk, W.P., & Modha, D.S. (2013). Cognitive computing building block: A versatile and efficient digital neuron model for neurosynaptic cores. In The 2013 International Joint Conference on Neural Networks (IJCNN). doi:10.1109/IJCNN.2013.6707077 (pp. 1–10).
Colgin, L., Denninger, T., Fyhn, M., Hafting, T., Bonnevie, T., Jensen, O., Moser, M., & Moser, E. (2009). Frequency of gamma oscillations routes flow of information in the hippocampus. Nature, 462, 75–78.
Crick, F., & Koch, C. (2003). A framework for consciousness. Nature Neuroscience, 6, 119–126.
Csicsvari, J., Jamieson, B., Wise, K., & Buzsáki, G. (2003). Mechanisms of gamma oscillations in the hippocampus of the behaving rat. Neuron, 37, 311–322.
Destexhe, A., Rudolph, M., & Pare, D. (2003). The high-conductance state of neocortical neurons in vivo. Nature Reviews. Neuroscience, 4(9), 739–751.
Diesmann, M., Gewaltig, M.O., & Aertsen, A. (1999). Stable propagation of synchronous spiking in cortical neural networks. Nature, 402, 529–533.
Feinerman, O., & Moses, E. (2006). Transport of information along unidimensional layered networks of dissociated hippocampal neurons and implications for rate coding. The Journal of Neuroscience, 26(17), 4526–4534.
Fries, P. (2005). A mechanism for cognitive dynamics: neuronal communication through neuronal coherence. Trends in Cognitive Sciences, 9, 474–480.
Fries, P., Reynolds, J.H., Rorie, A.E., & Desimone, R. (2001). Modulation of oscillatory neuronal synchronization by selective visual attention. Science, 291, 1560–1563.
Fries, P., Womelsdorf, T., Oostenveld, R., & Desimone, R. (2008). The effects of visual stimulation and selective visual attention on rhythmic neuronal synchronization in macaque area V4. The Journal of Neuroscience, 28, 4823–4835.
Gerstner, W. (1995). Time structure of the activity in neural network models. Physical Review E, 51, 738–758.
Goldman, M.S. (2008). Memory without feedback in a neural network. Neuron, 61, 621–634.
Gray, C.M., König, P., Engel, A.K., & Singer, W. (1989). Oscillatory responses in cat visual cortex exhibit inter-columnar synchronization which reflects global stimulus properties. Nature, 338, 334–337.
Gregoriou, G.G., Gotts, S.J., Zhou, H., & Desimone, R. (2009). High-frequency, long-range coupling between prefrontal and visual cortex during attention. Science, 324, 1207–1210.
Henrie, J.A., & Shapley, R. (2005). LFP power spectra in V1 cortex: the graded effect of stimulus contrast. Journal of Neurophysiology, 94, 479–490.
Hubel, D.H., & Wiesel, T.N. (1965). Receptive fields and functional architecture in two nonstriate visual areas (18 and 19) of the cat. Journal of Neurophysiology, 28, 229–289.
Hubel, D.H., & Wiesel, T.N. (1968). Receptive fields and functional architecture of monkey striate cortex. The Journal of Physiology, 195, 215–243.
Jahnke, S., Memmesheimer, R.M., & Timme, M. (2013). Propagating synchrony in feed-forward networks. Frontiers in Computational Neuroscience, 7, 153.
Jahnke, S., Memmesheimer, R.M., & Timme, M. (2014a). Hub-activated signal transmission in complex networks. Physical Review. E, Statistical, Nonlinear, and Soft Matter Physics, 89(3), 030701.
Jahnke, S., Memmesheimer, R.M., & Timme, M. (2014b). Oscillation-induced signal transmission and gating in neural circuits. PLoS Computational Biology, 10(12), e1003940.
Jensen, O., & Lisman, J.E. (2005). Hippocampal sequence-encoding driven by a cortical multi-item working memory buffer. Trends in Neurosciences, 28, 67–72.
Kaissling, K.E., & Priesner, E. (1970). Smell threshold of the silkworm. Naturwissenschaften, 57, 23–28.
Kistler, W.M., & Gerstner, W. (2002). Stable propagation of activity pulses in populations of spiking neurons. Neural Computation, 14, 987–997.
Klausberger, T., Magill, P.J., Marton, L.F., Roberts, J.D., Cobden, P.M., Buzsáki, G., & Somogyi, P. (2003). Brain-state- and cell-type-specific firing of hippocampal interneurons in vivo. Nature, 421, 844–848.
Knight, B.W. (1972). Dynamics of encoding in a population of neurons. The Journal of General Physiology, 59, 734–766.
Knight, B.W. (2000). Dynamics of encoding in a population of neurons: some general mathematical features. Neural Computation, 12, 473–518.
König, P., Engel, A.K., & Singer, W. (1996). Integrator or coincidence detector? The role of the cortical neuron revisited. Trends in Neurosciences, 19, 130–137.
Kremkow, J., Aertsen, A., & Kumar, A. (2010). Gating of signal propagation in spiking neural networks by balanced and correlated excitation and inhibition. The Journal of Neuroscience, 30, 15760–15768.
Kumar, A., Rotter, S., & Aertsen, A. (2010). Spiking activity propagation in neuronal networks: reconciling different perspectives on neural coding. Nature Reviews. Neuroscience, 11, 615–627.
Lisman, J.E., & Idiart, M.A. (1995). Storage of 7±2 short-term memories in oscillatory subcycles. Science, 267, 1512–1515.
Litvak, V., Sompolinsky, H., Segev, I., & Abeles, M. (2003). On the transmission of rate code in long feedforward networks with excitatory-inhibitory balance. The Journal of Neuroscience, 23(7), 3006–3015.
Liu, J., & Newsome, W.T. (2006). Local field potential in cortical area MT: Stimulus tuning and behavioral correlations. The Journal of Neuroscience, 26, 7779–7790.
Livingstone, M.S. (1996). Oscillatory firing and interneuronal correlations in squirrel monkey striate cortex. Journal of Neurophysiology, 66, 2467–2485.
Luczak, A., Barthó, P., Marguet, S.L., Buzsáki, G., & Harris, K.D. (2007). Sequential structure of neocortical spontaneous activity in vivo. Proceedings of the National Academy of Sciences of the United States of America, 104, 347–352.
Markowska, A.L., Olton, D.S., & Givens, B. (1995). Cholinergic manipulations in the medial septal area: age-related effects on working memory and hippocampal electrophysiology. The Journal of Neuroscience, 15, 2063–2073.
Medendorp, W.P., Kramer, G.F., Jensen, O., Oostenveld, R., Schoffelen, J.M., & Fries, P. (2007). Oscillatory activity in human parietal and occipital cortex shows hemispheric lateralization and memory effects in a delayed double-step saccade task. Cerebral Cortex, 17, 2364–2374.
O’Keefe, J. (1993). Hippocampus, theta, and spatial memory. Current Opinion in Neurobiology, 3, 917–924.
Perkel, D.H., & Bullock, T.H. (1968). Neural coding: a report based on an NRP work session. Neurosciences Research Program Bulletin, 6, 219–349.
Pesaran, B., Pezaris, J.S., Sahani, M., Mitra, P.P., & Andersen, R.A. (2002). Temporal structure in neuronal activity during working memory in macaque parietal cortex. Nature Neuroscience, 5, 805–811.
Popescu, A.T., Popa, D., & Paré, D. (2009). Coherent gamma oscillations couple the amygdala and striatum during learning. Nature Neuroscience, 12, 801–807.
Quian Quiroga, R., & Panzeri, S. (Eds.) (2013). Principles of neural coding. London: CRC Press.
Reyes, A.D. (2003). Synchrony-dependent propagation of firing rate in iteratively constructed networks in vitro. Nature Neuroscience, 6(6), 593–599.
Rubin, J.E., & Terman, D. (2004). High frequency stimulation of the subthalamic nucleus eliminates pathological thalamic rhythmicity in a computational model. Journal of Computational Neuroscience, 16, 211–235.
Saalmann, Y.B., Pigarev, I.N., & Vidyasagar, T.R. (2007). Neural mechanisms of visual attention: how top-down feedback highlights relevant locations. Science, 316, 1612–1615.
Salinas, E., & Sejnowski, T.J. (2001). Correlated neuronal activity and the flow of neural information. Nature Reviews. Neuroscience, 2, 539–550.
Seung, H.S., Lee, D.D., Reis, B.Y., & Tank, D.W. (2000). Stability of the memory of eye position in a recurrent network of conductance-based model neurons. Neuron, 26, 259–271.
Shelley, M., & McLaughlin, D. (2002). Coarse-grained reduction and analysis of a network model of cortical response: I. Drifting grating stimuli. Journal of Computational Neuroscience, 12(2), 97–122.
Shelley, M., McLaughlin, D., Shapley, R., & Wielaard, J. (2002). States of high conductance in a large-scale model of the visual cortex. Journal of Computational Neuroscience, 13(2), 93–109.
Shinozaki, T., Okada, M., Reyes, A.D., & Cateau, H. (2010). Flexible traffic control of the synfire-mode transmission by inhibitory modulation: nonlinear noise reduction. Physical Review. E, Statistical, Nonlinear, and Soft Matter Physics, 81, 011913.
Sirovich, L., Knight, B.W., & Omurtag, A. (1999). Dynamics of neuronal populations: the equilibrium solution. SIAM Journal on Applied Mathematics, 60, 2009–2028.
Skaggs, W.E., McNaughton, B.L., Wilson, M.A., & Barnes, C.A. (1996). Theta phase precession in hippocampal neuronal populations and the compression of temporal sequences. Hippocampus, 6, 149–172.
Sohal, V.S., Zhang, F., Yizhar, O., & Deisseroth, K. (2009). Parvalbumin neurons and gamma rhythms enhance cortical circuit performance. Nature, 459, 698–702.
Thiele, A., & Stoner, G. (2003). Neuronal synchrony does not correlate with motion coherence in cortical area MT. Nature, 421, 366–370.
van Rossum, M.C., Turrigiano, G.G., & Nelson, S.B. (2002). Fast propagation of firing rates through layered networks of noisy neurons. The Journal of Neuroscience, 22(5), 1956–1966.
Varga, C., Golshani, P., & Soltesz, I. (2012). Frequency-invariant temporal ordering of interneuronal discharges during hippocampal oscillations in awake mice. Proceedings of the National Academy of Sciences of the United States of America, 109, E2726–E2734.
Vogels, T.P., & Abbott, L.F. (2005). Signal propagation and logic gating in networks of integrate-and-fire neurons. The Journal of Neuroscience, 25, 10786–10795.
Winson, J. (1978). Loss of hippocampal theta rhythm results in spatial memory deficit in the rat. Science, 201, 160–163.
Wolff, S.B., Gründemann, J., Tovote, P., Krabbe, S., Jacobson, G.A., Müller, C., Herry, C., Ehrlich, I., Friedrich, R.W., Letzkus, J.J., & Lüthi, A. (2014). Amygdala interneuron subtypes control fear learning through disinhibition. Nature, 509, 453–458.
Womelsdorf, T., Schoffelen, J.M., Oostenveld, R., Singer, W., Desimone, R., Engel, A.K., & Fries, P. (2007). Modulation of neuronal interactions through neuronal synchronization. Science, 316, 1609–1612.
Yartsev, M.M., Witter, M.P., & Ulanovsky, N. (2011). Grid cells without theta oscillations in the entorhinal cortex of bats. Nature, 479, 103–107.
Yuste, R., MacLean, J.N., Smith, J., & Lansner, A. (2005). The cortex as a central pattern generator. Nature Reviews. Neuroscience, 6, 477–483.
Acknowledgments
L.T. thanks the UC Davis Mathematics Department for its hospitality. This work was supported by the Ministry of Science and Technology of China through the Basic Research Program (973) 2011CB809105 (W.Z. and L.T.), by the Natural Science Foundation of China grant 91232715 (W.Z. and L.T.) and by the National Institutes of Health, CRCNS program NS090645 (A.S. and L.T.). We thank Tim Lewis and Antoni Guillamon for reading and commenting on a draft of this paper.
Action Editor: Nicolas Brunel
Conflict of interest
The authors declare that they have no conflict of interest.
Sornborger, A.T., Wang, Z. & Tao, L. A mechanism for graded, dynamically routable current propagation in pulse-gated synfire chains and implications for information coding. J Comput Neurosci 39, 181–195 (2015). https://doi.org/10.1007/s10827-015-0570-8