Robustness in Neural Circuits

Complex systems are found everywhere – from scheduling to traffic, food to climate, economics to ecology, the brain, and the universe. Complex systems typically have many elements, many modes of interconnectedness of those elements, and often exhibit sensitivity to initial conditions. Complex systems by their nature are generally unpredictable and can be highly unstable.

This article examines the analytical basis of the entropy-robustness theorem and invokes studies of genetic regulatory networks to provide empirical support for the correlation between complexity and stability. Earlier investigations based on numerical studies of random matrix models and the notion of local stability led to the claim that complex ecosystems tend to be more dynamically fragile. This article elucidates the basis for that claim, which is largely inconsistent with the empirical observations of Bernard, Cannon, and Elton. Our analysis thus resolves a long-standing controversy regarding the relation between complex biological systems and their capacity to recover from perturbations. The entropy-robustness principle is a mathematical proposition with implications for understanding the basis for the large variances in stability observed in biological systems that have evolved under different environmental conditions.

Stability is characterized by measures of how perturbing initial conditions or default parameters results in permanent deviation from baseline behavior vs a tendency to return toward initial conditions (e.g., a basin of attraction) [7–9]. Biological robustness is the ability of a system to recover when its natural healthy equilibrium is disturbed into deleterious regions of its parameter space [7, 10–13]. Mathematical analyses in nonlinear dynamical systems theory provide a theoretical foundation for robustness theory [14–16]. Since the advent of control theory, engineers have emphasized negative feedback as a restoring mechanism [17–20]. Much more empirical work at many biological systems levels is needed to marry theory and practice [13, 21–26].
On the practical side, by way of example, the medical and neuroscience literature has many examples where thresholds, numbers of neurons, synapses, locations, branching, conductivities, capacitances, impedances, time-varying dynamics, and so forth are inaccurate, unknown, or even differ across publications. Often, sensitivity analysis is critical to understanding the underlying dynamics, and the system must compensate for sensitivity to maintain stability and robustness [27]. In this regard, the complexity of neural circuitry may compensate for large error bars in parameter accuracy [28–31].
For practical and theoretical interest, we sought to examine stability and robustness vs complexity in neural circuitry parameter spaces using Monte Carlo simulation.

Node Parameters at Several Levels of Systems Granularity
Using the universal neural circuitry simulation (UNCuS) software [32–34], we specified a subset of ten neuron parameters for sample spaces composed of different numbers of populations, groups, and numbers of cells (neurons). To globally sample the combined parameter space, we used Monte Carlo methods (Table 1) [27]. Out of a space of ~5 × 10^11 parameter configurations, we sampled 5 × 10^3 combinations. UNCuS and the Monte Carlo sampling program were written by us in C++ (Microsoft, Redmond, WA, USA) and Java (Oracle Corp., Redwood City, CA, USA). UNCuS was designed specifically to perform computationally efficient neural circuitry simulations for large as well as small circuits. The program integrates essential behavior at the underlying systems level of ion channel conductance and dendritic tree emission current and scales well to projects of the current size [32, 35].
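As a concrete illustration of this kind of Monte Carlo sampling, the sketch below draws each neuron parameter uniformly from a (min, max, step) grid, as in Table 2. This is our own minimal reconstruction, not the actual C++/Java sampling program; the function name and the subset of parameters shown are illustrative only.

```python
import random

# Illustrative subset of the ten sampled parameters, as (min, max, step):
PARAM_RANGES = {
    "Gk":  (5.0, 20.0, 1.0),   # rectifying potassium conductance, nS
    "Gb":  (5.0, 20.0, 1.0),   # "lumped" ion channel conductance, nS
    "Th0": (2.0, 20.0, 1.0),   # threshold transmembrane potential, mV
}

def sample_configuration(rng):
    """Draw one parameter configuration on the (min, max, step) grid."""
    config = {}
    for name, (lo, hi, step) in PARAM_RANGES.items():
        n_steps = int((hi - lo) / step)
        config[name] = lo + rng.randint(0, n_steps) * step
    return config

rng = random.Random(0)                # seeded for reproducibility
samples = [sample_configuration(rng) for _ in range(5000)]  # 5 x 10^3 draws
```

With all ten parameters and the population/cell-count choices included, the full grid reaches the ~5 × 10^11 configurations quoted above, of which only 5 × 10^3 are sampled.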
The number of populations ranged from two to nine, and each population always had just two groups of neurons, one excitatory and one inhibitory. Groups within a population were never connected to each other but were always connected to all other groups in the other populations, with the same density. What was varied in the configurations was the number of (1) cells per group and (2) neuron parameters of the cells. Figure 1 shows screenshots from UNCuS of example configurations.
We created 50 separate project configurations with identical numbers of cells per group in each case. Initially we looked at the extremes of the parameter space, 7 configurations with 2 populations and 7 configurations with 9 populations, totaling 14 configurations. Then we looked at the intermediate numbers of populations and groups using 6 configurations each with 3, 4, 5, 6, 7, and 8 populations, totaling 36 configurations.
To set the varying number of cells in each population, the program randomly selected a number of cells from the set {5, 10, ..., 145, 150}.
The number of connections from a source to a target group was fixed at 40%; for example, from a source group of any size to a target group of, e.g., 100 neurons, each source neuron connected to 40 of the target group's neurons. Groups never projected back to themselves or to the other group in their population, only to exogenous target groups in other populations (Fig. 1); hence, in calculating the total number of connections, we subtract one from the total number of populations in each case.
At each target cell, the axon synapsed on two of ten electrotonic compartments in the middle of the dendrite, specifically compartments four and five (see Limitations).
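The projection and synapse bookkeeping described above can be sketched as follows. This is a simplified reconstruction of the counting rule, not UNCuS code; `total_projections` and its arguments are our own names.

```python
def total_projections(pop_sizes, density=0.40):
    """
    pop_sizes: list of (n_excitatory, n_inhibitory) cell counts, one pair
    per population. Each group projects to every group in every *other*
    population (never to itself or its sibling group), with each source
    neuron connecting to 40% of the target group's neurons. Each axonal
    projection synapses on two dendritic compartments (#4 and #5), so
    synapses = 2 * projections. Returns (projections, synapses).
    """
    groups = [(p, n) for p, pair in enumerate(pop_sizes) for n in pair]
    projections = 0
    for p_src, n_src in groups:
        for p_tgt, n_tgt in groups:
            if p_src != p_tgt:                   # only exogenous targets
                projections += n_src * round(density * n_tgt)
    return projections, 2 * projections
```

For example, two populations of 10 excitatory and 10 inhibitory cells each yield 320 projections and 640 synapses under this rule.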

Neuron Cell Parameters
Just as each project case had a random number of cells in each group, the sampling program also randomly selected a set of neuron cell parameters. In UNCuS, there are 12 parameters characterizing a neuron type. We excluded refractory time and spike width, leaving 10 cell parameters to select from. The parameters, their default baseline values, and the value ranges subjected to random sampling are shown in Table 2. All cells in all groups in each case used the same random selection of cell parameters. For the equations governing cell behavior, see Arle et al. [32, 33].
In previous projects, we usually drove a circuit with specific stimuli on specific cell groups such that their initial firing rate corresponded to what was reported in the literature. Here we raised background noise to a level that activated all circuits, specifically 1.7 nA, which we found to be sufficient.

Fig. 1: Four sample project configurations from the universal neural circuitry simulation software UNCuS. Neuron populations (circles) contain numbered groups, with connections between groups identified as excitatory (blue) or inhibitory (green). Each neuron in each group connected to 40% of the neurons in the groups it targeted. The total number of synapses can be thought of abstractly as the total nodes in the complex systems literature.
A plot of a typical spiking neuron's membrane voltage over time is shown in Fig. 2.

Dynamic Adjustment of Input Amplitude
The program dynamically condensed the input amplitude to each cell from all incoming excitatory or inhibitory post-synaptic potentials (EPSPs, IPSPs) to a normalized range between −5 and +5 nA. The reasoning here is that in actual neurons there is some range beyond which stimulating or inhibiting the cell has no greater effect. Moreover, in modeling and simulation, there is often a range of a parameter over which the behavior of the simulated entity has been validated, while beyond that range behavior is unknown and may be unpredictable. Essentially, the method was to calculate the maximum possible input current for a given project case based on UNCuS' dendritic emission current look-up table, the known dendritic compartments (#4 and #5), and the known number of inputs (40%, e.g., 40 per 100 cells), and then, at each time step, multiply each cell's input by the factor required to condense the scale to −5 to +5 nA.

Table 2 (excerpt): parameter, sampling range (min, max, step), and default value.

      Reversal potential of the rectifying potassium conductance  (−105, −25, 1)   −65 mV
  Eb  Reversal potential of basic channel                                          −65 mV
  Gk  Rectifying potassium conductance                            (5, 20, 1)       10 nS
  Gb  "Lumped" ion channel conductance                            (5, 20, 1)       10 nS
      Delayed rectifier potassium conductance strength            (100, 1000, 1)   500 nS
  Th0 Threshold transmembrane potential                           (2.0, 20, 1)     12 mV
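A minimal sketch of the condensing step, assuming the maximum possible input current for the project case has already been computed (the function and parameter names here are ours, not from UNCuS):

```python
def condense_input(psp_sum_nA, max_possible_nA, limit_nA=5.0):
    """
    Linearly rescale a cell's summed EPSP/IPSP input so that the maximum
    possible input for the project case maps to +/- limit_nA (here 5 nA).
    In the actual method, max_possible_nA is derived from the dendritic
    emission current look-up table, the synapsed compartments (#4, #5),
    and the number of inputs; here it is simply passed in.
    """
    scale = limit_nA / max_possible_nA
    return psp_sum_nA * scale
```

Because the scaling is linear, relative differences between excitatory and inhibitory drive are preserved; only the overall amplitude is bounded.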

Simulation Duration, Time Step, and Calculation of Firing Rates
All simulations were run for 1000 ms in 0.25 ms time steps, which we have found to give stable cell behavior, often after an initial transient period. We defined a "momentary firing rate" by moving a window of 100 ms, time step by time step, through the 1000 ms, taking the total number of spikes in each interval, and dividing that number by the interval length, 100 ms, to give spikes/second. This method discards the first 100 ms, which, as said above, often involved a transient period transitioning to steady state behavior. The program recorded the spikes for each neuron, took the momentary firing rate, and averaged over all cells in a group to give a group firing rate, which underlies the data shown in Results.
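The moving-window calculation above can be sketched as follows; this is our illustration of the definition, not the recording code in UNCuS.

```python
def momentary_firing_rate(spike_times_ms, t_end_ms=1000.0, dt_ms=0.25,
                          window_ms=100.0):
    """
    Slide a 100 ms window through the simulation, time step by time step,
    and convert the spike count in each window to spikes/second.
    Returns a list of (window_end_time_ms, rate_hz). No complete window
    ends before 100 ms, so the initial transient is discarded.
    """
    rates = []
    t = window_ms
    while t <= t_end_ms + 1e-9:
        count = sum(1 for s in spike_times_ms if t - window_ms < s <= t)
        rates.append((t, count / (window_ms / 1000.0)))
        t += dt_ms
    return rates
```

Averaging these per-cell rates over all cells in a group gives the group firing rate used in Results.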

Definition of "Robustness" via Coefficient of Variance (CV)
We used the coefficient of variance (aka "variation") (CV) as a metric for "robustness," where the standard deviation is divided by the firing rate of the baseline neuron cell instead of by the mean of the Monte Carlo sampled data:

$$\mathrm{CV} \;=\; \frac{1}{FR_{\mathrm{base}}}\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(FR_i - FR_{\mathrm{base}}\right)^{2}} \qquad (1)$$

where n = 50, the number of population-group configurations (each with a random set of 10 cell parameters), FR_i is the firing rate of a given configuration, and FR_base is the firing rate with the baseline neural parameters. Thus, the baseline parameters, shown in Table 2, established a firing rate against which the deviation of all Monte Carlo sampled configurations would be checked.
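A direct transcription of this metric, under our reading that the deviations are taken from the baseline rate and normalized by it:

```python
from math import sqrt

def coefficient_of_variance(sampled_rates, baseline_rate):
    """
    Robustness metric: root-mean-square deviation of the sampled firing
    rates from the baseline firing rate, normalized by the baseline rate
    rather than by the sample mean (the adaptation described in the text).
    """
    n = len(sampled_rates)
    rms = sqrt(sum((fr - baseline_rate) ** 2 for fr in sampled_rates) / n)
    return rms / baseline_rate
```

With this normalization, CV < 1 means the spread of the sampled firing rates is smaller than the baseline firing rate itself.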

Definition of "Robustness" via an Adapted Lyapunov Exponent
To give a second perspective on how the level of complexity affects the deviation of a neural system from its baseline performance, we adapted a standard formula for the Lyapunov exponent, as was done with the definition of CV. Thus, the baseline neuron cell firing rate was used as the initial condition against which deviations over time were compared in calculating the Lyapunov exponent (LE) [35, 36]:

$$\mathrm{LE} \;=\; \frac{1}{s\,\tau}\sum_{k=1}^{s}\ln\!\left(\frac{FR_k}{FR_{\mathrm{base},k}}\right) \qquad (2)$$

where τ is the time step, FR_k is the momentary firing rate of the sampled parameter point at time step k, FR_base,k is the baseline firing rate at time step k, and s is the total number of time steps. When the sampled configuration firing rate is equal to that of the baseline, there is no divergence and LE = 0.
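The adapted exponent can be transcribed as below; this sketch assumes strictly positive firing rates at every step (the names are ours).

```python
from math import log

def lyapunov_exponent(sample_rates, baseline_rates, dt_s=0.25e-3):
    """
    Adapted Lyapunov exponent: mean log-ratio of the sampled momentary
    firing rate to the baseline firing rate over all time steps, divided
    by the time step dt_s. Equal rates at every step give LE = 0.
    """
    s = len(sample_rates)
    total = sum(log(fr / fr_b)
                for fr, fr_b in zip(sample_rates, baseline_rates))
    return total / (s * dt_s)
```

Positive LE then indicates sampled rates running above baseline, negative LE below; the magnitude reflects how quickly the two time courses diverge.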

Cumulative Firing Rate vs Momentary Firing Rate
We calculated cumulative firing rate (CFR) as a global reflection of the circuitry in time, as distinct from the momentary firing rate. The CFR is defined at time t as the total number of spikes up to t divided by t to give spike rate/second.
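As a minimal sketch of this definition (our own transcription):

```python
def cumulative_firing_rate(spike_times_ms, t_ms):
    """Total number of spikes up to time t, divided by t in seconds,
    giving a cumulative rate in spikes/second."""
    count = sum(1 for s in spike_times_ms if s <= t_ms)
    return count / (t_ms / 1000.0)
```

Unlike the momentary firing rate, which reflects a single 100 ms window, the CFR at time t integrates the whole history of the circuit up to t.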

Limitations
There are many other ways to vary neural circuit parameters than those we chose to explore. Notable among them are the many different topologies of circuits connected to perform a specific behavior. Of those, we did not, for instance, explore circuits that must be stable to perform their function, such as clocks and rhythm generators, or that evolved to become emergently stable, such as the various networks revealed by functional connectivity analyses [37]. The space of connection strengths could also be explored more fully. Our past experience building small- and large-scale connectome models [34, 38, 39] and preliminary results in this study showed that, at the connection strength extremes, very low or high connection density and/or few distal dendritic compartment connections or many proximal dendritic compartment connections gave either no activity or hyperactivity, respectively. On the other hand, one can build circuits with, e.g., high connection density in some parts of the circuit if they are regulated by other parts of the circuit; an effectively infinite number of such specific circuits is possible.
Other neural circuitry simulation software exists, and the variety is evolving rapidly, driven by rapidly increasing interest in building connectome models (see Carlson et al. elsewhere in this book [24]).

Results
In general, in non-quiescent circuits, the momentary firing rate, which characterizes the system's short-term behavior in the time domain, was never constant but dynamic. For the robustness measures, the behavior of excitatory and inhibitory groups was similar, so only results for excitatory groups are shown below.

Sample Time Course of Firing Rate of Two Population-Group Configurations

Firing rates in the more complex system are higher than in the simple one due to recurrent input from a higher number of cells. The outermost curve in the left case implies that a particular set of ten parameter values is more sensitive than the others, i.e., one of the ten parameters may be more sensitive than the rest. CV in the simple system is 0.754 and in the more complex system 0.236. Outlier parameter configurations easily seen in the simpler model are also present in the complex model but are to some extent "tamed" by the higher connectedness and number of circuit elements. One effect is that in more complex systems, errant random inputs can cancel each other.

Plots of Firing Rate of All Sample Points vs Baseline Parameters
The three plots in Fig. 4 show all data and examine robustness across increasing complexity measured by the numbers of several different circuit elements.

Robustness vs Number of Elements as Measured by Coefficient of Variance (CV)
The next plots (Fig. 5) show the CV (Eq. 1) of Monte Carlo sample firing rates vs complexity parameters, compared to the baseline configuration (red line). Note that we take the baseline firing rate as the "expected value," and therefore CV < 1 means the standard deviation of the sampled firing rates is smaller than the firing rate of the baseline (Eq. 1). Each point shows the CV over 50 population configurations, each with a randomly selected set of neuron parameters, and each graph shows 100 such points. Since the total synapses and the combined complexity are linearly related (in our study, synapses are two times the number of projections, since each projection synapsed on dendritic compartments 4 and 5), their distributions of CV are similar, and thus data for the number of synapses are omitted.

Robustness vs Number of Elements as Measured by Lyapunov Exponent (LE)
Here we used a different measure of robustness, the Lyapunov exponent (LE, Eq. 2), based on how the firing rates of sampled population-group-neuron parameter points diverged over time from those of the baseline neuron parameter case. The results for different levels of complexity were quite similar to the previous results using CV as the metric. Figure 6 shows LE against numbers of populations, cells, and projections. Generally, firing rates decreased over time as steady states were reached in the circuits. Firing rates of sampled configurations were generally higher than those of baseline cases, and divergence decreased with the total number of projections in the system; that is, divergence decreased with increasing system complexity.

Robustness vs Number of Elements as Measured by Cumulative Firing Rate (CFR)
The cumulative firing rate (CFR) can be compared to momentary firing rate as a global reflection of the circuitry behavior over time instead of at a given point in time. Figure 7 shows CFR for total populations, total cells, total projections, total projections' divergence from baseline (LE), and total populations' divergence from baseline (CV). Results were similar to simulations shown above.

Key Results
The data suggest that neural circuits become increasingly robust as their complexity increases, and do so with a relatively small amount of complexity compared to the number of neurons in various organisms. Synapse totals in the study ranged from 160 to 5.18 × 10^6; recall that a 3-year-old human has an estimated 10^15 synapses in the entire brain.
Thus, the inherent nature of a moderately complex and sufficiently interconnected neural circuit makes local parameter changes unlikely to cause significant changes in circuit behavior. Given sufficient connectivity, robustness increases as the number of elements increases and may be sustained once it reaches a transition window. In our simulations, where the elements were neurons, and other parameters being equal, this transition window to maximal robustness was in the range of 100–1000 cells.
The number of individual synapses appeared to have the least effect on achieving robustness compared to the number of cells or axonal projections.

Robustness and Degeneracy in Biological Systems
Our pragmatic interest is in applying the results of the study and related work to faithfully modeling neurological disorders, such as Parkinson's disease, epilepsy, neuropathic pain, and others, and how to treat them with neuromodulation.
Biological systems have evolved adaptive homeostasis to increase their robustness; conversely, loss of adaptive homeostasis can result in disease and disorder and is a hallmark of aging [1, 2, 40–45]. More broadly, the term degeneracy in complex systems theory has been applied to biological systems to describe a decline in their ability to maintain functional integrity [46], and the application of degeneracy analysis to biology, and to neural circuits in particular, is likely to play a major role in understanding neurological disorders and aging and their treatment [47].

Robustness and Degeneracy in Functional Connectivity Brain Networks
In the past two decades, functional connectivity (FC), the correlated activity between brain regions as identified by imaging and by monitoring the brain's electrical activity, has become a major paradigm for understanding healthy and diseased states. The "resting state" and time course of FC networks can constitute signatures of healthy vs diseased conditions. Degeneracy of the healthy FC is identified with the diseased state. In many cases, modelers of neurological disorders and disease may find that loss of system complexity underlies diseased states. For instance, the reason why, in Parkinson's disease, up to 80% of the dopaminergic neurons of the substantia nigra are lost before movement disorders manifest is unknown. Degeneracy via loss of complexity and connectedness in the basal ganglia may be part of the mechanism.

Inadvertent Modeling Error Due to Scaling
Robustness helps illuminate possible errors in modeling connectomes when scaling the number of elements. When building a complete model of the human spinal cord connectome [34], we found that Rexed lamina II had ~1300 times the number of neurons that lamina I had. Thus, in scaling down the total number of neurons to make the model more computationally efficient, we inadvertently reduced the robustness of the model, which showed up as abnormally high firing rates in lamina I, and we had to scale it back up to correct for that. Such an inadvertent effect could equally invalidate many neural circuit models, notably models of seizure.

Conclusion
The complexity of biological systems, notably neural circuitry, as defined by the number of elements (neurons) and their connectivity (synaptic topology), is a critical factor in their robustness: the degree to which, when perturbed from healthy states toward diseased or disordered states, they inherently tend to return to the healthy state. We simulated neural circuitry complexity and connectivity over a large parameter space via Monte Carlo sampling. Our results bear out the theory that more complex, connected systems can be inherently more stable than simpler systems. These results have implications for modeling neurological disorders, such as Parkinson's disease, chronic pain, seizure, and age-related cognitive decline, and for how to treat these conditions with neuromodulation.