Functional Requirements of Small- and Large-Scale Neural Circuitry Connectome Models

We have truly entered the Age of the Connectome, thanks to a confluence of advanced imaging tools, analytical methods such as the various flavors of functional connectivity analysis and inter-species connectivity comparisons, and the computational power to simulate neural circuitry. The interest in connectomes is reflected in the exponentially rising number of articles on the subject. What are our goals? What are the "functional requirements" of connectome modelers? We give a perspective on these questions from our group, whose focus is modeling neurological disorders, such as neuropathic back pain, epilepsy, Parkinson's disease, and age-related cognitive decline, and treating them with neuromodulation.


Electroceuticals and Neuromodulation
The ultimate goal of electroceuticals and neuromodulation is to use electromagnetic fields to modify any component of the central and peripheral nervous system in a predictable way to restore or enhance its normal functionality. Neuromodulation focuses more on restoring functionality from a diseased state, while electroceuticals emphasize using electromagnetic fields to replicate pharmaceutical effects, offering an alternative medical-device route that can cut the cost and time of drug development and regulatory approval by as much as 90%.
In the past decade, as numerical modeling and simulation have become more sophisticated, their role in reducing research and development time and expense has become increasingly clear and valuable [9,10]. The Food and Drug Administration and the American Society of Mechanical Engineers have led the way to formalize the simulation paradigm so that its results and their level of validity are satisfactorily transparent [11,12].

Benefits of Numerical Modeling
The benefits of numerical modeling begin with the ability to predict electromagnetic field distributions and the resulting forces and energy imparted by the field to targeted and non-targeted structures. Modeling provides significant advantages over empirical studies, particularly under the following conditions, which apply pervasively to the nervous system:

1. Where it is difficult or impossible to measure values empirically
2. Where preclinical studies are expensive or impossible
3. In inhomogeneous, anisotropic materials
4. Where material parameters are imprecisely established

Further, modelers can perform extensive "what if" explorations in large parameter spaces, either interactively or in an automated batch mode. Using standards such as those mentioned above, modelers can benefit by exchanging their models to compare results or to use another's model as a starting point for new explorations.

The Role of Simple Versus Complex Models
Freddie Hansen of Abbott Laboratories built over 300 simulation models in 6 years. How did he do it? For most of his models, Hansen uses the COMSOL finite element modeling (FEM) software as a "FEM pocket calculator" with which he builds quick-and-dirty FEMs in a few hours to a few days [13]. These models are designed to answer a single, simple question. Roughly defined parameters and large error bars can be addressed with parameter sweeps across the relevant parameters to outline the possible range of responses and determine sensitivities.
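The sweep-and-bound workflow can be sketched outside any FEM package. In the snippet below, a deliberately toy surrogate function stands in for a FEM solve; the function, parameter names, and ranges are all illustrative and not taken from Hansen's models:

```python
import itertools

def tissue_response(conductivity, pulse_width_ms):
    # Hypothetical lumped surrogate for a FEM result: the activation
    # metric rises with pulse width and falls with shunting conductivity.
    return pulse_width_ms / (1.0 + 5.0 * conductivity)

# Roughly defined parameters -> sweep wide ranges to bound the answer.
conductivities = [0.1, 0.2, 0.4, 0.8]   # S/m, assumed range
pulse_widths   = [0.1, 0.5, 1.0, 2.0]   # ms, assumed range

results = {(s, pw): tissue_response(s, pw)
           for s, pw in itertools.product(conductivities, pulse_widths)}

# The spread of the swept results outlines the possible range of
# responses; comparing along each axis exposes the sensitivities.
lo, hi = min(results.values()), max(results.values())
print(f"response range: {lo:.3f} .. {hi:.3f}")
```

The same pattern scales to automated batch mode by replacing the surrogate with calls into the actual solver.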
In the field, such simple models are often called "sub-models," i.e., they are a study of a component from a larger model. The nervous system, even in simpler model organisms than humans, is so complex that sub-modeling will be an important technique for calibration, validation, and dissecting the functionality of connectomes.
More complex and sophisticated models, such as Hansen's principal model of a heart pump, take weeks and months to build, calibrate, and validate. Generally, much greater emphasis is placed on validation in complex vs. simple models, and far greater time is required to beat the desired calibration and validation behavior out of the model. Hence, calibration of sub-models within connectomes can be an important route to efficiency and model control.
An example of a connectome sub-model is the H-reflex, the relatively simple spinal cord circuit that is the electrically evoked counterpart of the stretch reflex triggered when the doctor taps your knee with a rubber mallet. The connectivity of the monosynaptic H-reflex is well known (Fig. 2), which is not generally the case with more complex circuits; hence the need to start with what is known and calibrate that before venturing into unknown territory. Calibration of the H-reflex involves balancing the connection strengths among the Ia excitatory sensory fiber, the Renshaw cell (RC) inhibitory feedback loop, and the alpha motor neuron such that this mini-circuit replicates its reported ~50 ms refractory time (Fig. 3).
While one or two such mini-calibrations may suffice for a simple model, many may be necessary in a complex model in order to impose sufficient constraints to validate the model. The growing sophistication of new top-down modeling methods is greatly improving the calibration/validation process (see Sect. 2.5).

Ockham's Razor Drives All Modeling
"What can be accounted for by fewer assumptions is explained in vain by more" was a principle frequently invoked by the influential medieval Scholastic thinker, William of Ockham (1285-1349) [14]. Similarly, in modeling, a guiding principle is to make the model no more complex than is required to capture the desired phenomena. In modeling the human brain and spinal cord, one has little choice but to invoke this principle frequently, because the systems are so complex and intertwined. For example, one cannot model the entire brain to capture a given disorder, such as the movement disorder caused by loss of dopaminergic neurons in the substantia nigra pars compacta in Parkinson's disease, in which the central basal ganglia loop sends signals to and receives signals from external centers. In such a model, vast regions of, e.g., cortex and thalamus may be represented as single groups, or as single excitatory and inhibitory pairs [15].

Capturing the Required Level of Detail
Similar to the principle of Ockham's razor, modelers must decide what level of detail they wish to capture and weigh the developmental and computational cost required to capture that level [16,17]. Biology is fundamentally a multi-systems-level discipline, and accordingly, modelers must decide on which systems level to focus [18][19][20]. Concomitantly, though, the systems-level approach gives modelers some flexibility to "axiomatize" or "black-box" elements of the systems level underlying the one at hand, thereby simplifying the model and rendering its behavior more understandable. This approach goes back as far as von Neumann's earliest thoughts on how to model the nervous system [21].

Which Neural Circuitry Software?
Once a decision is at least tentatively made on the systems level and extent of detail, a modeling tool is selected that is designed to capture the target level of detail [22].
By way of example, our connectome models have used a range of tools, each designed to model a different neural systems level and to be computationally efficient for the required tasks; one such model examined fiber threshold changes due to cathodic and anodic stimulation phases to elucidate a hypothesis on traditional low-frequency, "burst," and high-frequency spinal cord stimulation [28,29]. In recent years, neural circuitry software has often been categorized into three model paradigms, all using coupled differential equations. The considerations used to select the appropriate paradigm for modeling the basal ganglia in Parkinson's disease, for instance, are described by Rubin [30]:

1. Activity-based or firing-rate models
2. Integrate-and-fire models
3. Conductance-based models

An older tool, NEURON, has often been used for small neural circuits as well as individual neuron and axon behavior and offers the advantage of transparency, for those who can march up the learning curve required to learn its interface [31].
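Of the three paradigms, the integrate-and-fire family is compact enough to sketch in full. Below is a minimal leaky integrate-and-fire neuron under constant current drive; all parameter values are generic textbook choices, not drawn from any model cited here:

```python
def simulate_lif(i_input, t_end_ms=100.0, dt=0.1,
                 tau_m=10.0, v_rest=-65.0, v_thresh=-50.0, v_reset=-70.0,
                 r_m=10.0):
    """Leaky integrate-and-fire: tau_m * dV/dt = -(V - v_rest) + R*I.
    Forward-Euler integration; returns the list of spike times (ms)."""
    v = v_rest
    spikes = []
    t = 0.0
    while t < t_end_ms:
        dv = (-(v - v_rest) + r_m * i_input) / tau_m
        v += dv * dt
        if v >= v_thresh:       # threshold crossing -> spike and reset
            spikes.append(t)
            v = v_reset
        t += dt
    return spikes

# Suprathreshold drive fires regularly; subthreshold drive never fires.
print(len(simulate_lif(2.0)), len(simulate_lif(1.0)))
```

Conductance-based models replace the fixed threshold with voltage-gated channel dynamics, while firing-rate models abstract away spikes entirely, which is why the choice of paradigm sets both the achievable detail and the computational cost.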
However, as the need to model larger connectomes and longer time series is growing rapidly, newer methods are being developed [32][33][34][35][36]. Other driving forces of high-efficiency model paradigms are "biomimetic" or "bio-inspired" systems and the prospect of interfacing neural prostheses with biological neural circuits [37][38][39][40]. Such paradigms seek to marry software and hardware requirements ("algorithm-hardware co-design") to produce ultimate computational efficiency [41,42]. These newer approaches are likely to render the current methods obsolete within a decade.

Fig. 4 Rheobase, the minimum threshold of a nerve fiber under sustained stimulation, and chronaxie, the time constant of the fiber standardized at twice the rheobase stimulus strength, together determine the fiber threshold under stimuli of any length in passive fiber modeling. Rheobase may be relative to a given empirical model, or absolute, measured by replicating a model in finite element software and measuring the electric field gradient along a virtual fiber [26]
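The rheobase and chronaxie quantities defined in the Fig. 4 caption are often combined in a closed-form strength-duration curve; one common choice is the Lapicque (hyperbolic) form, under which the threshold at a pulse width equal to the chronaxie is exactly twice the rheobase. A sketch with illustrative parameter values:

```python
def lapicque_threshold(pulse_width_ms, rheobase_mA, chronaxie_ms):
    """Lapicque strength-duration curve: I(PW) = I_rh * (1 + t_ch / PW).
    Threshold diverges for very short pulses and approaches the
    rheobase as the pulse becomes arbitrarily long."""
    return rheobase_mA * (1.0 + chronaxie_ms / pulse_width_ms)

i_rh, t_ch = 1.5, 0.3   # illustrative rheobase (mA) and chronaxie (ms)

# By definition, at PW == chronaxie the threshold is twice the rheobase.
assert abs(lapicque_threshold(t_ch, i_rh, t_ch) - 2 * i_rh) < 1e-12

for pw in (0.05, 0.1, 0.3, 1.0, 5.0):
    print(pw, round(lapicque_threshold(pw, i_rh, t_ch), 3))
```

The Weiss (linear charge) form is an alternative fit; which form matches a given fiber is an empirical question, which is why rheobase may be relative to a particular empirical model.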

Initial Conditions
There are no firm guidelines for setting up the initial conditions for a neural circuit, which remains more of an art than a science. One can start with randomized connectivity and connection strengths, for instance, in the absence of any knowledge of specific connectivity and connection strength, i.e., a high-entropy configuration, and impose calibrations that constrain the circuit, reducing its entropy, until it sends its "message," i.e., behaves according to a desired specification.
An example of a "middle ground" initial conditions approach would be specifying coarse-grained connectivity that is known from the literature and using heuristics to ensure some reasonable performance or, more likely, to avoid unreasonable performance. One such heuristic is setting connection strengths from inhibitory to excitatory groups and from excitatory to inhibitory groups incrementally stronger than those from excitatory to excitatory groups and from inhibitory to inhibitory groups, such that the circuit does not initially slip into hyperactivity. A similar heuristic is lowering the ratio of excitatory to inhibitory connections such that stability is initially ensured. A third technique is imposing a hypothesized exogenous "black box" inhibitory input to stabilize the circuit. Any highly recurrent circuit topology requires stabilizing, negative-feedback heuristics of this genre [6]. We used the first and third techniques in our human spinal cord connectome, in which the exogenous black box represented known but unquantified descending inhibitory control from higher brain centers.
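The first of these heuristics can be illustrated as an initialization routine for a random weight matrix. Population sizes, the base weight scale, and the cross-population boost factor below are all assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_weights(n_exc=80, n_inh=20, base=0.1, cross_boost=1.5):
    """Random initial weights (rows = presynaptic, cols = postsynaptic)
    following the stabilizing heuristic described: E->I and I->E
    connections start incrementally stronger than E->E and I->I, so the
    circuit does not initially slip into hyperactivity."""
    n = n_exc + n_inh
    w = rng.uniform(0.0, base, size=(n, n))
    w[:n_exc, n_exc:] *= cross_boost   # boost E -> I block
    w[n_exc:, :n_exc] *= cross_boost   # boost I -> E block
    w[n_exc:, :] *= -1.0               # inhibitory outputs are negative
    np.fill_diagonal(w, 0.0)           # no self-connections
    return w

w0 = init_weights()
```

A "black box" exogenous inhibitory input would enter the simulation loop as an extra negative bias term rather than as part of this matrix.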
Our human spinal cord connectome model entailed an all-out attempt to perform a bottom-up calibration by culling from the literature all topology, including source peripheral fibers or neural groups, their targets (specific Rexed laminae, i.e., the gray matter of the spinal cord), connection strengths, and neurotransmitters [6]. The end result was that the overall model was under-constrained due to the dearth of complete data, in particular on connection strengths. Our conclusion was twofold. First, the model was valuable as a starting template for calibrating the innumerable local circuitry models that embody the spinal cord connectome. Second, while advances in imaging techniques foreshadowed the possibility of complete bottom-up calibration at some point in the future, for practical purposes, top-down calibration, e.g., by replicating circuit behavior (input-output specifications), would be required in connectome modeling.

Calibration and Validation
The general calibration/validation procedure is to cull a set of calibrations for a given circuit from the literature and add calibrations one by one until the model predicts one or more of the remaining calibrations with reasonable accuracy, i.e., is validated against the remaining empirical criteria. Calibration can be done iteratively by hand, which can be painstakingly slow and tedious, but quite educational as to how neural circuitry behaves in general, how the specific software implementing the circuitry behaves, and how the specific circuit and its components affect each other.
On the other hand, calibration of a neural circuit can be performed automatically by an optimization program using one or more objective functions, with error tolerances in the objective criteria as the terminating conditions of a "while" loop. Such automated batch processing may be the only expeditious way to program even a moderately sized circuit.
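Such a terminating while loop can be sketched with a toy one-parameter "circuit." The proportional update rule, tolerance, and surrogate model below are illustrative stand-ins for a real optimizer driving a real simulation:

```python
def calibrate(model, target, max_iter=1000, tol=1e-3, lr=0.05):
    """Automated calibration sketch: adjust a single connection strength
    until the model's output matches a target criterion, with the error
    tolerance as the terminating condition of the while loop."""
    weight = 0.5                       # arbitrary starting strength
    iterations = 0
    error = target - model(weight)
    while abs(error) > tol and iterations < max_iter:
        weight += lr * error           # simple proportional update
        error = target - model(weight)
        iterations += 1
    return weight, error, iterations

# Toy "circuit": output firing rate grows monotonically with weight.
toy_model = lambda w: 40.0 * w / (1.0 + w)

w_cal, err, n_iter = calibrate(toy_model, target=25.0)
print(w_cal, err, n_iter)
```

In practice the objective would compare simulated circuit behavior (e.g., a refractory time or firing pattern) against an empirical calibration criterion, and the update step would come from a proper optimization library.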
As stated, rather than using bottom-up calibration, the field is moving toward top-down calibration utilizing a variety of empirical techniques (e.g., imaging techniques, electro- and magnetoencephalography), which, in conjunction, constrain models more than has been possible to date, notably in diseased-state modeling [43,44]. The lower-level connectivity, connection strengths, variety of individual neural processing, and axonal delays are reflected in the emergent high-level behavior [45,46].
For example, to uncover connectivity and connection strength, fiber-tracing studies and identifying where on the dendritic tree an axon synapses have been replaced with diffusion-weighted magnetic resonance imaging (MRI) and blood-oxygen-level-dependent (BOLD) data, from which the "functional" connectivity of the network is inferred [47,48]. In a rapidly evolving paradigm, variations on this technique are utilized with older methods to offer the modeler a menu of possible calibration/validation datasets (Tables 1, 2, 3, and 4). Disease "signatures" (biomarkers) in the new paradigms can be used to calibrate models of diseased vs. healthy states and to measure the effects of neuromodulation, electroceutical, or pharmaceutical techniques to restore the healthy state [24, 49-52].

Table 1 Evolving methods of top-down calibration [43]
- Resting-state functional connectivity network behavior
- Dynamic functional connectivity network behavior
- Resting-state fMRI oscillations
- Brain rhythm relationships (e.g., inverse α-rhythms)
- Excitation-inhibition balance
- Spike-firing patterns and fMRI on short and long time scales
- fMRI power-law scaling

fMRI, functional magnetic resonance imaging
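The inference step can be illustrated in miniature: given BOLD-like time series, one simple and common estimate of functional connectivity is the pairwise correlation matrix. The synthetic signals below are illustrative only, not real imaging data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical BOLD-like time series for 4 regions over T timepoints.
# Regions 0 and 1 share a common driving signal; 2 and 3 are independent.
T = 200
shared = rng.standard_normal(T)
signals = np.stack([
    shared + 0.3 * rng.standard_normal(T),   # region 0
    shared + 0.3 * rng.standard_normal(T),   # region 1 (coupled to 0)
    rng.standard_normal(T),                  # region 2 (independent)
    rng.standard_normal(T),                  # region 3 (independent)
])

# "Functional" connectivity here = pairwise Pearson correlation matrix;
# strong off-diagonal entries flag functionally coupled regions.
fc = np.corrcoef(signals)
print(fc.round(2))
```

Dynamic functional connectivity methods extend this by computing such matrices over sliding windows, yielding the time-varying calibration targets listed in Table 1.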

The Functional Requirements
At a high level, the following are required to efficiently make valuable connectome models:

1. User-friendly modeling software for the correct systems level
2. Standardized parts list (axon and neuron types)
3. Assembly instructions (topography and weights)
4. Methods of calibration, validation, and their associated datasets to use in those processes

Table 2 Type of modeling software used for the relevant neural systems level and time scale

Model software       Time scale (s)      Typical application
Molecular dynamics   10^-12 to 10^-9     Fine-grained ion channel dynamics
Active fiber         10^-6 to 10^0       Nerve fiber activation and blocking thresholds detailed at the μs/ms ion-channel level
Passive fiber        10^-3 to 10^3       Fast, black-box nerve fiber activation and blocking thresholds
Neural circuitry (a) 10^1 to 10^3+       Resting state and dynamic functional connectivity of normal, diseased, and neuromodulated circuits

(a) For more details on neural circuitry software types, see Sect. 2.6, "Which Neural Circuitry Software?"

Table 3
- Import user-specific waveform
- Import user-specific voltage-current (current-voltage) curve for neuron types
- Import calibration-validation connectivity data

Table 4 Idealized requirements for connectome models designed to inform medical device and drug development
- Peripheral fiber groups involved, their receptor type, diameter, conduction speed, numbers, and activation and blocking thresholds
- Source and target neuron groups, their neurotransmitters and receptors, known connectivity, and axonal delays
- Resting state, dynamic state, and their variational methods of measuring functional connectivity in healthy and diseased circuits
- Tissue parameters: geometry, conductivity, permittivity, and permeability
- Medical device parameters: geometrical electrode array and waveform characteristics, including anodic, cathodic, and rest phase pulse width and shape, frequency, and duty cycle
- Drug effects on neural system targets such as peripheral fiber types and central neural groups
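As a sketch of what a "standardized parts list" and "assembly instructions" might look like in software, the following hypothetical data structures mirror a few of the Table 4 requirements; all type names, field names, and values are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class FiberGroup:
    """One entry in a hypothetical connectome 'parts list'."""
    name: str
    receptor_type: str
    diameter_um: float
    conduction_speed_m_s: float
    count: int

@dataclass
class Connection:
    """One entry in the hypothetical 'assembly instructions'."""
    source: str
    target: str              # e.g., a specific Rexed lamina or neuron group
    neurotransmitter: str
    strength: float          # the chronically under-reported parameter
    axonal_delay_ms: float

# Illustrative entries only; values are not literature-derived.
parts = [FiberGroup("Ia afferent", "proprioceptive", 14.0, 80.0, 1000)]
wiring = [Connection("Ia afferent", "alpha motor neuron",
                     "glutamate", 0.8, 0.5)]
```

A standardized schema of this kind is what would let modelers exchange parts lists and wiring between tools, per the model-sharing benefit noted earlier.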

Conclusion
The "connectome" has come of age and is now at a stage similar to where the genome was 15 years ago. The great advances and benefits loom ahead. Yet the functional requirements to build connectomes are already known. Connectome simulation is fast and cheap compared with hardware prototyping or with in vitro and in vivo investigation of how the neural system works. Connectome modeling of neuromodulation and electroceutical action on the nervous system will lead to an explosion of targets and means for greater control over disease and disorder treatment and for enhanced neural behavior.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made. The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.