Abstract
We introduce a method for systematically reducing the dimension of biophysically realistic neuron models with stochastic ion channels by exploiting timescale separation. Based on a combination of singular perturbation methods for kinetic Markov schemes with recent mathematical developments of the averaging method, the techniques are general and applicable to a large class of models. As an example, we derive and analyze reductions of different stochastic versions of the Hodgkin–Huxley (HH) model, leading to distinct reduced models. The bifurcation analysis of one of the reduced models, with the number of channels as a parameter, provides new insights into some features of noisy discharge patterns, such as the bimodality of the interspike interval distribution. Our analysis of the stochastic HH model shows that, besides reducing the number of variables of neuronal models, our reduction scheme is a powerful method for understanding the impact of fluctuations due to finite-size effects on the dynamics of slow-fast systems. Our analysis of the reduced model reveals that decreasing the number of sodium channels in the HH model leads to a transition in the dynamics reminiscent of the Hopf bifurcation, and that this transition accounts for changes in characteristics of the spike train generated by the model. Finally, we also examine the impact of these results on neuronal coding, notably discharge-time reliability and spike latency, showing that reducing the number of channels can enhance discharge-time reliability in response to weak inputs and that this phenomenon can be accounted for through the analysis of the reduced model.
References
Arnold, V. I. (1983). Geometrical methods in the theory of ordinary differential equations. New York: Springer.
Cecchi, G. A., Sigman, M., Alonso, J. M., Martinez, L., Chialvo, D. R., & Magnasco, M. O. (2000). Noise in neurons is message dependent. Proceedings of the National Academy of Sciences, 97(10), 5557–5561.
Chow, C. C., & White, J. A. (1996). Spontaneous action potentials due to channel fluctuations. Biophysical Journal, 71(6), 3013–3021.
Faggionato, A., Gabrielli, D., & Crivellari, M. R. (2010). Averaging and large deviation principles for fully coupled piecewise deterministic Markov processes and applications to molecular motors. Markov Processes and Related Fields, 16(3), 497–548.
Gillespie, D. T. (1977). Exact stochastic simulation of coupled chemical reactions. The Journal of Physical Chemistry, 81(25), 2340–2361.
Hille, B. (2001). Ion channels of excitable membranes. Sunderland, MA: Sinauer.
Hodgkin, A. L., & Huxley, A. F. (1952). A quantitative description of membrane current and its application to conduction and excitation in nerve. Journal of Physiology, 117(4), 500–544.
Holden, A. V., Muhamad, M. A., & Schierwagen, A. K. (1985). Repolarizing currents and periodic activity in nerve membrane. Journal of Theoretical Neurobiology, 4, 61–71.
Izhikevich, E. M. (2007). Dynamical systems in neuroscience: The geometry of excitability and bursting. Cambridge: MIT Press.
Jung, P., & Shuai, J. W. (2001). Optimal sizes of ion channel clusters. Europhysics Letters, 56, 29–35.
Keener, J. (2009). Invariant manifold reductions for Markovian ion channel dynamics. Journal of Mathematical Biology, 58(3), 447–457.
Kepler, T. B., Abbott, L. F., & Marder, E. (1992). Reduction of conductance-based neuron models. Biological Cybernetics, 66, 381–387.
Mastny, E. A., Haseltine, E. L., & Rawlings, J. B. (2007). Two classes of quasi-steady-state model reductions for stochastic kinetics. The Journal of Chemical Physics, 127, 094106.
Pakdaman, K., Tanabe, S., & Shimokawa, T. (2001). Coherence resonance and discharge time reliability in neurons and neuronal models. Neural Networks, 14(6–7), 895–905.
Pakdaman, K., Thieullen, M., & Wainrib, G. (2010). Fluid limit theorems for stochastic hybrid systems with application to neuron models. Advances in Applied Probability, 42(3), 761–794.
Rinzel, J. (1985). Excitation dynamics: Insights from simplified membrane models. Federation Proceedings, 44, 2944–2946.
Rowat, P. (2007). Interspike interval statistics in the stochastic Hodgkin–Huxley model: Coexistence of gamma frequency bursts and highly irregular firing. Neural Computation, 19(5), 1215.
Rubin, J., & Wechselberger, M. (2007). Giant squid—hidden canard: The 3D geometry of the Hodgkin–Huxley model. Biological Cybernetics, 97, 5–32.
Schmid, G., Goychuk, I., & Hänggi, P. (2001). Stochastic resonance as a collective property of ion channel assemblies. Europhysics Letters, 56, 22–28.
Schmid, G., Goychuk, I., & Hänggi, P. (2003). Channel noise and synchronization in excitable membranes. Physica A: Statistical Mechanics and its Applications, 325(1–2), 165–175.
Schneidman, E., Freedman, B., & Segev, I. (1998). Ion channel stochasticity may be critical in determining the reliability and precision of spike timing. Neural Computation, 10, 1679–1703.
Shuai, J. W., & Jung, P. (2003). Optimal ion channel clustering for intracellular calcium signaling. Proceedings of the National Academy of Sciences, 100(2), 506–512.
Shuai, J. W., & Jung, P. (2005). Entropically enhanced excitability in small systems. Physical Review Letters, 95(11), 114501.
Skaugen, E., & Walloe, L. (1979). Firing behavior in a stochastic nerve membrane model based upon the Hodgkin–Huxley equations. Acta Physiologica Scandinavica, 107(4), 343–363.
Steinmetz, P. N., Manwani, A., Koch, C., London, M., & Segev, I. (2000). Subthreshold voltage noise due to channel fluctuations in active neuronal membranes. Journal of Computational Neuroscience, 9(16), 133–148.
Suckley, R., & Biktashev, V. N. (2003). Comparison of asymptotics of heart and nerve excitability. Physical Review E, 68, 011902, 1–15.
Takahata, T., Tanabe, S., & Pakdaman, K. (2002). White noise stimulation of the Hodgkin–Huxley model. Biological Cybernetics, 86, 403–417.
Tanabe, S., & Pakdaman, K. (2001). Noise-induced transition in excitable neuron models. Biological Cybernetics, 85, 269–280.
Tanabe, S., & Pakdaman, K. (2001). Noise-enhanced neuronal reliability. Physical Review E, 64, 041904.
Tanabe, S., & Pakdaman, K. (2001). Dynamics of moments of FitzHugh–Nagumo neuronal models and stochastic bifurcations. Physical Review E, 63, 031911.
Tanabe, S., Sato, S., & Pakdaman, K. (1999). Response of an ensemble of noisy neuron models to a single input. Physical Review E, 60, 7235–7238.
Tateno, T., & Pakdaman, K. (2004). Random dynamics of the Morris–Lecar neural model. Chaos, 14, 511–530.
Wainrib, G., Thieullen, M., & Pakdaman, K. (2010). Intrinsic variability of latency to first spike. Biological Cybernetics, 103(1), 43–56.
White, J. A., Rubinstein, J. T., & Kay, A. R. (2000). Channel noise in neurons. Trends in Neurosciences, 23(3), 131–137.
Yin, G. G., & Zhang, Q. (1998). Continuous-time Markov chains and applications: A singular perturbation approach. New York: Springer.
Action Editor: Alain Destexhe
This work has been supported by the Agence Nationale de la Recherche through the ANR project MANDy “Mathematical Analysis of Neuronal Dynamics” ANR-09-BLAN-0008.
Appendices
Appendix A: Auxiliary functions
The transition rates for the HH model are given by:
Accordingly, \(\tau_x(V)\) and \(x_{\infty}(V)\), for x = m, n, h, are given by \(\tau_x(V) = 1/(\alpha_x(V) + \beta_x(V))\) and \(x_{\infty}(V) = \alpha_x(V)/(\alpha_x(V) + \beta_x(V))\).
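For reference, the relations above can be sketched numerically. The specific rate expressions \(\alpha_x\), \(\beta_x\) below are an assumption (the commonly used millivolt-convention HH rates), since the paper's own expressions are not reproduced in this excerpt; only the \(\tau_x\) and \(x_{\infty}\) relations come from the text.

```python
import math

# Assumed modern-convention HH transition rates (ms^-1, V in mV); the
# paper's own expressions are not shown above, so treat these as illustrative.
# Note: alpha_m has a removable singularity at V = -40 mV (avoid that point).
def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - math.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * math.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * math.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + math.exp(-(V + 35.0) / 10.0))
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - math.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * math.exp(-(V + 65.0) / 80.0)

def tau_x(alpha, beta, V):
    """Time constant tau_x(V) = 1 / (alpha_x(V) + beta_x(V)), from the text."""
    return 1.0 / (alpha(V) + beta(V))

def x_inf(alpha, beta, V):
    """Steady state x_inf(V) = alpha_x(V) / (alpha_x(V) + beta_x(V))."""
    return alpha(V) / (alpha(V) + beta(V))
```

Note the identity \(x_{\infty}(V) = \alpha_x(V)\,\tau_x(V)\), which follows directly from the two definitions and is used again in Appendix B.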
Appendix B: Diffusion term in the two-state model
The purpose of this appendix is to give more details about the computation of the diffusion term in the two-state model. The computation is more involved than in the multistate model because of the nonlinearity \(m^3\). Indeed, in the multistate case, the vector field \(F_{MS}\) is linear with respect to \(u_{Na}\), so one only needs to compute the corrective diffusion for a single channel and then divide it by \(\sqrt{N_{Na}}\), since the average of \(N_{Na}\) independent Brownian motions is equal in law to a single Brownian motion divided by \(\sqrt{N_{Na}}\). In the nonlinear case (two-state model), one needs to carry out the computation directly on the process describing the empirical measure:

1.
First, one has to compute the law at time t of the empirical measure with V fixed, that is, the probability \(P_N^V(j,j_0,t)\) of having j open channels at time t starting from a population with \(j_0\) open channels.
Denote \(N = N_m\). Starting from a proportion of open m-gates \(u_N(0)=\frac{j_0}{N}\) at t = 0, with \(0 \leq j_0 \leq N\), one can show that the proportion of open gates \(u_N(t)\) at time t follows the convolution of two binomial distributions:
$$\begin{array}{rll} && P_N^V(j,j_0,t)\\ &&:=\mathbf{P}\left[u_N(t)=\frac{j}{N}\,\Big|\,u_N(0)=\frac{j_0}{N}\right]\\ &&=\displaystyle{\sum_{x=\max(0,j-N+j_0)}^{\min(j,j_0)}}\mu_{x,j_0}\left(p_1(t)\right)\mu_{j-x,N-j_0}\left(p_0(t)\right) \end{array}$$with \(\mu_{i,j}(p)=C_{j}^i p^{i}(1-p)^{j-i}\) and where \(p_0(t)\) and \(p_1(t)\) are the solutions of
$$ \dot{y}=(1-y)\,\alpha_m(V)-y\,\beta_m(V) $$(22) with respective initial conditions \(p_0(0) = 0\) and \(p_1(0) = 1\). Defining
$$ \tau_m(V):= \frac{1}{\alpha_m(V)+\beta_m(V)} \mbox{ and } m_{\infty}(V):=\alpha_m(V)\,\tau_m(V) $$(23) the variation of constants gives \(p_0(t) = m_{\infty}(V)\left(1-e^{-t/\tau_m(V)}\right)\) and \(p_1(t)=e^{-t/\tau_m(V)}+p_0(t)\). If \(j_0 = 0\), one retrieves the classic binomial distribution with parameters \((N,p_0(t))\): \(P_N^V(j,0,t)=C_N^j p_0(t)^j (1-p_0(t))^{N-j}\). When t → ∞, one checks that the quasi-stationary distribution
$$ \rho_N^V(j)=\displaystyle{\lim\limits_{t\to \infty}} P_N^V(j,j_0,t) $$does not depend on j _{0} and is given by:
$$ \rho_N^V(j)= \mu_{j,N}(m_{\infty}(V)) = C_N^j\, m_{\infty}(V)^j\, (1-m_{\infty}(V))^{N-j} $$
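The law above is straightforward to evaluate numerically. The sketch below implements \(P_N^V(j,j_0,t)\) as the convolution of the two binomial laws, pairing \(p_1(t)\) with the \(j_0\) initially open gates and \(p_0(t)\) with the \(N-j_0\) initially closed ones; the rate constants and population size are illustrative, not taken from the paper.

```python
import math

def binom_pmf(i, j, p):
    """mu_{i,j}(p) = C(j, i) p^i (1-p)^(j-i), the binomial weight of the text."""
    return math.comb(j, i) * p**i * (1.0 - p)**(j - i)

def P_N(j, j0, t, N, alpha, beta):
    """P_N^V(j, j0, t): probability of j open gates at time t given j0 open
    gates at t = 0, as a convolution of two binomial laws with parameters
    p1(t) (initially open gates) and p0(t) (initially closed gates)."""
    tau = 1.0 / (alpha + beta)          # tau_m(V), Eq. (23)
    m_inf = alpha * tau                 # m_inf(V), Eq. (23)
    p0 = m_inf * (1.0 - math.exp(-t / tau))   # p0(0) = 0
    p1 = math.exp(-t / tau) + p0              # p1(0) = 1
    return sum(binom_pmf(x, j0, p1) * binom_pmf(j - x, N - j0, p0)
               for x in range(max(0, j - N + j0), min(j, j0) + 1))
```

One can check that the law is normalized for every t, equals the Kronecker delta at t = 0, and converges to the binomial quasi-stationary distribution \(\rho_N^V\) as t → ∞.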
2.
Then, integrating with respect to time the difference \(P_N^V(j,i,t)-\rho_N^V(j)\), we can express the quantity R(i,j) required to compute the diffusion term. Introducing \(K = (1 - m_{\infty}(V))/m_{\infty}(V)\) and making the change of variable \(z=e^{-t/\tau_m(V)}\), the computation boils down to integrals of the form:
$$ \int_0^1 (1 - z)^a (1 + Kz)^b (1 + z/K)^c \, dz $$Using a formal computation software, R(i, j) can be expressed as a sum involving Appell \(F_1\) functions, defined by
$$ F_1(a,b_1,b_2,c;x,y) = \sum\limits_{m,n=0}^\infty \frac{(a)_{m+n} (b_1)_m (b_2)_n} {(c)_{m+n} \,m! \,n!} \,x^m y^n $$with \((q)_n = q(q+1)\cdots(q+n-1)\). Then we can write:
$$\begin{array}{rll} R(i,j)&=&\tau_m\sum\limits_{x=\max(0,j-N+i)}^{\min(i,j)} C_i^x C_{N-i}^{j-x}\,Y_{\infty}^{(N)}(x,i,j)\\ &&\times \left(H_K^{(N)}(x,i,j)\,F_1^{(N)}(x,i,j)-1\right) \end{array}$$with:
$$\begin{array}{rll} Y_{\infty}^{(N)}(x,i,j)&:=& m_{\infty}^x\, m_{\infty}^{j-x}\, (1-m_{\infty})^{i-x}\\ &&\times \ (1-m_{\infty})^{N-i-j+x}\\ H_{K}^{(N)}(x,i,j)&:=&\frac{K^{x-i}(1+K)^{i+j-2x}}{1+2x+N-i-j}\\ F_1^{(N)}(x,i,j)&:=& F_1\left(w,x-i,x-j,w+1;\right.\\ &&\qquad\left.\frac{1}{1+K},\frac{K}{1+K}\right)\\ w&:=&1+2x+N-i-j \end{array}$$ 
3.
The complete expression for the variance is then given by plugging the above expression for R(i, j) into formula (20).
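As a sanity check on the closed-form expression, R(i, j) can also be obtained by direct numerical time-integration of \(P_N^V(j,i,t)-\rho_N^V(j)\). Since \(P_N^V(\cdot,i,t)\) and \(\rho_N^V\) are both probability distributions, the rows of R must sum to zero; the sketch below verifies this for a hypothetical small channel population (rate values and population size are illustrative, not taken from the paper).

```python
import math

def binom_pmf(i, j, p):
    """mu_{i,j}(p) = C(j, i) p^i (1-p)^(j-i)."""
    return math.comb(j, i) * p**i * (1.0 - p)**(j - i)

def R_numeric(i, j, N, alpha, beta, T=20.0, steps=20000):
    """R(i, j) = int_0^T [P_N^V(j, i, t) - rho_N^V(j)] dt by midpoint rule;
    T is chosen much larger than tau_m so the neglected tail is negligible."""
    tau = 1.0 / (alpha + beta)
    m_inf = alpha * tau
    rho = binom_pmf(j, N, m_inf)        # quasi-stationary weight rho_N^V(j)
    h = T / steps
    s = 0.0
    for k in range(steps):
        t = (k + 0.5) * h
        p0 = m_inf * (1.0 - math.exp(-t / tau))
        p1 = math.exp(-t / tau) + p0
        P = sum(binom_pmf(x, i, p1) * binom_pmf(j - x, N - i, p0)
                for x in range(max(0, j - N + i), min(i, j) + 1))
        s += P - rho
    return s * h
```

A second check: starting from the all-open state i = N, \(P_N^V(N,N,t) = p_1(t)^N \geq m_{\infty}^N\) for all t, so R(N, N) must be strictly positive.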
Cite this article
Wainrib, G., Thieullen, M., & Pakdaman, K. Reduction of stochastic conductance-based neuron models with time-scales separation. J Comput Neurosci 32, 327–346 (2012). https://doi.org/10.1007/s10827-011-0355-7