Abstract
Motivated by experimental observations, we investigate a variant of the cocktail party problem: the detection of a weak periodic stimulus in the presence of fluctuations and another periodic stimulus which is stronger than the periodic signal to be detected. Specifically, we study the response of a population of stochastic leaky integrate-and-fire (LIF) neurons to two periodic signals and focus in particular on the question of whether the presence of one of the stimuli can be detected from the population activity. As a detection criterion, we use a simple threshold-crossing of the population activity over a certain time window. We show by means of the receiver operating characteristics (ROC) that the detectability depends only weakly on the time window of observation but rather strongly on the stimulus amplitude. Counterintuitively, the detection of the weak periodic signal can be facilitated by the presence of a strong periodic input current, depending on the frequencies of the two signals and on the dynamical regime in which the neurons operate. Besides numerical simulations of the model, we present an analytical approximation for the ROC curve that is based on the weakly nonlinear response theory for a stochastic LIF neuron.
1 Introduction
The detection of a weak signal in the presence of a much stronger signal is an interesting problem that arises in several natural situations for living organisms, most prominently in auditory perception where it is known as the cocktail party problem [1]. Detection is complicated by nonlinearity in the sensory apparatus (see e.g. [2]) and by different noise sources [3], so studying very simple models can help us to better understand this problem.
The basic problem has been thoroughly investigated in the stochastic processes community with respect to beneficial effects that noise can have in nonlinear dynamical systems. It is well-known that the detection and transmission of a weak signal can be improved by a finite amount of fluctuations by the mechanism of stochastic resonance as reviewed and critically discussed in Refs. [4,5,6,7,8]. Different stochastic resonance effects have been shown to occur in theoretical models like point processes [9, 10], simple threshold systems [11,12,13,14,15], bistable systems [16,17,18,19,20,21,22,23,24], and excitable systems [25,26,27,28,29,30]. This theoretical research was stimulated by experimental demonstrations of stochastic resonance in electronic circuits [31], laser systems [32], ion channels in biological membranes [33], neural receptors [34,35,36], and spatially extended neural systems [37, 38], to name but a few examples. More recent contributions in the context of neuroscience discuss in particular the role of the intrinsic noise in neural systems [39,40,41,42,43] and its interplay with abundant mechanisms of adaptation [44,45,46]. Generally, how neural populations of spiking neurons respond to time-dependent stimuli has been addressed with different theoretical approaches [43, 47,48,49,50,51,52].
An intriguing example of a signal detection task can be found in the courtship behavior of weakly electric fish. It has been observed that a resident male is able to detect a distant male intruder while courting a female [53]. This represents an instance of the aforementioned cocktail party problem: a comparatively weak low-frequency signal (the distant intruder) has to be detected in the presence of another time-dependent input, a strong high-frequency stimulus coming from the nearby female. The detection of this faint signal is a fascinating problem that involves many levels of neural processing and is additionally complicated by the movements of the participating fish [53], by the change of their frequencies (known as the jamming avoidance response) [54], and by different adaptation mechanisms (starting with a pronounced spike-frequency adaptation in the sensory receptor cells, the P-units) [55].
Here, we take the specific experimental observation of an intruder detection as an inspiration to study a generic detection problem: how can one periodic signal be detected in the presence of another one by a population of stochastically spiking neurons? The generic scheme may also be applicable to other sensory modalities in which concurrent periodic signals are present (the sense of hearing would be an obvious case). Our interest lies in the first stage of signal processing (neglecting possible filters or optimized detectors on higher levels), where we analyze how different time-dependent stimuli are reflected in the output of uncoupled noisy cells. We would like to stress that our simple detection scheme is not supposed to account for the observed detection performance in weakly electric fish. (None of the above-mentioned complications is taken into account.) However, to the best of our knowledge, even this simple model, which allows for some analytical insights, has not been studied yet. The key questions that we will be interested in here are: Under which conditions is the periodic signal easier to detect—in the presence or the absence of the second (strong periodic) background stimulus? Which role is played by the stochastic firing regime of the neurons in the population (mean- or fluctuation-driven regime) and by the response regime (linear or weakly nonlinear)? Are there specific frequency combinations of the two stimuli that make the weaker signal better detectable?
Assuming a simple detection scheme based on spike counts, we want to explore the roles of linear and nonlinear responses in the stochastic dynamics of the single cell. We hypothesize that the weakly nonlinear response may play a beneficial role in the detection task and that the presence of the strong periodic background stimulus (i.e., the female courtship input in the above example of weakly electric fish) is not necessarily detrimental to the detection but, on the contrary, may facilitate it. We would like to emphasize that this beneficial effect is unrelated to the vibrational resonance effect emerging in systems driven by multiple periodic signals (see discussion below).
Our paper is organized as follows: First, we introduce the model and sketch how we perform the measurement process and how the detector functions. We then present our analytical approximation of the receiver operating characteristic (ROC), which quantifies the detection performance. We investigate the influence of variations of the simulation and detection parameters on the detectability of the periodic signal and focus in particular on the detection time window, the strength of the signal amplitude, and the frequency combinations. We consider two operating regimes of the neuron model, the mean-driven and the excitable regime, and ask in all situations whether the presence of a strong periodic background stimulus can be beneficial for the detection task.
2 Model and methods
2.1 Population model and single-neuron model
We consider a population of spiking neurons that are not connected but driven by a common periodic signal and individual noise. In the example of weakly electric fish, the neurons would correspond to the P-units in the electric fish and the periodic input signal would contain components stemming from the nearby female fish and from the intruder to be detected. The scenario of a population of uncoupled noisy neurons that transmit information of periodic input stimuli is, however, more general and for instance also encountered in the auditory periphery.
The dynamics of the i-th spiking neuron is given by a leaky integrate-and-fire (LIF) model driven by an external signal s(t):
$$\dot{v}_{i}(t) = -v_{i}(t) + \mu + \varepsilon s(t) + \sqrt{2D}\,\xi _{i}(t). \qquad (1)$$
Here, \(v_{i}(t)\) is the membrane voltage, \(\mu \) is the mean input current, \(\xi _{i}(t)\) is white Gaussian noise with zero mean \(\left\langle \xi _{i}(t) \right\rangle = 0\) and correlation function \(\left\langle \xi _{i}(t)\xi _{i}(t') \right\rangle = \delta (t-t')\), and D is the noise intensity. Whenever \(v_{i}(t)\) hits the threshold \(v_{T}\), a spike is registered for that time and \(v_{i}(t)\) is reset to \(v_{R}\). In our non-dimensional model, time is measured in units of the membrane time constant, the voltage in multiples of the threshold-reset difference, and we set \(v_{R} = 0\) and \(v_{T} = 1\) (cf. [56]). In our setup, the sensory stimulus, common to all \(N_{\textrm{pop}}\) units in the population, reads
$$s(t) = a_{b} \cos (\omega _{b} t + \varphi _{b}) + a_{s} \cos (\omega _{s} t + \varphi _{s}). \qquad (2)$$
It is given by the sum of two cosine functions with different frequencies \(\omega _{s,b} = 2\pi f_{s,b}\) (we will use both the regular frequencies \(f_{s,b}\) and circular frequencies \(\omega _{s,b}\)), relative amplitudes \(a_{s,b}\) and phase offsets \(\varphi _{s,b}\); the total signal s(t) enters the dynamics scaled by a global amplitude \(\varepsilon \). The two terms represent the total stimulus, consisting of a strong background stimulus \(a_{b} \cos (\omega _{b} t + \varphi _{b})\) and the weak signal \(a_{s} \cos (\omega _{s} t + \varphi _{s})\), the presence of which has to be detected from the output of the population. We will consider situations in which the weak signal is absent (\(a_{s} = 0\)) or present (\(a_{s} > 0\)) and will ask how the presence of this signal can be detected. We will also inspect how the detectability of the signal depends on the presence (\(a_{b} > 0\)) or absence (\(a_{b} = 0\)) of the background periodic stimulus.
For the numerical simulations of Eq. (1), we use the Euler–Maruyama method, operating in discrete time steps \(t_{\ell } = t_{0} + \ell \Delta t\):
$$v_{i,\ell +1} = v_{i,\ell } + \Delta t \left[ -v_{i,\ell } + \mu + \varepsilon s(t_{\ell }) \right] + \sqrt{2D\Delta t}\,\vartheta _{\ell }. \qquad (3)$$
Here, \(\vartheta _{\ell }\) are independent Gaussian numbers drawn from a normal distribution with zero mean \(\langle \vartheta _{\ell }\rangle = 0\) and unit variance \(\left\langle \vartheta _{\ell } \vartheta _{k} \right\rangle = \delta _{\ell k}\). For all numerical examples in this paper, we use an integration time step of \(\Delta t = 10^{-3}\).
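As a concrete illustration, the Euler–Maruyama update above can be sketched in a few lines of Python. The function name and the default frequencies and amplitudes are our own illustrative choices; the mean input and noise intensity follow the mean-driven example discussed later, and a smaller population than \(N_{\textrm{pop}}=10^{3}\) is used for brevity:

```python
import numpy as np

def simulate_lif_population(n_pop=100, t_max=50.0, dt=1e-3, mu=1.1, d_noise=0.001,
                            eps=0.05, a_s=0.2, a_b=1.0, f_s=0.2, f_b=0.6,
                            v_thr=1.0, v_res=0.0, seed=0):
    """Euler-Maruyama integration of a population of uncoupled LIF neurons,
    all driven by the common two-cosine stimulus s(t) plus individual white
    noise. Returns a list of spike-time lists, one per neuron."""
    rng = np.random.default_rng(seed)
    n_steps = int(round(t_max / dt))
    v = np.full(n_pop, v_res)          # start all neurons at the reset voltage
    spikes = [[] for _ in range(n_pop)]
    for ell in range(n_steps):
        t = ell * dt
        s = a_s * np.cos(2 * np.pi * f_s * t) + a_b * np.cos(2 * np.pi * f_b * t)
        # deterministic drift plus scaled Gaussian increment
        v += dt * (-v + mu + eps * s) + np.sqrt(2 * d_noise * dt) * rng.standard_normal(n_pop)
        fired = v >= v_thr             # threshold crossing: register spike and reset
        for i in np.flatnonzero(fired):
            spikes[i].append(t)
        v[fired] = v_res
    return spikes
```

With the default \(\mu = 1.1 > v_{T}\), the population operates in the mean-driven regime and fires regularly even for weak noise.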
We consider a neuron population of \(N_{\textrm{pop}}= 10^{3}\) in all simulations. We analyze the time-dependent count statistics of the population for the different signal combinations in order to test a specific idea of how the presence of the weak periodic signal may be detected (for details see below). To determine the time-dependent spike count N(t), we discretize the time axis into bins \(\Delta t_{\textrm{bin}}= 0.05\) (significantly larger than our integration time step) and count all the spikes fired by all neurons within this bin:
$$N(t_{k}) = \sum _{i=1}^{N_{\textrm{pop}}} \sum _{m} \mathbb {1}_{[t_{k}, t_{k}+\Delta t_{\textrm{bin}})}\!\left( t_{i,m}\right) , \qquad (4)$$
where \(t_{i,m}\) denotes the m-th spike time of the i-th neuron.
Dividing this by \(N_{\textrm{pop}}\Delta t_{\textrm{bin}}\) and performing an additional trial average yields an approximation of the instantaneous firing rate
$$r(t_{k}) \approx \frac{\left\langle N(t_{k}) \right\rangle _{\xi _{i}}}{N_{\textrm{pop}}\Delta t_{\textrm{bin}}}, \qquad (5)$$
where \(\left\langle \cdot \right\rangle _{\xi _{i}}\) indicates an ensemble average over realizations of the intrinsic noise \(\xi _{i}(t)\) (the common signal is always the same in all these realizations).
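In code, the binning and the rate estimate can be sketched as follows (function names are ours; the default bin size matches \(\Delta t_{\textrm{bin}}= 0.05\) from the text):

```python
import numpy as np

def binned_population_count(spikes, t_max, dt_bin=0.05):
    """Population spike count N(t_k): all spikes of all neurons falling into
    each bin [t_k, t_k + dt_bin) are summed."""
    all_times = (np.concatenate([np.asarray(s, dtype=float) for s in spikes])
                 if spikes else np.array([]))
    n_bins = int(np.ceil(t_max / dt_bin))
    counts, _ = np.histogram(all_times, bins=n_bins, range=(0.0, n_bins * dt_bin))
    return counts

def firing_rate_estimate(counts, n_pop, dt_bin=0.05):
    """Single-trial estimate of the instantaneous firing rate,
    N(t_k) / (n_pop * dt_bin); an additional average over noise realizations
    would smooth this single-trial estimate."""
    return counts / (n_pop * dt_bin)
```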
2.2 Detection process
Our approach here is similar in spirit to the recent study by Bernardi and Lindner [57] of the detection of a static signal embedded in an Ornstein–Uhlenbeck process; however, there are also important differences (see below).
Specifically, we will analyze two time series for different experiments, i.e., spike count modulations N(t) in the presence or absence of the signal. We will carry out this numerical experiment for the two distinct situations when the strong periodic background signal is present (\(a_{b} = 1\) in Eq. (2)) or not (\(a_{b} = 0\) in Eq. (2)). We measure the time-dependent counts in a very long time window that is split into \(N_{T}\) smaller detection windows \(T_{j}\) of length \(T=K\Delta t_{\textrm{bin}}\) and short pauses of length \(\Delta t_{\text {off}}\) (cf. Fig. 1). The \(N_{T}\) time windows serve as trials—this is somewhat different to the procedure in [57], where trials result from the repetition of the same experiment. At the same time, the averaging over subsequent time windows implies an automatic averaging over the initial phases of the periodic signals (as long as the time window is not a multiple of one of the driving signals' periods, a non-generic case that we exclude in the following by means of \(\Delta t_{\text {off}}\)).
We assume that the detection of an event takes place whenever the spike count of the population crosses a threshold \(\theta \) (green dashed horizontal line in Fig. 1). In two distinct numerical simulations of our population model, the count in the presence (blue) and in the absence of the signal (red) is measured, respectively. When the blue time series crosses \(\theta \) at least once within the corresponding time window, a correct detection is registered for that trial. In analogy to this, a false-positive event is recorded when the red time series exceeds the threshold at least once. The correct detection (CD) and false-positive (FP) rates are then obtained by averaging over all \(N_{T}\) trials. Varying the threshold yields the two rates as functions of \(\theta \).
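A minimal Python sketch of this threshold-crossing detector (names are our own; the pause between consecutive windows is specified in bins, and the two count traces would come from simulations with and without the signal):

```python
import numpy as np

def detection_rate(counts, theta, bins_per_window, bins_off=1):
    """Fraction of detection windows in which the binned population count
    exceeds theta at least once; windows of bins_per_window bins are
    separated by pauses of bins_off bins."""
    counts = np.asarray(counts)
    stride = bins_per_window + bins_off
    n_windows = len(counts) // stride
    hits = 0
    for j in range(n_windows):
        window = counts[j * stride : j * stride + bins_per_window]
        if np.any(window > theta):
            hits += 1
    return hits / n_windows

def roc_curve(counts_signal, counts_no_signal, thetas, bins_per_window, bins_off=1):
    """CD rate (signal present) vs FP rate (signal absent) over a range of thresholds."""
    cd = np.array([detection_rate(counts_signal, th, bins_per_window, bins_off) for th in thetas])
    fp = np.array([detection_rate(counts_no_signal, th, bins_per_window, bins_off) for th in thetas])
    return fp, cd
```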
Figure 2a shows a few examples of the FP and CD rates vs. the threshold \(\theta \) for different values of the signal amplitude \(a_{s}\), together with the corresponding ROC curves in Fig. 2b. The latter are obtained by plotting the CD rate as a function of the FP rate. A low \(\theta \) corresponds to a very high detector sensitivity and is represented by the upper right corner in Fig. 2b; a high \(\theta \) corresponds to the lower left range. The example curves are computed for \(a_{b} = 1\), i.e. in the presence of the strong periodic background stimulus.
We first focus on the CD (blue) and FP (red) rates as functions of the threshold \(\theta \) in Fig. 2a. At small \(\theta \), the probability of the measured spike count N(t) to be above the threshold at least once within the time window is close to one and decreases with increasing \(\theta \) toward zero. For a weak signal, \(a_{s} = 0.20\), the CD and FP rates are very close to each other, implying that the detection of the signal is difficult. Boosting the signal amplitude leads to a higher spike count; thus, the CD rate starts to decrease only at higher threshold values, while the FP rate remains the same. As a consequence, the horizontal distance between the CD and FP rates increases with the signal amplitude, and detection becomes easier, as can be expected. The improved detectability for larger \(a_{s}\) is also apparent in the ROC curves in Fig. 2b. For a weak signal amplitude (\(a_{s} = 0.2\), green), the distance to the diagonal (gray solid line, representing chance level) is small. Enlarging the signal amplitude yields a growing distance from the diagonal. In the present examples, the ROC curve for the signal amplitude \(a_{s} = 0.8\) (pink) is already close to that of an ideal detector, i.e. a detector with a \(100\%\) CD rate at all FP rates.
2.3 Analytical approximation of the ROC
To derive an analytical description of the ROC curve, we make two assumptions for the count in a short time bin (\(\Delta t_{\textrm{bin}}\ll \langle I\rangle \), where I denotes the interspike interval of a single stochastic LIF neuron):
1. The spike count distribution follows a Poisson distribution \(P_{\langle N_{k}\rangle } (N)\);
2. spike counts in each time bin are independent.
The first assumption is well-justified for a sufficiently small time bin (see e.g. [58]). The second assumption is certainly an approximation, as the sum of independent non-Poissonian spike trains does not converge to a Poisson process [59]. Assuming Poisson count statistics has the great advantage that we only need to know the mean spike count \(\langle N_{k}\rangle \) to determine the distribution completely. The basic idea is identical to the procedure in the numerical case: we want to estimate the probability to be at least once above a certain threshold \(\theta \) within the detection window.
We first determine the probability for the spike count N in a time bin \(\Delta t_{\textrm{bin}}\) to be at most at the threshold \(\theta \). This is given by the sum over all possible values of \(N \in {\mathbb {N}}\) up to \(\lfloor \theta \rfloor \), the largest integer smaller than or equal to \(\theta \):
$$P_{\langle N_{k}\rangle }(N \le \theta ) = \sum _{N=0}^{\lfloor \theta \rfloor } \frac{\langle N_{k}\rangle ^{N}}{N!} e^{-\langle N_{k}\rangle } = \frac{\Gamma (\lfloor \theta \rfloor + 1, \langle N_{k}\rangle )}{\lfloor \theta \rfloor !} \approx \frac{\Gamma (\theta + 1, \langle N_{k}\rangle )}{\Gamma (\theta + 1)}. \qquad (6)$$
Here, we used the incomplete Gamma function \(\Gamma (a,x) = \int _{x}^{\infty } \text {d} t \ t^{a-1} e^{-t}\) [60] and furthermore provided in the last step an approximate expression that interpolates between integer values of \(\theta \) (at the latter, the two expressions coincide). In the following, we will use for simplicity only the latter expression.
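Numerically, both the exact and the interpolated expression are available through SciPy's regularized upper incomplete Gamma function, `gammaincc(a, x)` \(= \Gamma (a,x)/\Gamma (a)\); the function names below are ours:

```python
import numpy as np
from scipy.special import gammaincc

def poisson_below_threshold(lam, theta):
    """Exact P(N <= floor(theta)) for a Poisson count with mean lam."""
    return gammaincc(np.floor(theta) + 1.0, lam)

def poisson_below_threshold_interp(lam, theta):
    """Smooth interpolation between integer thresholds; coincides with the
    exact expression whenever theta is an integer."""
    return gammaincc(theta + 1.0, lam)
```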
To estimate the probability \(p(\theta , T)\) of not even once exceeding \(\theta \) in the j-th detection window \(T_{j}\), we multiply the probabilities Eq. (6) from all K bins, exploiting the assumption of statistical independence that holds true for a Poisson process:
$$p(\theta , T_{j}) = \prod _{k=1}^{K} \frac{\Gamma (\theta + 1, \langle N(t_{j,k})\rangle )}{\Gamma (\theta + 1)}. \qquad (7)$$
Here, we have used the population spike counts at times \(t_{j,k} = j(T + \Delta t_{\text {off}})+k\Delta t_{\textrm{bin}}\).
The FP \({\mathcal {X}}(\theta , 0, T)\) and CD \({\mathcal {Y}}(\theta , a_{s}, T)\) rates are then given by
$${\mathcal {X}}(\theta , 0, T) = 1 - \frac{1}{N_{T}} \sum _{j=1}^{N_{T}} \prod _{k=1}^{K} \frac{\Gamma (\theta + 1, \langle N(t_{j,k}; 0)\rangle )}{\Gamma (\theta + 1)}, \qquad (8)$$
$${\mathcal {Y}}(\theta , a_{s}, T) = 1 - \frac{1}{N_{T}} \sum _{j=1}^{N_{T}} \prod _{k=1}^{K} \frac{\Gamma (\theta + 1, \langle N(t_{j,k}; a_{s})\rangle )}{\Gamma (\theta + 1)}, \qquad (9)$$
where we indicated the explicit dependence of the spike count on the signal amplitude by a parametric argument. Furthermore, the above formulas also include the trial average over the detection windows \(T_{j}\) (sum over j).
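Given the time-dependent mean counts, these rates can be evaluated directly; a sketch (the function name and the matrix layout, one row per detection window, are our choices):

```python
import numpy as np
from scipy.special import gammaincc

def analytic_detection_rate(mean_counts, theta):
    """Probability to exceed theta at least once per window, averaged over
    windows. mean_counts has shape (n_windows, bins_per_window) and holds the
    mean Poisson count for each bin of each window."""
    # per-bin probability of staying at or below theta
    p_below = gammaincc(theta + 1.0, np.asarray(mean_counts))
    # independence across bins: product over each window
    p_never_above = np.prod(p_below, axis=1)
    # trial average over windows, then complement
    return 1.0 - np.mean(p_never_above)
```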
With the obtained expression, we can proceed in two different ways. Firstly, we can measure the time-dependent mean spike count in simulations and use these data in Eqs. (8) and (9). This will be referred to as a semi-analytical theory in the following, as it still requires some numerical simulations. Secondly, we can approximate the mean spike count using linear and nonlinear response theory for stochastic integrate-and-fire neurons that are driven by periodic signals [61, 62]. In the latter case, we will use Eq. (5) to estimate the mean spike count via
$$\langle N(t_{j,k})\rangle = N_{\textrm{pop}}\, \Delta t_{\textrm{bin}}\, r(t_{j,k}). \qquad (10)$$
The instantaneous firing rate in the weakly nonlinear regime (neglecting higher than second-order terms in \(\varepsilon \)) can be approximated by [62]
$$r(t) \approx r_{0} + r_{\textrm{LR}}(t; \varphi _{s}, \varphi _{b}) + r_{\textrm{HH}}(t; \varphi _{s}, \varphi _{b}) + r_{\textrm{MR}}(t; \varphi _{s}, \varphi _{b}). \qquad (11)$$
Here, we have included initial phases \(\varphi _{s,b}\) for both periodic signals. The indices in the above expression indicate distinct contributions to the response: the steady state with an index 0, the linear response (LR), the higher harmonics of the periodic driving (HH) and the mixed response (MR) that emerges because of the simultaneous presence of two signals (see [62] for further discussion). All the firing and response characteristics for the white-noise-driven LIF model, \(r_{0}, \chi _{1}(\omega ), \chi _{2}(\omega _{1},\omega _{2})\), are given in Appendix A.
The FP \({\mathcal {X}}_{\text {ana}}(\theta ,0,T)\) and CD \({\mathcal {Y}}_{\text {ana}}(\theta ,a_{s},T)\) rates for the analytical theory can then be expressed by
$${\mathcal {X}}_{\text {ana}}(\theta , 0, T) = 1 - \frac{1}{N_{T}} \sum _{j=1}^{N_{T}} \prod _{k=1}^{K} \frac{\Gamma \left( \theta + 1, N_{\textrm{pop}}\Delta t_{\textrm{bin}}\, r(t_{j,k}; 0)\right) }{\Gamma (\theta + 1)}, \qquad (12)$$
$${\mathcal {Y}}_{\text {ana}}(\theta , a_{s}, T) = 1 - \frac{1}{N_{T}} \sum _{j=1}^{N_{T}} \prod _{k=1}^{K} \frac{\Gamma \left( \theta + 1, N_{\textrm{pop}}\Delta t_{\textrm{bin}}\, r(t_{j,k}; a_{s})\right) }{\Gamma (\theta + 1)}. \qquad (13)$$
The FP rate in the absence of the strong periodic background stimulus (\(a_{b} = 0\)) can be simplified: in this case, the signal amplitude is \(a_{s} = 0\) as well, so the instantaneous firing rate in Eq. (11) reduces to \(r(t)\approx r_{0}\), and we obtain
$${\mathcal {X}}_{\text {ana}}(\theta , 0, T) = 1 - \left[ \frac{\Gamma \left( \theta + 1, N_{\textrm{pop}}\Delta t_{\textrm{bin}}\, r_{0}\right) }{\Gamma (\theta + 1)}\right] ^{K}. \qquad (14)$$
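For a constant firing rate, all bins share the same mean count, so the window product collapses to a single power; a minimal sketch (names are ours):

```python
from scipy.special import gammaincc

def fp_rate_constant_rate(theta, r0, n_pop, dt_bin, k_bins):
    """FP rate when the rate stays at its steady-state value r0 (no background
    stimulus, no signal): all k_bins bins share the mean count n_pop*dt_bin*r0."""
    mean_count = n_pop * dt_bin * r0
    return 1.0 - gammaincc(theta + 1.0, mean_count) ** k_bins
```

As expected, the FP rate decreases monotonically as the threshold is raised.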
3 Results
In the following, we investigate the signal detection task in two very different parameter regimes of the LIF neurons: neurons of the population are either mean-driven (\(\mu = 1.1 > v_{T}\), \(D=0.001\)) or in an excitable regime (\(\mu = 0.9 < v_{T}\), \(D = 0.005\)). We will see that the weakly nonlinear response will have a very different impact in the two regimes. We consider variations of the detection and signal parameters for both regimes. We are particularly interested in how the presence of the strong background stimulus affects the detectability of the weak signal. Simulations were performed for a total number of \(N_{\textrm{pop}}= 10^{3}\) LIF neurons, and to create the ROC curves, \(N_{T} = 10^{3}\) detection windows have been used.
3.1 Change of the detection time window
Firstly, we would like to study the impact of the detection time window T in the excitable regime. Figure 3 shows the ROC curves for a relatively weak signal (\(a_{s} = 0.2\)), in the presence (Fig. 3a, \(a_{b} = 1\)) and absence (Fig. 3b, \(a_{b} = 0\)) of the strong periodic background stimulus. Figure 3c, d shows the difference between the CD and FP rates as a function of the FP rate, which is referred to as the effect size [63, 64]. The different values of T are given in multiples of the period of the signal \(T_{s}\) and are represented by different symbols and colors as indicated in the legend. The solid lines represent the analytical theory for the ROC curves (Eq. (13) plotted vs Eq. (12)). Three observations can be made. First of all, for the parameters chosen, the theory is in good agreement with the simulation results. Secondly, the effect of increasing the time window is very weak: a tenfold increase in the detection time window does not even lead to a doubling of the effect size, i.e. of the detectability of the signal. Thirdly, there is not much of a difference in the detectability introduced by the presence of the background stimulus.
Next, we look at the same statistics for a larger signal amplitude (cf. Fig. 4); the general effect of \(a_{s}\) will be inspected in the next subsection. The effect size is generally larger than before, i.e. the ROC curves are further away from the diagonal (see panels (a), (b)). The full analytical theory still works and the effect of enlarging the time window is still weak. If we compare the detection in the presence and in the absence of the background stimulus, we find that its presence can diminish the detection performance slightly. In panel (e) we show the FP and CD rates as functions of the threshold in the absence and the presence of the background stimulus for the detection window of \(T=5T_{s}\); the main effect of the background stimulus is to shift the rates to higher thresholds, which has no effect on the ROC curves.
We now turn to the mean-driven regime. The ROC curves and corresponding effect sizes in the presence and absence of the strong background stimulus for a signal amplitude of \(a_{s} = 0.2\) are shown in Fig. 5. In marked contrast to the excitable case, we find that the detector benefits from the presence of the background stimulus; there is a significant boost in the detectability of the weak periodic signal going from panel (d) to panel (c). Indeed, without the background stimulus, the ROC curves are practically on the diagonal and detection is nearly impossible.
Like in the excitable regime, the effect of enlarging the detection time window is weak. Furthermore, we notice that the analytical theory (solid lines) differs somewhat from the simulation results. This raises the question of which part of our approximate calculation is responsible for this deviation. In order to answer it, we have determined, by many simulations with the same periodic stimulus, the time-dependent mean spike count that is the crucial input to the semi-analytical theory, Eqs. (8) and (9). If we use these simulated mean count data (dashed lines in Fig. 5), the agreement is again very good. This means that the noticeable deviations of the full theory are due to the limitations of the weakly nonlinear response theory: taking only the terms up to second order in \(\varepsilon \) does not reproduce the firing rate correctly. To obtain a better approximation of the firing rate and therefore also a good agreement of the analytical theory with the simulation results, one has to take higher-order terms into account (see [65] for such a computation).
Next, we look at the effect of using a larger signal amplitude in the mean-driven regime (cf. Fig. 6). As in the excitable regime, the effect size is generally larger than before. We again find that the analytical theory differs noticeably from the simulation results but that the use of the numerically determined mean spike counts restores a good agreement. We also note that in the absence of the background stimulus, our full analytical theory works well.
Additionally, we show in Fig. 6e the FP and CD rates as functions of the threshold. In contrast to the excitable case, in the presence of the background stimulus (\(a_{b} > 0\)), the two rates are clearly further apart which improves detectability.
3.2 Dependence on the signal amplitude
The detection problem is, obviously, particularly interesting for weak to moderate signal amplitudes, which we now explore in more detail. We again start with the excitable regime and consider in Fig. 7 the ROC curves for a range of amplitudes.
Referring to our three observations of the first subsection, we firstly notice that the full analytical theory is still in good agreement with our simulation results for all values of \(a_{s}\) in the presence and absence of the background stimulus. Secondly, increasing the signal amplitude has a strong effect on the detectability, which is not surprising. But in contrast to enlarging the detection time window, a doubling of the value of \(a_{s}\) may also lead to a doubling in the effect size in both cases (see panels (c) and (d)). Thirdly, for weak signal amplitudes (\(a_{s} \le 0.2\)) we observe again not much of a difference induced by the presence of the background stimulus. For relatively strong values of \(a_{s}\), i.e. \(a_{s} \ge 0.5\), the detectability of the signal seems to be better in the absence of the background stimulus, indicated by an up to \(10\%\) higher effect size.
In the mean-driven regime, cf. Fig. 8, we have a clear beneficial role of the background stimulus in the detection task. The deviations of the analytical theory from our simulation results in the presence of the background stimulus are small, and using the numerically determined mean spike count (semi-analytical approach) leads again to an excellent agreement.
How does the beneficial effect of the background stimulus come about? To answer this question, we take a closer look at the firing rate in Eq. (11), especially at the influence of the response terms of first (linear response) and second order (higher harmonics and mixed response). Figure 9 shows the numerically obtained firing rate with and without the signal, with and without the background stimulus, and in both the excitable and the mean-driven regime. In all panels, we also plot the firing-rate response up to second order (solid black lines) as well as the response up to linear order (dashed green lines).
We find, at least for the chosen frequencies, that the nonlinear response leads to a modestly increased rate modulation in the excitable regime if both background stimulus and signal are present (see panel (a)), whereas the nonlinear response has little effect on the rate in the absence of the signal. In marked contrast to this, in the mean-driven regime, we observe a strong boost of the firing rate modulation by the nonlinear response (see the pronounced difference between the black and green lines in panel (e)). A stronger rate modulation in the presence of the signal and the background stimulus plausibly enhances the detectability of the signal. Closer inspection of the single contributions of the nonlinear response reveals that the mixed response (MR term in Eq. (11)) is responsible for the beneficial boost of the rate modulation.
3.3 Change of the frequencies of the periodic signals
So far, we have used a fixed combination of the frequencies of the two periodic signals. Now, we investigate how the detectability of the signal depends on these frequencies.
For a better visualization, we use the area under the curve (AUC) as a measure of detection performance: for each frequency combination, we measure the area enclosed by the ROC curve and the diagonal (cf. Fig. 10). If, for instance, the ROC curve falls on the diagonal, the performance measure is zero and detection is impossible. Systematic deviations from the diagonal indicate detectability beyond the chance level.
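From sampled ROC points, this measure can be computed with the trapezoid rule; a sketch (the endpoint padding assumes the ROC curve runs from (0, 0) to (1, 1); the function name is ours):

```python
import numpy as np

def auc_above_diagonal(fp, cd):
    """Area enclosed between the ROC curve and the diagonal: zero at chance
    level, 0.5 for an ideal detector (CD rate of 1 at every FP rate)."""
    order = np.argsort(fp)
    fp_s = np.concatenate(([0.0], np.asarray(fp, dtype=float)[order], [1.0]))
    cd_s = np.concatenate(([0.0], np.asarray(cd, dtype=float)[order], [1.0]))
    # trapezoid rule for the area under the curve, minus the triangle under the diagonal
    area = np.sum((fp_s[1:] - fp_s[:-1]) * (cd_s[1:] + cd_s[:-1]) / 2.0)
    return float(area - 0.5)
```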
The AUC measure allows us to identify conditions under which the presence of the background stimulus is beneficial for the detection of the weak periodic signal. We investigate this first for a comparatively weak signal (\(a_{s} = 0.2\)) and in the excitable regime, see Fig. 11a–c. We subtract the performance in the absence of the background stimulus (panel (b)) from the performance in its presence (panel (a)); the difference is shown in panel (c). First of all, the detectability seems to be best for a slow signal (see reddish part on the left in panel (a)); generally, the detectability is not very high in the excitable case for all frequency combinations. It is somewhat better if the background stimulus is absent (panel (b)), and the difference between the two cases (panel (c)) is either close to zero or slightly negative (vertical stripes and scattered blue and red points in panel (c) are measurement artifacts). This coincides with the special case investigated above: for neurons in the excitable regime, the presence of a strong background stimulus is detrimental for the detection of a faint signal.
In the mean-driven regime (cf. Fig. 11d–f), the situation becomes more interesting. Turning first to the case where the background stimulus is absent (see panel (e)), we observe a nearly perfect detection of the signal for a frequency of \(f_{s} = 0.42\) and find a good performance for close-by frequencies (note the red vertical stripe around this frequency). The spontaneous firing rate of the neuron population with the used parameters (mean input \(\mu \) and noise intensity D, see Eq. (A2)) is \(r_{0} \approx 0.42\), so the detectability is greatly improved when the frequency of the driving signal matches or is close to the spontaneous firing rate of the LIF neurons. Furthermore, the detection performance also increases for \(f_{s} \approx 2r_{0}\), but the effect is much weaker (note the fainter red vertical stripe around \(f_{s} = 2r_{0}\)).
In the presence of the background stimulus (panel (d)), we also have an excellent detection of the weak signal for a signal frequency of \(f_{s} \approx r_{0}\) and for almost all frequencies of the background stimulus (again, there is a pronounced vertical stripe around this frequency). However, much more striking is the improved detectability on certain diagonal lines on which either \(f_{s} + f_{b} \approx r_{0}\) or \(\left| f_{s} - f_{b}\right| \approx r_{0}\). These contributions arise due to the weakly nonlinear response of the neurons in the considered dynamical regime. The sum and difference of the two frequencies appear as contributions in the mixed response in Eq. (11) and lead to the beneficial role of the background stimulus in the detection of the faint signal, which becomes clear when considering the difference of the detectability measure in the presence and absence of the background stimulus (red areas in panel (f)). We note that in the motivating example of intruder detection in the courtship situation of weakly electric fish, the detection benefits from the background stimulus precisely for the behaviorally relevant frequency combinations, i.e. for \(f_{s} < f_{b}\) and for \(f_{s} \lessapprox r_{0}\), conditions that are a consequence of the distinct distributions of self-generated frequencies for male and female weakly electric fish [53] (\(f_{s}\) and \(f_{b}\) are the beating frequencies, i.e. they result from the difference of self-generated frequencies of interacting fish).
Last, we address the question of how the detection performance and the nonlinear response, especially the second-order susceptibility \(\chi _{2}(f_{1},f_{2})\), are related. For this purpose, we use the analytical expression for \(\chi _{2}(f_{1},f_{2})\) in combination with the expression for the firing rate modulation and our theory (Eqs. (12) and (13)) to compute ROC curves for a broad range of frequency combinations \(f_{1}, f_{2}\). Figure 12 shows the absolute value of the second-order susceptibility (panel (a)) and the AUC measure of the analytically obtained ROC curves (panel (b)).
Firstly, we observe a very good agreement of the analytically determined AUC measure (Fig. 12b) with the numerically obtained one (Fig. 11d). The small deviations of our analytical theory from the simulation results in the ROC curves found in the previous sections lead to slightly different AUC values for some frequency combinations; the general structure of the detection performance, however, is the same. Secondly, the absolute value of \(\chi _{2}(f_{1}, f_{2})\) has certain symmetry properties (for a detailed discussion, see [62]) which the AUC does not share because of the unequal roles of the two frequencies \(f_{s}\) and \(f_{b}\). Despite this lack of complete symmetry in the AUC, there are still similarities but also characteristic differences to \(\left| \chi _{2}(f_{1},f_{2})\right| \). The most prominent similarities are the strong maximum around \(f_{1} = r_{0}\) (\(f_{s} = r_{0}\)) and the second maximum around the antidiagonal line (highlighted by a green ellipse in both panels of Fig. 12) on which \(f_{1} + f_{2} = r_{0}\) (\(f_{s} + f_{b} = r_{0}\)). The most striking differences are that \(\left| \chi _2(f_{1}, f_{2})\right| \) is large around the horizontal line \(f_{2} = r_{0}\), where the AUC is small, and that the AUC is large around the two diagonals on which \(\left| f_{s} - f_{b}\right| = r_{0}\) (one of them emphasized by a dashed turquoise ellipse in panel (b)), where the nonlinear response is very small (dashed blue ellipse in panel (a)). The difference between the AUC on the vertical line \(f_{s} = r_{0}\) and on the horizontal line \(f_{b} = r_{0}\) is clear evidence of the unequal roles of the two frequencies. To understand why the detection performance is poor on the horizontal line, it suffices to note that for \(f_{b}\) close to \(r_{0}\) the firing rate is strongly modulated both in the presence and in the absence of the signal, which makes a faint signal harder to detect.
How can we explain the diagonal lines around \(\left| f_{s} - f_{b}\right| = r_{0}\) that appear in panel (b) but not in panel (a) for \(f_{1}, f_{2} > 0\) (\(f_{s}, f_{b} > 0\))? Turning once again to Eq. (11), we note that the mixed response consists of two parts: (i) a term proportional to \(\left| \chi _{2}(f_{s}, f_{b})\right| \)—this contribution is strong when the condition \(f_{s} + f_{b} \approx r_{0}\) is fulfilled (antidiagonal highlighted by the green ellipses); (ii) a term proportional to \(\left| \chi _{2}(f_{s}, -f_{b})\right| \). The latter is just the mirrored diagonal line around the \(f_{1}-\)axis of \(\left| \chi _{2}(f_{1}, f_{2})\right| \) (compare the dashed turquoise to the solid turquoise ellipse), where the absolute value of the second-order susceptibility is again comparably high. This leads to a strong mixed response not only for frequency combinations around \(f_{s} + f_{b} = r_{0}\) but also around \(\left| f_{s} - f_{b}\right| = r_{0} \) in the firing rate modulation and thus in the AUC.
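The appearance of response components at the sum and difference frequencies from a weakly nonlinear response can be made explicit with a few lines of code. The quadratic transfer function and all parameter values below are hypothetical, chosen only to expose the mixing terms; they are not the model of the paper.

```python
import numpy as np

fs = 1000                       # sampling rate (Hz)
t = np.arange(fs) / fs          # one second, integer number of periods (no leakage)
f_s, f_b = 3.0, 5.0             # hypothetical signal and background frequencies
a_s, a_b, c = 0.2, 1.0, 0.5     # amplitudes and quadratic coefficient
x = a_s * np.sin(2 * np.pi * f_s * t) + a_b * np.sin(2 * np.pi * f_b * t)

def amplitude_spectrum(y):
    """One-sided amplitude spectrum; bin k corresponds to k Hz for T = 1 s."""
    return 2 * np.abs(np.fft.rfft(y)) / y.size

lin = amplitude_spectrum(x)            # purely linear response: no mixing
quad = amplitude_spectrum(x + c * x**2)  # weakly nonlinear (quadratic) response

# The quadratic term creates spectral lines at f_b - f_s = 2 Hz and
# f_s + f_b = 8 Hz with amplitude c*a_s*a_b = 0.1; the linear response has none.
print(lin[2], lin[8], quad[2], quad[8])
```

This is the mechanism behind the mixed response of Eq. (11): the product of the two sinusoids contributes at both \(f_{s}+f_{b}\) and \(\left| f_{s}-f_{b}\right| \), and detection is boosted whenever one of these combinations comes close to \(r_{0}\).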
4 Summary and conclusions
In this paper, we considered a homogeneous population of LIF neurons driven by two periodic stimuli, one playing the role of a strong periodic background and the other that of a faint signal to be detected. We investigated the detection performance in the excitable and mean-driven regimes and varied the signal and detection parameters. We developed an analytical framework to approximate ROC curves for the population activity and demonstrated that our formulas, which use the linear and the weakly nonlinear response of the instantaneous firing rate, work reasonably well.
In general, we found in both regimes that the effect of enlarging the detection time window is very weak, whereas increasing the signal amplitude leads to a strong improvement in the detectability of the faint signal. For specific frequency combinations and in the mean-driven regime, the detection benefits greatly from the presence of the background stimulus: the detectability of the weak signal increases for \(a_{b} > 0\). We showed that this effect can be traced back to the weakly nonlinear response and in particular to the so-called mixed response to two periodic signals. In marked contrast, in the excitable case, the presence of the background stimulus was only detrimental to the detection of a faint signal.
The beneficial effect of the background periodic stimulus bears some resemblance to vibrational resonance, an effect observed in bistable [66] and excitable systems [67]: a high-frequency (HF) signal can assist the transmission of a weak low-frequency signal. Apart from the occurrence of two periodic drivings, there are, however, important differences between our problem and the setting in which vibrational resonance is obtained. We consider the detection rather than the transmission of the weak periodic signal and use all possible thresholds when calculating the ROC curve. Vibrational resonance (similar to stochastic resonance) is considered for systems with an inherent threshold, and the resonance effect relies on a suboptimal setting of this threshold. The beneficial effect that we have found is not due to such a suboptimal threshold but, as highlighted above, due to the weakly nonlinear response of the single units in our population model.
With respect to our motivating example, the intruder detection in a courtship situation of weakly electric fish, our modeling assumptions still have severe limitations. Firstly, we assumed a homogeneous population of LIF neurons, although encoding populations in the sensory periphery display a pronounced heterogeneity both in mean activity (firing-rate distributions are broad) and in variability (the CV is also broadly distributed). For instance, P-units in weakly electric fish possess firing rates between 50 and 400 Hz and CVs between 0.2 and 0.9 [68]. Furthermore, many neurons in the sensory periphery display a pronounced firing rate adaptation [69, 70] that can be incorporated in the integrate-and-fire framework with additional variables (see e.g. [71,72,73,74,75]). It is unclear how neural heterogeneity and adaptation will affect the detectability of a weak stimulus. These are exciting problems for future research.
Data availability
The datasets generated and analyzed during the current study are available from the corresponding author on reasonable request.
References
J.H. McDermott, The cocktail party problem. Curr. Biol. 19, R1024–R1027 (2009)
P. Martin, A.J. Hudspeth, Compressive nonlinearity in the hair bundle’s active response to mechanical stimulation. Proc. Natl. Acad. Sci. U.S.A. 98, 14386 (2001)
A.A. Faisal, L.P.J. Selen, D.M. Wolpert, Noise in the nervous system. Nat. Rev. Neurosci. 9, 292 (2008)
L. Gammaitoni, P. Hänggi, P. Jung, F. Marchesoni, Stochastic resonance. Rev. Mod. Phys. 70, 223 (1998)
A. Neiman, A. Silchenko, V. Anishchenko, L. Schimansky-Geier, Stochastic resonance: noise-enhanced phase coherence. Phys. Rev. E 58, 7118 (1998)
T. Wellens, V. Shatokhin, A. Buchleitner, Stochastic resonance. Rep. Prog. Phys. 67, 45 (2004)
M.D. McDonnell, D. Abbott, What is stochastic resonance? Definitions, misconceptions, debates, and its relevance to biology. PLoS Comput. Biol. 5, e1000348 (2009)
M.D. McDonnell, L.M. Ward, The benefits of noise in neural systems: bridging theory and experiment. Nat. Rev. Neurosci. 12, 415 (2011)
S.M. Bezrukov, Stochastic resonance as an inherent property of rate-modulated random series of events. Phys. Lett. A 248, 29 (1998)
I. Goychuk, Information transfer with rate-modulated Poisson processes: a simple model for nonstationary stochastic resonance. Phys. Rev. E 64, 021909 (2001)
L. Gammaitoni, Stochastic resonance and the dithering effect in threshold physical systems. Phys. Rev. E 52, 4691 (1995)
F. Chapeau-Blondeau, Stochastic resonance in the Heaviside nonlinearity with white noise and arbitrary periodic signal. Phys. Rev. E 53, 5469 (1996)
M.M. Alibegov, Stochastic resonance in threshold systems. Phys. Rev. E 59, 4841 (1999)
N.G. Stocks, Suprathreshold stochastic resonance in multilevel threshold systems. Phys. Rev. Lett. 84, 2310 (2000)
S. Blanchard, D. Rousseau, F. Chapeau-Blondeau, Noise enhancement of signal transduction by parallel arrays of nonlinear neurons with threshold and saturation. Neurocomputing 71, 333 (2007)
R. Benzi, A. Sutera, A. Vulpiani, The mechanism of stochastic resonance. J. Phys. A 14, L453 (1981)
R. Benzi, G. Parisi, A. Sutera, A. Vulpiani, Stochastic resonance in climatic change. Tellus 34, 10 (1982)
B. McNamara, K. Wiesenfeld, Theory of stochastic resonance. Phys. Rev. A 39, 4854 (1989)
P. Jung, P. Hänggi, Amplification of small signals via stochastic resonance. Phys. Rev. A 44, 8032 (1991)
V.I. Melnikov, Schmitt trigger: a solvable model of stochastic resonance. Phys. Rev. E 48, 2481 (1993)
A. Neiman, L. Schimansky-Geier, Stochastic resonance in bistable systems driven by harmonic noise. Phys. Rev. Lett. 72, 2988 (1994)
M.H. Choi, R.F. Fox, P. Jung, Quantifying stochastic resonance in bistable systems: response vs residence-time distribution functions. Phys. Rev. E 57, 6335 (1998)
P. Landa, V. Ushakov, J. Kurths, Rigorous theory of stochastic resonance in overdamped bistable oscillators for weak signals. Chaos Soliton Fract. 30, 574 (2006)
M. Sahoo, S. Saikia, M.C. Mahato, A.M. Jayannavar, Stochastic resonance and heat fluctuations in a driven double-well system. Physica A 387, 6284 (2008)
A. Longtin, Stochastic resonance in neuron models. J. Stat. Phys. 70, 309 (1993)
J. Collins, C. Chow, T. Imhoff, Aperiodic stochastic resonance in excitable systems. Phys. Rev. E 52, R3321 (1995)
H.E. Plesser, T. Geisel, Markov analysis of stochastic resonance in a periodically driven integrate-and-fire neuron. Phys. Rev. E 59, 7008 (1999)
B. Lindner, L. Schimansky-Geier, Coherence and stochastic resonance in a two-state system. Phys. Rev. E 61, 6103 (2000)
J.A. Freund, J. Kienert, L. Schimansky-Geier, B. Beisner, A. Neiman, D.F. Russell, T. Yakusheva, F. Moss, Behavioral stochastic resonance: how a noisy army betrays its outpost. Phys. Rev. E 63, 031910 (2001)
Y. Gai, B. Doiron, J. Rinzel, Slope-based stochastic resonance: how noise enables phasic neurons to encode slow signals. PLoS Comput. Biol. 6, e1000825 (2010)
S. Fauve, F. Heslot, Stochastic resonance in a bistable system. Phys. Lett. A 97, 5 (1983)
B. McNamara, K. Wiesenfeld, R. Roy, Observation of stochastic resonance in a ring laser. Phys. Rev. Lett. 60, 2626 (1988)
S.M. Bezrukov, I. Vodyanoy, Noise-induced enhancement of signal transduction across voltage-dependent ion channels. Nature 378, 362 (1995)
J.K. Douglass, L. Wilkens, E. Pantazelou, F. Moss, Noise enhancement of information transfer in crayfish mechanoreceptor by stochastic resonance. Nature 365, 337 (1993)
J.E. Levin, J.P. Miller, Broadband neural encoding in the cricket cercal sensory system enhanced by stochastic resonance. Nature 380, 165 (1996)
P.E. Greenwood, L.M. Ward, D.F. Russell, A. Neiman, F. Moss, Stochastic resonance enhances the electrosensory information available to paddlefish for prey capture. Phys. Rev. Lett. 84, 4773 (2000)
B.J. Gluckman, T.I. Netoff, E.J. Neel, W.L. Ditto, M.L. Spano, S.J. Schiff, Stochastic resonance in a neuronal network from mammalian brain. Phys. Rev. Lett. 77, 4098 (1996)
I. Hidaka, D. Nozaki, Y. Yamamoto, Functional stochastic resonance in the brain: noise induced sensitization of baroreflex system. Phys. Rev. Lett. 85, 3740 (2000)
G. Schmid, I. Goychuk, P. Hänggi, Effect of channel block on the spiking activity of excitable membranes in a stochastic Hodgkin–Huxley model. Phys. Biol. 1, 61 (2004)
S.D. Burton, G.B. Ermentrout, N.N. Urban, Intrinsic heterogeneity in oscillatory dynamics limits correlation-induced neural synchronization. J. Neurophysiol. 108, 2115 (2012)
P. Krauss, K. Tziridis, C. Metzner, A. Schilling, U. Hoppe, H. Schulze, Stochastic resonance controlled upregulation of internal noise after hearing loss as a putative cause of tinnitus-related neuronal hyperactivity. Front. Neurosci. 10, 597 (2016)
G. Knoll, B. Lindner, Recurrence-mediated suprathreshold stochastic resonance. J. Comput. Neurosci. 49, 407 (2021)
G. Knoll, B. Lindner, Information transmission in recurrent networks: consequences of network noise for synchronous and asynchronous signal encoding. Phys. Rev. E 105, 044411 (2022)
G. Wenning, K. Obermayer, Activity driven adaptive stochastic resonance. Phys. Rev. Lett. 90, 120602 (2003)
P. Krauss, C. Metzner, A. Schilling, C. Schütz, K. Tziridis, B. Fabry, H. Schulze, Adaptive stochastic resonance for unknown and variable input signals. Sci. Rep. 7, 2450 (2017)
F. Li, F. Duan, F. Chapeau-Blondeau, D. Abbott, Signal estimation and filtering from quantized observations via adaptive stochastic resonance. Phys. Rev. E 103, 052108 (2021)
B.W. Knight, Dynamics of encoding in a population of neurons. J. Gen. Physiol. 59, 734 (1972)
W. Gerstner, Population dynamics of spiking neurons: fast transients, asynchronous states, and locking. Neural Comput. 12, 43 (2000)
N. Fourcaud-Trocmé, D. Hansel, C. van Vreeswijk, N. Brunel, How spike generation mechanisms determine the neuronal response to fluctuating inputs. J. Neurosci. 23, 11628 (2003)
A. Mazzoni, S. Panzeri, N.K. Logothetis, N. Brunel, Encoding of naturalistic stimuli by local field potential spectra in networks of excitatory and inhibitory neurons. PLoS Comput. Biol. 4, e1000239 (2008)
W. Gerstner, W.M. Kistler, R. Naud, L. Paninski, Neuronal Dynamics From Single Neurons to Networks and Models of Cognition (Cambridge University Press, Cambridge, 2014)
S.P. Muscinelli, W. Gerstner, T. Schwalger, How single neuron properties shape chaotic dynamics and signal transmission in random neural networks. PLoS Comput. Biol. 15, e1007122 (2019)
J. Henninger, R. Krahe, F. Kirschbaum, J. Grewe, J. Benda, Statistics of natural communication signals observed in the wild identify important yet neglected stimulus regimes in weakly electric fish. J. Neurosci. 38, 5456 (2018)
G. Engler, G. Zupanc, Differential production of chirping behavior evoked by electrical stimulation of the weakly electric fish, Apteronotus leptorhynchus. J. Comp. Physiol. A 187, 747 (2001)
M.E. Nelson, Z. Xu, J.R. Payne, Characterization and modeling of P-type electrosensory afferent responses to amplitude modulations in a wave-type electric fish. J. Comp. Physiol. A 181, 532 (1997)
R.D. Vilela, B. Lindner, Are the input parameters of white-noise-driven integrate & fire neurons uniquely determined by rate and CV? J. Theor. Biol. 257, 90 (2009)
D. Bernardi, B. Lindner, Receiver operating characteristic curves for a simple stochastic process that carries a static signal. Phys. Rev. E 101, 062132 (2020)
J.W. Middleton, M.J. Chacron, B. Lindner, A. Longtin, Firing statistics of a neuron model driven by long-range correlated noise. Phys. Rev. E 68, 021920 (2003)
B. Lindner, Superposition of many independent spike trains is generally not a Poisson process. Phys. Rev. E 73, 022901 (2006)
M. Abramowitz, I.A. Stegun, Handbook of Mathematical Functions (Dover, New York, 1970)
N. Fourcaud, N. Brunel, Dynamics of the firing probability of noisy integrate-and-fire neurons. Neural Comput. 14, 2057 (2002)
S. Voronenko, B. Lindner, Nonlinear response of noisy neurons. New J. Phys. 19, 033038 (2017)
A.R. Houweling, M. Brecht, Behavioural report of single neuron stimulation in somatosensory cortex. Nature 451, 65 (2008)
D. Bernardi, G. Doron, M. Brecht, B. Lindner, A network model of the barrel cortex combined with a differentiator detector reproduces features of the behavioral response to single-neuron stimulation. PLoS Comput. Biol. 17, e1007831 (2021)
J. Franzen, L. Ramlow, B. Lindner, The steady state and response to a periodic stimulation of the firing rate for a theta neuron with correlated noise. J. Comput. Neurosci. (accepted) (2022)
P.S. Landa, P.V.E. McClintock, Vibrational resonance. J. Phys. A Math. Gen. 33, L433 (2000)
E. Ullner, A. Zaikin, J. García-Ojalvo, R. Báscones, J. Kurths, Vibrational resonance and vibrational propagation in excitable systems. Phys. Lett. A 312, 348 (2003)
J. Grewe, A. Kruscha, B. Lindner, J. Benda, Synchronous spikes are necessary but not sufficient for a synchrony code. PNAS 114, E1977 (2017)
J. Benda, A.V.M. Herz, A universal model for spike-frequency adaptation. Neural Comput. 15, 2523 (2003)
J. Benda, A. Longtin, L. Maler, Spike-frequency adaptation separates transient communication signals from background oscillations. J. Neurosci. 25, 2312 (2005)
Y.H. Liu, X.J. Wang, Spike-frequency adaptation of a generalized leaky integrate-and-fire model neuron. J. Comput. Neurosci. 10, 25 (2001)
M.J. Chacron, A. Longtin, L. Maler, Negative interspike interval correlations increase the neuronal capacity for encoding time-dependent stimuli. J. Neurosci. 21, 5328–5343 (2001)
R. Naud, N. Marcille, C. Clopath, W. Gerstner, Firing patterns in the adaptive exponential integrate-and-fire model. Biol. Cybern. 99, 335 (2008)
T. Schwalger, B. Lindner, Analytical approach to an integrate-and-fire model with spike-triggered adaptation. Phys. Rev. E 92, 062703 (2015)
L. Ramlow, B. Lindner, Interspike interval correlations in neuron models with adaptation and correlated noise. PLoS Comput. Biol. 17, e1009261 (2021)
L.M. Ricciardi, Diffusion Processes and Related Topics on Biology (Springer, Berlin, 1977)
B. Lindner, L. Schimansky-Geier, Transmission of noise coded versus additive signals through a neuronal ensemble. Phys. Rev. Lett. 86, 2934 (2001)
N. Brunel, F.S. Chance, N. Fourcaud, L.F. Abbott, Effects of synaptic noise and filtering on the frequency response of spiking neurons. Phys. Rev. Lett. 86, 2186 (2001)
B. Lindner, Coherence and Stochastic Resonance in Nonlinear Dynamical Systems (Logos-Verlag, Berlin, 2002)
Acknowledgements
We would like to acknowledge many helpful discussions with Jan Benda, Jan Grewe and Alexandra Barayeu (University of Tübingen, Germany). We are particularly indebted to Alexandra Barayeu for an intense exchange on the detection performance as a function of the two stimulus frequencies.
Funding
Open Access funding enabled and organized by Projekt DEAL. This research was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) as part of the SPP 2205 - LI 1046/1.
Contributions
M.S. and B.L. performed the analytical calculations. M.S. performed the stochastic simulations and the data analysis, supervised by B.L.. All authors contributed equally to the final manuscript.
Appendix A: Weakly nonlinear response theory
We briefly state the important measures needed for the applied model (for details, see [62]). Note that here we consider the LIF model with white noise and a vanishing refractory period, which somewhat simplifies the analytical results given below.
The response of the firing rate r(t) to a weak signal \(\varepsilon s(t)\) \((\varepsilon \ll 1)\) can be approximated by the first terms of a Volterra series
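The display equation (A1) did not survive extraction. The first terms of the Volterra series described here have the standard form used by Voronenko and Lindner [62] (a reconstruction; prefactor conventions should be checked against the original):

```latex
r(t) = r_0 + \varepsilon \int_0^{\infty} \mathrm{d}\tau \, K_1(\tau)\, s(t-\tau)
     + \varepsilon^2 \int_0^{\infty}\!\!\int_0^{\infty} \mathrm{d}\tau_1\, \mathrm{d}\tau_2\,
       K_2(\tau_1, \tau_2)\, s(t-\tau_1)\, s(t-\tau_2) + \mathcal{O}(\varepsilon^3)
\tag{A1}
```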
Here, \(r_{0}\) is the spontaneous firing rate, i.e. the firing rate in the absence of a signal (\(\varepsilon = 0\)), and \(K_{1}\) and \(K_{2}\) are the first-order and second-order response kernels. For the white-noise-driven LIF model, \(r_{0}\) is given by [76]
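The display with the explicit rate formula is missing here. For the nondimensionalized white-noise-driven LIF model with threshold \(v_{T}\), reset \(v_{R}\), mean input \(\mu \), noise intensity \(D\), and vanishing refractory period, the classical result of Ricciardi [76] reads \(r_{0} = \left[ \sqrt{\pi } \int _{(\mu - v_{T})/\sqrt{2D}}^{(\mu - v_{R})/\sqrt{2D}} \mathrm{d}z\, e^{z^{2}} \mathrm{erfc}(z)\right] ^{-1}\) (time measured in units of the membrane time constant). A minimal numerical sketch of this formula, with hypothetical parameter values:

```python
import math

def lif_rate(mu, D, v_r=0.0, v_t=1.0, n=20000):
    """Spontaneous firing rate of the white-noise-driven LIF neuron
    (time in units of the membrane time constant, zero refractory period),
    via trapezoidal quadrature of the Siegert-type formula."""
    a = (mu - v_t) / math.sqrt(2 * D)   # lower integration limit
    b = (mu - v_r) / math.sqrt(2 * D)   # upper integration limit
    h = (b - a) / n
    total = 0.0
    for i in range(n + 1):
        z = a + i * h
        w = 0.5 if i in (0, n) else 1.0   # trapezoid end-point weights
        total += w * math.exp(z * z) * math.erfc(z)
    return 1.0 / (math.sqrt(math.pi) * h * total)

# mean-driven regime with weak noise: the rate should approach the
# deterministic LIF rate 1/ln((mu - v_r)/(mu - v_t))
print(lif_rate(1.5, 0.01), 1.0 / math.log(3.0))
```

The weak-noise check against the deterministic rate is a convenient sanity test of the integration limits and of the overall prefactor.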
In the frequency domain, Eq. (A1) reads
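The frequency-domain display is likewise missing. In the convention of [62], it should take the form (reconstruction; to be checked against the original):

```latex
\tilde{r}(f) = r_0\, \delta(f) + \varepsilon\, \chi_1(f)\, \tilde{s}(f)
  + \varepsilon^2 \int_{-\infty}^{\infty} \mathrm{d}f'\, \chi_2(f', f - f')\,
    \tilde{s}(f')\, \tilde{s}(f - f')
```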
where \(\chi _{1}\) and \(\chi _{2}\) are the Fourier transforms of the respective kernels
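The defining relations, again reconstructed (up to the sign convention of the Fourier transform), are:

```latex
\chi_1(f) = \int_0^{\infty} \mathrm{d}\tau\, K_1(\tau)\, e^{2\pi i f \tau}, \qquad
\chi_2(f_1, f_2) = \int_0^{\infty}\!\!\int_0^{\infty} \mathrm{d}\tau_1\, \mathrm{d}\tau_2\,
  K_2(\tau_1, \tau_2)\, e^{2\pi i (f_1 \tau_1 + f_2 \tau_2)}
```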
In terms of the parabolic cylinder functions \({\mathcal {D}}_{k}(x)\) [60], the linear response function \(\chi _{1}\) [77] is given by
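The corresponding display did not survive extraction. Up to notational conventions, the expression from [77] reads (a reconstruction; signs and prefactors should be checked against the original):

```latex
\chi_1(f) = \frac{r_0\, \mathrm{i}\omega/\sqrt{D}}{\mathrm{i}\omega - 1}\;
\frac{\mathcal{D}_{\mathrm{i}\omega - 1}\!\left(\frac{\mu - v_T}{\sqrt{D}}\right)
      - e^{\Delta}\, \mathcal{D}_{\mathrm{i}\omega - 1}\!\left(\frac{\mu - v_R}{\sqrt{D}}\right)}
     {\mathcal{D}_{\mathrm{i}\omega}\!\left(\frac{\mu - v_T}{\sqrt{D}}\right)
      - e^{\Delta}\, \mathcal{D}_{\mathrm{i}\omega}\!\left(\frac{\mu - v_R}{\sqrt{D}}\right)},
\qquad \omega = 2\pi f
```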
with \(\Delta = \left[ v_{R}^{2} - v_{T}^{2} + 2\mu (v_{T} - v_{R})\right] /(4D)\); an equivalent expression in terms of hypergeometric functions was given by Brunel et al. [78] (for the equivalence of the expressions, see [79]). The second-order response function \(\chi _{2}\) has been calculated by Voronenko and Lindner [62]; for the lengthy explicit expression, we refer the reader to that work.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Schlungbaum, M., Lindner, B. Detecting a periodic signal by a population of spiking neurons in the weakly nonlinear response regime. Eur. Phys. J. E 46, 108 (2023). https://doi.org/10.1140/epje/s10189-023-00371-x