# Random number generation system improving simulations of stochastic models of neural cells


## Abstract

The purpose of this work is to speed up simulations of neural tissue based on the stochastic version of the Hodgkin–Huxley model. The authors achieve this by introducing a system that supplies random values with the desired distribution to the simulation process. The system consists of two parts. The first is a high-entropy, fast parallel random number generator combining a hardware true random number generator with a graphics processing unit (GPU) implementation of a pseudorandom generation algorithm. The second is a Gaussian distribution approximation algorithm based on a set of uniform distribution generators. The authors present hardware implementation details of the system, test results for each part separately, and results for the whole system in a neural cell simulation task.

### Keywords

Random number generator · Gaussian distribution · Uniform distribution · Hardware implementation · Hodgkin–Huxley model · Graphics processing unit · GPU

### Mathematics Subject Classification

65C10 (Numerical analysis – Probabilistic methods, simulation and stochastic differential equations – Random number generation)

## 1 Introduction

Since its very beginnings, science has tried to understand the processes that take place in nature. One way to truly understand the surrounding world is simulation. Unfortunately, the complexity of natural processes lengthens simulation times and dramatically increases the demand for computing power. This paper focuses on these problems and proposes solutions that speed up simulations of natural processes and simplify parts of the computation. We concentrate on simulations of neural cells based on the Hodgkin–Huxley model [6], especially on its stochastic version [9].

Simulations of biological processes (e.g. the flow of electric potential across a neural cell membrane) often require large amounts of random values [5] with a specified distribution. Faithfully reproducing natural stochastic processes in a simulation environment demands high-entropy random generators, yet increasing the generation speed of random values often decreases their randomness. Building a high-speed random generator without losing randomness could be a milestone for simulations of biological processes.

A common recent approach in simulation tasks is to use a GPU implementation, which makes the well-known, very fast parallel pseudorandom generators a natural choice. Higher levels of randomness, however, require more complex and hence slower algorithms such as the Mersenne Twister. The main disadvantage of these solutions is that they can generate only pseudorandom values; in the natural processes we are trying to simulate, there is no such thing as pseudorandomness.

On the other hand, a number of truly random hardware generators are available on the market. These generators are based on phenomena such as temperature fluctuations, cosmic radiation, pulsar flashes, etc. Their main disadvantages are the generation speed and an often very slow random data transfer rate.

In this paper we introduce a solution that combines the advantages of both families of random generators.

## 2 Hodgkin–Huxley neural cell model

### 2.1 Stochastic version of the model

## 3 Random number generation system overview

Our proposed random number generation system combines a fast parallel pseudorandom algorithm with a high-entropy true random generator. The advantages of each neutralize the disadvantages of the other.

The main idea of our system is a fast parallel pseudorandom generator with a periodically replaced seed. A new seed is taken from a true random number generator and replaces the old seed of the parallel pseudorandom generator just before the generator reaches its period. Our system can work with any generator; in our test implementation we use a generator based on inverter rings [13, 14] implemented in an FPGA. The parallel pseudorandom algorithm was implemented in the NVIDIA CUDA C programming language and runs on the GPU [2].
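The seed-swapping idea can be sketched as follows. This is a minimal illustration in Python rather than CUDA C; the class name, the swap interval, and the use of `os.urandom` as a stand-in for the hardware true random number generator are all assumptions for the example, not the paper's implementation.

```python
import os
import struct

class SwappedSeedXorshift:
    """32-bit Marsaglia xorshift generator whose seed is periodically
    replaced with fresh entropy (os.urandom stands in for the TRNG)."""

    def __init__(self, swap_interval=1_000_000):
        self.swap_interval = swap_interval  # draws between seed swaps
        self._reseed()

    def _reseed(self):
        # Pull 4 truly random bytes; avoid the forbidden all-zero state.
        self.state = struct.unpack("<I", os.urandom(4))[0] or 1
        self.count = 0

    def next(self):
        # Swap in a new truly random seed before the period is exhausted.
        if self.count >= self.swap_interval:
            self._reseed()
        # Marsaglia's 32-bit xorshift with shifts (13, 17, 5).
        x = self.state
        x ^= (x << 13) & 0xFFFFFFFF
        x ^= x >> 17
        x ^= (x << 5) & 0xFFFFFFFF
        self.state = x
        self.count += 1
        return x

gen = SwappedSeedXorshift(swap_interval=1000)
sample = [gen.next() for _ in range(5000)]
```

In the real system the xorshift step runs in parallel on the GPU and the reseed is driven by the control application; the sequential loop above only shows the state update and the swap condition.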

### 3.1 Implementation

The true random generator is connected to a PC-class computer through a data link. The software application controls the parallel GPU algorithm and reads data from the true random number generator. Just before the GPU pseudorandom generator reaches its period, the control application swaps its seed. The swapping algorithm is discussed in the following subsection.

Some works show that a pseudorandom generator implemented in an FPGA can be more efficient [15]. However, for our purpose, namely simulations of neural cells on GPUs, such an implementation would require transmitting large amounts of random data between the FPGA and the graphics card over the PCI bus, which is very slow [2].

#### 3.1.1 Seed replace algorithm

The generation thread reads the seed at the beginning of each generation loop, and this read is the only critical section of the algorithm. It must be ensured that the seed data stays consistent while it is being read by the generation thread.
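One hedged way to realize this critical section is a lock-guarded seed holder, sketched below in Python for illustration; the class and method names are invented for this example, and the paper does not specify the actual synchronization primitive used.

```python
import os
import threading

class SeedBox:
    """Holds the current seed; the lock makes the read and the swap the
    only critical section, so the generation thread always observes a
    consistent seed value."""

    def __init__(self):
        self._lock = threading.Lock()
        self._seed = os.urandom(16)  # os.urandom stands in for the TRNG link

    def swap(self, new_seed: bytes) -> None:
        # Called by the control thread just before the generator's period.
        with self._lock:
            self._seed = new_seed

    def read(self) -> bytes:
        # Called by the generation thread at the top of each generation loop.
        with self._lock:
            return self._seed

box = SeedBox()
writer = threading.Thread(target=box.swap, args=(b"\x01" * 16,))
writer.start()
writer.join()
seed = box.read()
```

Because the seed is read exactly once per generation loop, the lock is held only briefly and the swap never tears a partially written seed.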

## 4 Approximation of the Gaussian distribution algorithm overview

The second part of the solution presented in this paper is an algorithm that approximates the Gaussian distribution. It is based on a set of uniform distribution generators: the output value is calculated as the mean of draws from this set. The mathematical background and implementation details are presented in the following subsections.
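The mean-of-uniforms idea rests on the central limit theorem and can be sketched as below. The standardization to zero mean and unit variance is an assumption of this illustration (the paper may scale the output differently), and the function name is invented for the example.

```python
import random

def approx_gauss(n_uniform: int, rng=random.random) -> float:
    """Approximate a standard normal draw as the standardized mean of
    n_uniform independent U(0,1) draws (central limit theorem).

    The mean of N uniforms has mean 1/2 and variance 1/(12N), so the
    scaling below yields approximately N(0, 1)."""
    m = sum(rng() for _ in range(n_uniform)) / n_uniform
    return (m - 0.5) * (12 * n_uniform) ** 0.5

random.seed(42)
draws = [approx_gauss(350) for _ in range(10000)]
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

Each uniform draw is independent of the others, so in the actual system the component uniform generators can run in parallel and only the final averaging is sequential.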

### 4.1 Mathematical background

### 4.2 Implementation

## 5 Tests of the sample implementations

This section presents tests of sample implementations of the system. The next subsection shows the results of statistical tests of the generation system with a periodically swapped seed, while the second focuses on quality tests of the approximation algorithm. The last subsection presents the use of a sample implementation of the complete generating system (consisting of both parts) in a neural cell simulation task.

### 5.1 Random generator system

The quality of the random data generated with our system was rated with the statistical test suites DIEHARD [7] and SP800-22 [8]. For the tests we chose a parallel GPU implementation of Xorshift with seeds swapped from the true random number generator based on inverter rings [13, 14] implemented in a Xilinx Virtex 5 FPGA. The tests were run five times, and a new set of random data was generated for every run. The results presented in the remainder of this subsection are the means over all five runs.

### 5.2 Approximation algorithm

#### 5.2.1 \(\chi ^2\) goodness-of-fit test

This part of the paper is devoted to the \(\chi ^2\) goodness-of-fit test, as the most popular and commonly used such test. Any statistical test whose statistic follows the \(\chi ^2\) distribution under the null hypothesis can serve as a \(\chi ^2\) test. By stating normality as the null hypothesis, the \(\chi ^2\) test makes it possible to verify the normality of the tested data.
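A minimal sketch of such a test, using only the Python standard library: the statistic compares observed bin counts against the counts expected under N(0, 1). The bin edges, bin count, and sample size here are arbitrary choices for the illustration, not the paper's test configuration.

```python
import math
import random

def normal_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def chi_square_stat(data, edges=(-2.0, -1.0, 0.0, 1.0, 2.0)):
    """Chi-square goodness-of-fit statistic of `data` against N(0,1),
    using fixed bin edges with open-ended outer bins."""
    bounds = [-math.inf, *edges, math.inf]
    n = len(data)
    stat = 0.0
    for lo, hi in zip(bounds, bounds[1:]):
        observed = sum(1 for x in data if lo <= x < hi)
        expected = n * (normal_cdf(hi) - normal_cdf(lo))
        stat += (observed - expected) ** 2 / expected
    return stat

random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(2000)]
stat = chi_square_stat(sample)
# With 6 bins the statistic has 5 degrees of freedom; the 5% critical
# value is about 11.07, so values well below that do not reject normality.
```

For truly normal input the statistic stays near its expected value (the number of degrees of freedom), while clearly non-normal data inflates it sharply.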

**Table 1** Probabilities obtained in the \(\chi ^2\) test

| No. of attempt | Matlab normrnd | Box–Muller sine | Box–Muller cosine | Our generator, \(N=2\) | Our generator, \(N=350\) | Our generator, \(N=2{,}000\) |
|---|---|---|---|---|---|---|
| 1 | 0.0674 | 0.7297 | 0.5024 | 0 | 0.3867 | 0.0387 |
| 2 | 0.7060 | 0.3838 | 0.2947 | 0 | 0.1958 | 0.9801 |
| 3 | 0.0688 | 0.4028 | 0.3683 | 0 | 0.0005 | 0.8320 |
| 4 | 0.7121 | 0.7687 | 0.7022 | 0 | 0.4862 | 0.1109 |
| 5 | 0.4274 | 0.6356 | 0.2888 | 0 | 0.0020 | 0.9337 |
| 6 | 0.7767 | 0.6701 | 0.3087 | 0 | 0.1869 | 0.9285 |
| 7 | 0.5033 | 0.3105 | 0.9662 | 0 | 0.0028 | 0.7321 |
| 8 | 0.7854 | 0.1348 | 0.0350 | 0 | 0.0021 | 0.4289 |
| 9 | 0.9566 | 0.0501 | 0.4856 | 0 | 0.2353 | 0.6371 |
| 10 | 0.7536 | 0.8635 | 0.2280 | 0 | 0.0041 | 0.9003 |
| Mean value | 0.5757 | 0.4950 | 0.4180 | 0 | 0.1502 | 0.6522 |

The probabilities in Table 1 show how much the results varied. We tested the generator normrnd built into the MATLAB environment and the commonly used Box–Muller generator [1], in both its sine and cosine variants, against our generator with different numbers of uniform component generators (\(N = 2\), \(350\), \(2{,}000\)). For each generator we performed \(10\) independent attempts and computed the mean of the obtained values. The results show that for a small number \(N\) of uniform generators the mean probability is very small, which forces us to reject the hypothesis that our data are normal. It is also easy to see that the mean probability grows with the number of generators, and for \(N=2{,}000\) it exceeds that of the commonly used Box–Muller generator, which suggests that with an even higher number of uniform generators we could obtain probability values close to 1.
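Why larger \(N\) passes normality tests more easily can be seen from the shape of the distribution itself: the excess kurtosis of the mean of \(N\) uniforms is \(-1.2/N\), which shrinks toward the normal value of 0 as \(N\) grows. A small sketch (sample sizes and the choice of \(N = 2\) versus \(N = 32\) are arbitrary for the illustration):

```python
import random

def excess_kurtosis(xs):
    # Sample excess kurtosis: fourth central moment over squared
    # variance, minus 3 (the value for a normal distribution).
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / (m2 * m2) - 3.0

def mean_of_uniforms(n_uniform, n_samples):
    return [sum(random.random() for _ in range(n_uniform)) / n_uniform
            for _ in range(n_samples)]

random.seed(1)
k2 = excess_kurtosis(mean_of_uniforms(2, 20000))    # theory: -1.2/2  = -0.6
k32 = excess_kurtosis(mean_of_uniforms(32, 20000))  # theory: -1.2/32 ≈ -0.04
```

The measured kurtosis moves much closer to zero for the larger \(N\), matching the growing \(\chi ^2\) probabilities observed in Table 1.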

In the next section we also show histograms for chosen numbers of generators, making it possible to compare data from our generator with the true Gaussian distribution curve.

#### 5.2.2 Histograms

**Table 2** Errors between the normalized histograms and the Gaussian curve

| No. of attempt | Matlab normrnd | Box–Muller sine | Box–Muller cosine | Our generator, \(N=3\) | Our generator, \(N=350\) | Our generator, \(N=2{,}000\) |
|---|---|---|---|---|---|---|
| 1 | 0.0155 | 0.0172 | 0.0169 | 0.4332 | 0.0179 | 0.0183 |
| 2 | 0.0165 | 0.0162 | 0.0175 | 0.4328 | 0.0174 | 0.0179 |
| 3 | 0.0155 | 0.0168 | 0.0171 | 0.4339 | 0.0190 | 0.0173 |
| 4 | 0.0165 | 0.0163 | 0.0175 | 0.4345 | 0.0173 | 0.0174 |
| 5 | 0.0170 | 0.0165 | 0.0170 | 0.4327 | 0.0184 | 0.0183 |
| 6 | 0.0163 | 0.0172 | 0.0174 | 0.4332 | 0.0177 | 0.0176 |
| 7 | 0.0166 | 0.0165 | 0.0167 | 0.4332 | 0.0190 | 0.0183 |
| 8 | 0.0160 | 0.0175 | 0.0163 | 0.4315 | 0.0193 | 0.0181 |
| 9 | 0.0188 | 0.0198 | 0.0176 | 0.4340 | 0.0185 | 0.0180 |
| 10 | 0.0166 | 0.0186 | 0.0180 | 0.4326 | 0.0177 | 0.0180 |
| Mean value | 0.0165 | 0.0173 | 0.0172 | 0.4332 | 0.0182 | 0.0179 |

The data collected from our generator had to be normalized. This was done by dividing the bar values by the area beneath the curve traced by the histogram. The new bar values were then used to determine the error between the histogram and the Gaussian curve, computed as the sum of the differences between each bar value and the corresponding value of the Gaussian distribution. Both the histograms (Fig. 7) and the errors (Table 2) confirm that, with a properly chosen number of component uniform generators, our generator does not deviate from the results obtained with the MATLAB normrnd or Box–Muller generators.
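The normalization and error measure described above can be sketched as follows; the bin count, the use of absolute differences, and the evaluation of the Gaussian density at bin centres are assumptions of this illustration, since the text does not fix those details.

```python
import math
import random

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    # Standard Gaussian density.
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def histogram_error(data, n_bins=20):
    """Normalize histogram bars by the area beneath them, then sum the
    absolute differences between each bar and the Gaussian density at
    the bin centre."""
    lo, hi = min(data), max(data)
    width = (hi - lo) / n_bins
    counts = [0] * n_bins
    for x in data:
        i = min(int((x - lo) / width), n_bins - 1)
        counts[i] += 1
    area = sum(c * width for c in counts)
    bars = [c / area for c in counts]  # divide bar values by total area
    centres = [lo + (i + 0.5) * width for i in range(n_bins)]
    return sum(abs(b - gaussian_pdf(c)) for b, c in zip(bars, centres))

random.seed(7)
err = histogram_error([random.gauss(0.0, 1.0) for _ in range(10000)])
```

After normalization the bars integrate to one, so they are directly comparable with the Gaussian density and the error stays small for genuinely normal data.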

A huge advantage of our generator is that its calculations can be parallelized, speeding up the generation of large amounts of data.

### 5.3 Combined system in a neural cell simulation task

The last part of the tests was, from our point of view, the most important: using our system in a neural cell simulation. The cell model was based on the stochastic Hodgkin–Huxley model. We used the combined system of the parallel pseudorandom generator with a swapped seed and the Gaussian distribution approximation algorithm, and evaluated the shape of the action potential curve of the simulated cell. The input current and internal cell parameters were identical in all tests and did not influence the results. The test model was built with the Xorshift parallel random generator, the true random generator based on inverter rings implemented in an FPGA, and the Gaussian distribution approximation algorithm with a varying number of uniform generators.

## 6 Summary and conclusion

The tests show that the properties of a simple and fast random number generator can be improved by our system. In our example, the parallel implementation of the Xorshift pseudorandom generator passed very restrictive statistical tests. The solution combines the benefits of high-entropy hardware random generators with the speed of parallel pseudorandom algorithms.

The Gaussian distribution approximation algorithm is a fast alternative to the commonly used Box–Muller transformation. Our tests showed that the quality of the generated random stream is adequate, so it could be used in various areas.

The last test proved the usability of the proposed system in neural cell simulations based on the stochastic version of the Hodgkin–Huxley model. Using only eight uniform distribution generators we were able to produce an output signal as good as that of the reference model.

In conclusion, the proposed system proves its usability and functionality in neural cell simulation tasks. Using this solution could yield faster simulations without losing accuracy.

### References

- 1. Box GEP, Muller ME (1958) A note on the generation of random normal deviates. Ann Math Stat 29(2):610–611
- 2. NVIDIA Corp (2012) NVIDIA CUDA C Programming Guide, version 4.0. NVIDIA
- 3. Destexhe A, Mainen ZF, Sejnowski TJ (1994) Synthesis of models for excitable membranes, synaptic transmission and neuromodulation using a common kinetic formalism. J Comput Neurosci 3(1):195–230
- 4. Feller W (1968) An introduction to probability theory and its applications, vol 1, chapter VII. Wiley, New York
- 5. Gugała K, Swietlicka A, Jurkowlaniec A, Rybarczyk A (2011) Parallel simulation of stochastic dendritic neurons using NVIDIA GPUs with CUDA C. Elektronika: konstrukcje, technologie, zastosowania 12(2011):59–61
- 6. Hodgkin AL, Huxley AF (1952) A quantitative description of membrane current and its application to conduction and excitation in nerve. J Physiol 117:500–544
- 7. Marsaglia G (1995) The Marsaglia random number CDROM (online)
- 8. Rukhin A, Soto J, Nechvatal J, Smid M, Barker E, Leigh S, Levenson M, Banks D, Vangel M, Heckert A, Dray J, Vo S (2010) A statistical test suite for random and pseudorandom number generators for cryptographic applications. NIST Special Publication 800-22 (Revision 1a)
- 9. Saarinen A, Linne M-L (2006) Modeling single neuron behavior using stochastic differential equations. Neurocomputing 69:1091–1096
- 10. Schneidman E, Freedman B, Segev I (1998) Ion channel stochasticity may be critical in determining the reliability and precision of spike timing. Neural Comput 10(7):1679–1703
- 11. Scott DW (2009) Sturges' rule. Wiley Interdisciplinary Reviews: Computational Statistics, vol 1, pp 303–306
- 12. Sturges HA (1926) The choice of a class interval. J Am Stat Assoc 21:65–66
- 13. Stinson DR, Sunar B, Martin WJ (2007) A provably secure true random number generator with built-in tolerance to active attacks. IEEE Trans Comput 56(1):109–119
- 14. Tan CH, Wold K (2009) Analysis and enhancement of random number generators in FPGA based on oscillator rings. Int J Reconfig Comput (article ID 501672)
- 15. Tian X, Benkrid K (2009) Mersenne Twister random number generation on FPGA, CPU and GPU. In: NASA/ESA Conference on Adaptive Hardware and Systems (AHS 2009), pp 460–464

## Copyright information

**Open Access** This article is distributed under the terms of the Creative Commons Attribution License, which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.