Generating context-free languages using spiking neural P systems with structural plasticity

  • Regular Paper
Journal of Membrane Computing

Abstract

A spiking neural P system (SNP system) is a model of computation inspired by networks of spiking neurons: a network of neurons that send an object, known as a spike, to one another. A spiking neural P system with structural plasticity (SNPSP system) is a variant of the classical SNP system that incorporates synaptogenesis (creation of new synapses) and synaptic pruning (deletion of existing synapses), collectively known as structural plasticity, as features of the model. This gives SNPSP systems the ability to change their own structure/topology. In this work, we use SNPSP systems to generate context-free languages. We give a procedure for constructing an SNPSP system from a context-free grammar in Greibach normal form (GNF). The resulting SNPSP system essentially simulates the way in which a context-free grammar in GNF generates a language. We use modules known as arithmetic-memory modules, also built from SNPSP systems, to perform the arithmetic operations needed for the simulation.



Acknowledgements

R.T.A. de la Cruz is supported by a graduate scholarship from the DOST-ERDT project. F.G. Cabarle thanks the support from the DOST-ERDT project; the Dean Ruben A. Garcia PCA AY2018–2019, and an RLC AY2018–2019 grant of the OVCRD, both from UP Diliman. H. Adorna thanks the support granted by the UPD-OVCRD RCL grant, the ERDT Research Program of the College of Engineering, UP Diliman, and the Semirara Mining Corporation Professorial Chair for Computer Science. N. Hernandez is supported by the Vea Technology for All professorial chair. The work of X. Zeng was supported by the National Natural Science Foundation of China (Grant Nos. 61472333, 61772441, 61472335, 61672033, 61425002, 61872309, 61771331), the Project of Marine Economic Innovation and Development in Xiamen (No. 16PFW034SF02), and the Natural Science Foundation of the Higher Education Institutions of Fujian Province (No. JZ160400).

Author information


Corresponding author

Correspondence to Ren Tristan A. de la Cruz.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

A Appendix

1.1 A.1 String generation—example

Example

\(G = (N=\{A,B\}, T=\{a,b\}, A, P)\) where P contains the following rules: \(R_1: A \rightarrow aAB\), \(R_2: B \rightarrow bAB\), \(R_3: A \rightarrow a\), \(R_4: B \rightarrow b\).

The table below shows how the word “aabab” is generated by the algorithm using grammar G.

In iteration 1, \({\overline{N}}=A\), and the popped top symbol is A, which means rules \(R_1, R_3\) can be applied. After popping the top symbol, the stack is \({\overline{N}}=\lambda\). \(R_1\) is non-deterministically selected and applied. Applying \(R_1\) outputs the terminal symbol a and pushes the symbol B, and then the symbol A, onto the stack, resulting in \({\overline{N}}=BA\). The remaining iterations are shown in Table 1. Each iteration performs Steps 1 and 2 of the algorithm and outputs one terminal symbol. Reading down the 'Output Symbol' column of Table 1, one can see that the word "aabab" is generated by the algorithm.

Table 1 String production for aabab
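The stack algorithm of this example can be sketched in a few lines of Python. This is an illustrative sketch only, not the paper's SNPSP construction; the names `RULES` and `generate` are our own, and the rule sequence is supplied explicitly so the non-deterministic choice of the algorithm becomes reproducible.

```python
# Grammar G in GNF: R1: A -> aAB, R2: B -> bAB, R3: A -> a, R4: B -> b.
# Each rule is (left-hand side, emitted terminal, non-terminal suffix).
RULES = {
    'R1': ('A', 'a', 'AB'),
    'R2': ('B', 'b', 'AB'),
    'R3': ('A', 'a', ''),
    'R4': ('B', 'b', ''),
}

def generate(choices, axiom='A'):
    """Run the stack algorithm with a fixed sequence of rule choices."""
    stack = [axiom]            # stack string N-bar; the top is the last element
    output = []
    for name in choices:
        lhs, terminal, nonterms = RULES[name]
        top = stack.pop()                   # Step 1: pop the top non-terminal
        assert top == lhs, f"rule {name} not applicable to {top}"
        output.append(terminal)             # Step 2: emit the terminal symbol
        stack.extend(reversed(nonterms))    # push so the left-most symbol is on top
    assert not stack, "derivation incomplete"
    return ''.join(output)

print(generate(['R1', 'R3', 'R2', 'R3', 'R4']))  # -> aabab
```

The rule sequence R1, R3, R2, R3, R4 reproduces the derivation of "aabab" traced in Table 1.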

1.2 A.2 Encoding of the stack string—example

\(N = \{n_1=A, n_2=B, n_3=C\}\), \(x=|N|=3\), \(\mathrm{{val}}_3(n_i)=i\) for \(1 \le i \le 3\).
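The encoding \(\mathrm{{val}}_x\) can be sketched as a Horner-style evaluation in base \(x+1\); the function name `val` and its signature are illustrative, not from the paper.

```python
def val(s, symbols):
    """Encode string s over the ordered alphabet `symbols` as a base-(x+1)
    number, where x = len(symbols) and symbol n_i maps to digit i (1-based)."""
    x = len(symbols)
    digit = {n: i + 1 for i, n in enumerate(symbols)}
    result = 0
    for sym in s:
        result = result * (x + 1) + digit[sym]   # shift left one digit, append
    return result

# With N = {A, B, C} and x = 3: ABC encodes to 123 in base 4, i.e. 27.
print(val('ABC', 'ABC'))  # -> 27
```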

1.3 A.3 Arithmetic version of push operation (or string concatenation)—example

We use the strings in Table 2. The size of the alphabet N is \(x=3\); string ABC is encoded as \(123_4\) (in base \(x+1=4\)) and string AA is encoded as \(11_4\). If we let \({\overline{N}} = ABC\) and \(N'=AA\), then \({\overline{N}}N'=ABCAA\). If we encode \({\overline{N}}N'=ABCAA\) as described in Eq. 2, then (note: \(x+1=4_{10}\) or \(x+1=10_4\))

$$\begin{aligned} \mathrm{{val}}_3(ABCAA)_{10} &= \mathrm{{val}}_3(A)\cdot 4^4 + \mathrm{{val}}_3(B)\cdot 4^3 + \mathrm{{val}}_3(C)\cdot 4^2 + \mathrm{{val}}_3(A)\cdot 4^1 + \mathrm{{val}}_3(A)\cdot 4^0 \\ &= 1\cdot 4^4 + 2\cdot 4^3 + 3\cdot 4^2 + 1\cdot 4^1 + 1\cdot 4^0 = 437_{10} \end{aligned}$$

The encoding \(\mathrm{{val}}_3(ABCAA)_{10}=437_{10}\) is calculated in base 10; in base \(x+1=4\), it is written as \(12311_4\). It is easier to visualize how the encoding works when the result is written in base \(x+1\), since every symbol in the string corresponds to one 'digit' of the encoding in that base. In base \(x+1=4\), the encoding \(\mathrm{{val}}_3(ABCAA)\) is the concatenation of the digits of \(\mathrm{{val}}_3(ABC)=123_4\) and the digits of \(\mathrm{{val}}_3(AA)=11_4\), which is \(12311_4\). But since concatenation is an operation on strings and the encodings are numbers, we use Eq. 7 for \(p'_1\) to define this digit-concatenation process arithmetically. Using \(p'_1\), \(\mathrm{{val}}_3(ABCAA)\) can be calculated as \(p'_1(\mathrm{{val}}_3(ABC), \mathrm{{val}}_3(AA))\):

$$\begin{aligned} p'_1(\mathrm{{val}}_3(ABC), \mathrm{{val}}_3(AA)) &= \mathrm{{val}}_3(ABC)\cdot (10_4)^{|AA|} + \mathrm{{val}}_3(AA) \\ &= 123_4 \cdot (10_4)^{2} + 11_4 = 12300_4 + 11_4 = 12311_4 \end{aligned}$$
Table 2 String encoding examples
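The arithmetic push \(p'_1\) is a one-line shift-and-add; the following sketch (function name `p1` is ours) reproduces the calculation above using the base-10 values, assuming \(x=3\) so the base is 4.

```python
def p1(u, v, v_len, x=3):
    """Arithmetic push / string concatenation p'_1 (Eq. 7 style):
    shift u left by |v| digits in base x+1, then add v."""
    return u * (x + 1) ** v_len + v

# val_3(ABC) = 27 (123 in base 4), val_3(AA) = 5 (11 in base 4), |AA| = 2:
# 27 * 4^2 + 5 = 437, i.e. 12311 in base 4 -- the digit concatenation 123 | 11.
print(p1(27, 5, 2))  # -> 437
```

Note that the length \(|N'|\) of the pushed string must be supplied separately, since leading digits cannot be recovered from the numeric value alone (e.g. A and AA would otherwise be ambiguous if A mapped to digit 0; the 1-based digit assignment avoids zero digits entirely).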

1.4 A.4 String generation using arithmetic version of stack operations—example

Let \(G = (N=\{A,B,C\}, T=\{a,b,c\}, A, P)\) where P contains the following rules: \(R_1: A \rightarrow aABB\), \(R_2: B \rightarrow bBBC\), \(R_3: A \rightarrow a\), \(R_4: B \rightarrow b\), \(R_5: C \rightarrow c\).

The string aabbbcb is generated by grammar G as follows:

\(A \xrightarrow []{R_1} aABB \xrightarrow []{R_3} aaBB \xrightarrow []{R_2} aabBBCB \xrightarrow []{R_4} aabbBCB \xrightarrow []{R_4} aabbbCB \xrightarrow []{R_5} aabbbcB \xrightarrow []{R_4} aabbbcb\)

Starting from the axiom symbol A, at each step one applicable rule is non-deterministically selected and applied to the first (left-most) non-terminal symbol; only one rule is applied at a time. Since grammar G is in Greibach normal form, each rule application generates exactly one terminal symbol, so the string is generated from left to right.
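The left-most derivation above can be sketched directly as string rewriting; the names `RULES` and `derive` are illustrative, and the rule sequence is passed in explicitly to make the non-deterministic choices reproducible.

```python
# Grammar G of this example: R1: A -> aABB, R2: B -> bBBC,
# R3: A -> a, R4: B -> b, R5: C -> c.
RULES = {'R1': ('A', 'aABB'), 'R2': ('B', 'bBBC'),
         'R3': ('A', 'a'), 'R4': ('B', 'b'), 'R5': ('C', 'c')}

def derive(choices, axiom='A'):
    """Apply each chosen rule to the left-most non-terminal (uppercase) symbol."""
    s = axiom
    for name in choices:
        lhs, rhs = RULES[name]
        i = next(j for j, c in enumerate(s) if c.isupper())  # left-most non-terminal
        assert s[i] == lhs, f"rule {name} not applicable at position {i}"
        s = s[:i] + rhs + s[i + 1:]
    return s

print(derive(['R1', 'R3', 'R2', 'R4', 'R4', 'R5', 'R4']))  # -> aabbbcb
```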

We note a difference in conventions between the string generation above and string generation using the stack algorithm. In the derivation above, for example in the step \(aaBB \xrightarrow []{R_2} aabBBCB\), terminal symbols are generated from left to right by applying each rule to the left-most non-terminal symbol, which in this step is B. When \(R_2\) is applied to B, the terminal symbol b is generated along with the non-terminal symbols BBC. In the stack algorithm, the stack string \({\overline{N}}\) stores the non-terminal symbols, with the top of the stack being the right-most symbol of \({\overline{N}}\). In this step, \({\overline{N}}=BB\) before the rule is applied; when \(R_2\) is applied, the top symbol B is popped, leaving \({\overline{N}}=B\), and the string CBB is pushed onto the stack, making \({\overline{N}}=BCBB\). The stack algorithm thus generates the string from left to right just like the normal derivation; the only difference is the convention for writing the stack string: the left-most non-terminal symbol in the derived string corresponds to the top (right-most) symbol of the stack.

With this convention, when the stack string is encoded as a number, the top symbol of the stack becomes the least significant digit of the encoding. For example, for stack string \({\overline{N}}=ABCABC\), \(\mathrm{{val}}_3(ABCABC)=123123_4\); the top symbol of the stack is C, and its encoding is the least significant digit \(3_4\). The least significant digit of \(\mathrm{{val}}_x({\overline{N}})\) is easier to compute than its most significant digit.

Table 3 shows the seven iterations of the algorithm when generating the string aabbbcb. The third column shows the encoding \(\mathrm{{val}}_3({\overline{N}})\), in base \(x+1=4\), of the stack string \({\overline{N}}\). The value \(p'_2(\mathrm{{val}}_3({\overline{N}}))\) is the least significant digit of \(\mathrm{{val}}_3({\overline{N}})\), which corresponds to the top symbol of the stack. The value \(p'_3(\mathrm{{val}}_3({\overline{N}}))\) is the encoding of the new stack after the top symbol is removed. \(N'\) is the reversed string of non-terminal symbols of a production rule; e.g., for \(R_1: A \rightarrow aABB\), \(N'=BBA\), and for \(R_2: B \rightarrow bBBC\), \(N'=CBB\). \(N'\) is reversed because of the convention used in the stack algorithm (the right-most symbol, the top of the stack, is processed first). \(\mathrm{{val}}_3(N')\), the encoding of \(N'\), is pushed, using \(p'_1\), onto the new stack \(p'_3(\mathrm{{val}}_3({\overline{N}}))\); the resulting stack is \(p'_1( p'_3(\mathrm{{val}}_3({\overline{N}})),\mathrm{{val}}_3(N'))\). Column 6 shows the terminal symbol generated in each iteration.

Table 3 String generation for aabbbcb
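The whole arithmetic simulation of Table 3 can be sketched by composing \(p'_1\), \(p'_2\), and \(p'_3\) on the numeric stack encoding. This is an illustrative sketch, not the SNPSP construction itself; the names `val`, `p1`, `p2`, `p3`, and `generate` are ours, and the rule sequence is fixed to make the run reproducible.

```python
BASE = 4                       # x + 1, with x = |N| = 3
DIGIT = {'A': 1, 'B': 2, 'C': 3}

def val(s):
    """Encode a string over N as a base-(x+1) number (as in Eq. 2)."""
    n = 0
    for sym in s:
        n = n * BASE + DIGIT[sym]
    return n

def p1(u, v, v_len):           # arithmetic push: append the |v| digits of v to u
    return u * BASE ** v_len + v

def p2(n):                     # least significant digit = top-of-stack symbol
    return n % BASE

def p3(n):                     # drop the least significant digit = pop
    return n // BASE

# Rules of grammar G from A.4; N' is the REVERSED non-terminal suffix,
# per the stack convention (top of stack = right-most symbol).
RULES = {'R1': ('A', 'a', 'BBA'), 'R2': ('B', 'b', 'CBB'),
         'R3': ('A', 'a', ''), 'R4': ('B', 'b', ''), 'R5': ('C', 'c', '')}

def generate(choices, axiom='A'):
    stack = val(axiom)                       # numeric stack encoding
    out = []
    for name in choices:
        lhs, terminal, nprime = RULES[name]
        assert p2(stack) == DIGIT[lhs]       # check the top symbol matches
        out.append(terminal)
        stack = p1(p3(stack), val(nprime), len(nprime))  # pop, then push N'
    assert stack == 0                        # empty stack: derivation complete
    return ''.join(out)

print(generate(['R1', 'R3', 'R2', 'R4', 'R4', 'R5', 'R4']))  # -> aabbbcb
```

After the third iteration (rule \(R_2\)) the stack value is 186, i.e. \(2322_4\), matching the stack string \({\overline{N}}=BCBB\) in the worked example above.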


About this article


Cite this article

de la Cruz, R.T.A., Cabarle, F.G. & Adorna, H.N. Generating context-free languages using spiking neural P systems with structural plasticity. J Membr Comput 1, 161–177 (2019). https://doi.org/10.1007/s41965-019-00021-2
