Goertzel B. Artificial general intelligence: concept, state of the art, and future prospects. J Artif General Intell, 2014, 5: 1–48
Pei J, Deng L, Song S, et al. Towards artificial general intelligence with hybrid Tianjic chip architecture. Nature, 2019, 572: 106–111
Graves A, Wayne G, Danihelka I. Neural Turing machines. 2014. ArXiv:1410.5401
Sutton R S, Barto A G. Reinforcement Learning: An Introduction. Cambridge: MIT Press, 2018
Pierrot T, Ligner G, Reed S, et al. Learning compositional neural programs with recursive tree search and planning. 2019. ArXiv:1905.12941
Liang D C, Indiveri G. A neuromorphic computational primitive for robust context-dependent decision making and context-dependent stochastic computation. IEEE Trans Circ Syst II, 2019, 66: 843–847
Liang D C, Indiveri G. Robust state-dependent computation in neuromorphic electronic systems. In: Proceedings of IEEE Biomedical Circuits and Systems Conference (BioCAS), 2017
Rutishauser U, Douglas R J. State-dependent computation using coupled recurrent networks. Neural Comput, 2009, 21: 478–509
Neftci E, Binas J, Rutishauser U, et al. Synthesizing cognition in neuromorphic electronic systems. Proc Natl Acad Sci USA, 2013, 110: 3468–3476
Graves A, Wayne G, Reynolds M, et al. Hybrid computing using a neural network with dynamic external memory. Nature, 2016, 538: 471–476
Kaiser L, Gomez A N, Shazeer N, et al. One model to learn them all. 2017. ArXiv:1706.05137
Giles C L, Miller C B, Chen D, et al. Learning and extracting finite state automata with second-order recurrent neural networks. Neural Comput, 1992, 4: 393–405
Arai K, Nakano R. Stable behavior in a recurrent neural network for a finite state machine. Neural Netw, 2000, 13: 667–680
Wennekers T. Synfire graphs: from spike patterns to automata of spiking neurons. 2013. https://oparu.uni-ulm.de/xmlui/handle/123456789/2524
Wennekers T, Ay N. Finite state automata resulting from temporal information maximization and a temporal learning rule. Neural Comput, 2005, 17: 2258–2290
Minsky M L. Computation: Finite and Infinite Machines. Englewood Cliffs: Prentice-Hall, 1967
Horne B G, Hush D R. Bounds on the complexity of recurrent neural network implementations of finite state machines. Neural Netw, 1996, 9: 243–252
Forcada M L, Carrasco R C. Finite-state computation in analog neural networks: steps towards biologically plausible models? In: Emergent Neural Computational Architectures, LNAI 2036, 2001. 480–493
Tvardovskii A S, Vinarskii E M, Yevtushenko N V. Experimental evaluation of timed finite state machine based test derivation. In: Proceedings of the 20th International Conference of Young Specialists on Micro/Nanotechnologies and Electron Devices (EDM), 2019. 102–107
Laputenko A V. Logic circuit based test derivation for microcontrollers. In: Proceedings of the 20th International Conference of Young Specialists on Micro/Nanotechnologies and Electron Devices (EDM), 2019. 70–73
Mavridou A, Laszka A. Designing secure ethereum smart contracts: a finite state machine based approach. In: Proceedings of International Conference on Financial Cryptography and Data Security, 2018. 523–540
Le L H, Bezerra C E, Pedone F. Dynamic scalable state machine replication. In: Proceedings of the 46th Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN), 2016. 13–24
Wang J, Song J W, Chen M Q, et al. Road network extraction: a neural-dynamic framework based on deep learning and a finite state machine. Int J Remote Sens, 2015, 36: 3144–3169
Said W, Quante J, Koschke R. Towards interactive mining of understandable state machine models from embedded software. In: Proceedings of International Conference on Model-driven Engineering & Software Development, 2018. 117–128
Chen M Z, Saad W, Yin C C. Liquid state machine learning for resource and cache management in LTE-U unmanned aerial vehicle (UAV) networks. IEEE Trans Wirel Commun, 2019, 18: 1504–1517
Zhang Y, Li P, Jin Y, et al. A digital liquid state machine with biologically inspired learning and its application to speech recognition. IEEE Trans Neural Netw Learn Syst, 2015, 26: 2635–2649
Smith M R, Hill A J, Carlson K D, et al. A novel digital neuromorphic architecture efficiently facilitating complex synaptic response functions applied to liquid state machines. In: Proceedings of International Joint Conference on Neural Networks (IJCNN), 2017. 2421–2428
Ghasemiyeh R, Moghdani R, Sana S S. A hybrid artificial neural network with metaheuristic algorithms for predicting stock price. Cybern Syst, 2017, 48: 365–392
Hudson D A, Manning C D. Learning by abstraction: the neural state machine. 2019. ArXiv:1907.03950
Takami M A, Sheikh R, Sana S S. Product portfolio optimisation using teaching-learning-based optimisation algorithm: a new approach in supply chain management. Int J Syst Sci, 2016, 3: 236–246
Ameri Z, Sana S S, Sheikh R. Self-assessment of parallel network systems with intuitionistic fuzzy data: a case study. Soft Comput, 2019, 23: 12821–12832
Deng L, Wu Y J, Hu X, et al. Rethinking the performance comparison between SNNs and ANNs. Neural Netw, 2020, 121: 294–307
Beers S R, Rosenberg D R, Dick E L, et al. Neuropsychological study of frontal lobe function in psychotropic-naive children with obsessive-compulsive disorder. Am J Psychiat, 1999, 156: 777–779
Mayer H, Perkins D. Towers of Hanoi revisited: a nonrecursive surprise. SIGPLAN Not, 1984, 19: 80–84
Gonzalez W G, Zhang H, Harutyunyan A, et al. Persistence of neuronal representations through time and damage in the hippocampus. Science, 2019, 365: 821–825
Deng B L, Li G, Han S, et al. Model compression and hardware acceleration for neural networks: a comprehensive survey. Proc IEEE, 2020, 108: 485–532