Abstract
Flip flop circuits can store information by virtue of their bistable dynamics. Inspired by the flip flop circuits used in digital electronics, in this work we define a flip flop neuron and construct a neural network endowed with memory. Flip flop neural networks (FFNNs) function like recurrent neural networks (RNNs) and are therefore capable of processing temporal information. To validate the competence of FFNNs on sequential processing, we solve benchmark time series prediction and classification problems from different domains. Three datasets are used for time series prediction: (1) household power consumption, (2) flight passenger numbers and (3) stock prices. As an instance of time series classification, we select the indoor movement classification problem. The performance of FFNNs is compared with that of RNNs consisting of long short-term memory (LSTM) units. On all the problems, the FFNNs show either superior or near-equal performance compared to LSTMs. Flip flops could potentially also be applied to harder sequential problems, such as action recognition and video understanding.
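Since the abstract does not spell out the flip flop neuron's update equations, the following is only a minimal sketch of how a bistable, JK-flip-flop-inspired recurrent cell might be unrolled over a sequence, in the spirit of the architecture described above. The class name FlipFlopCell, the learned J (set) and K (reset) gates, and the continuous relaxation of the JK truth table are illustrative assumptions, not the authors' definition.

```python
import torch
import torch.nn as nn


class FlipFlopCell(nn.Module):
    """Hypothetical flip-flop recurrent cell (illustrative sketch only).

    Assumes a JK-flip-flop-inspired update: learned "set" (J) and
    "reset" (K) gates drive a bistable hidden state, loosely analogous
    to how an LSTM's gates drive its cell state. This is NOT the
    authors' exact formulation, which is given in the full paper.
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.j_gate = nn.Linear(input_size + hidden_size, hidden_size)  # "set"
        self.k_gate = nn.Linear(input_size + hidden_size, hidden_size)  # "reset"

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        z = torch.cat([x, h], dim=-1)
        j = torch.sigmoid(self.j_gate(z))  # drives the state toward 1 ("set")
        k = torch.sigmoid(self.k_gate(z))  # drives the state toward 0 ("reset")
        # Continuous JK update, Q_next = J*(1-Q) + (1-K)*Q:
        # set where J is high, clear where K is high, otherwise hold.
        return j * (1.0 - h) + (1.0 - k) * h


if __name__ == "__main__":
    # Unroll the cell over a toy sequence, as one would an LSTM cell.
    cell = FlipFlopCell(input_size=4, hidden_size=8)
    x_seq = torch.randn(16, 10, 4)  # (batch, time, features)
    h = torch.zeros(16, 8)
    for t in range(x_seq.size(1)):
        h = cell(x_seq[:, t, :], h)
    print(h.shape)  # torch.Size([16, 8])
```

The point of the sketch is the "hold" behaviour: where both J and K are low, the previous state passes through unchanged, which is the bistable memory property the abstract attributes to flip flop neurons.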