Flip Flop Neural Networks: Modelling Memory for Efficient Forecasting

Conference paper

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 749)

Abstract

Flip flop circuits can memorize information with the help of their bi-stable dynamics. Inspired by the flip flop circuits used in digital electronics, in this work we define a flip flop neuron and construct a neural network endowed with memory. Flip flop neural networks (FFNNs) function like recurrent neural networks (RNNs) and are therefore capable of processing temporal information. To validate the FFNNs' competence in sequential processing, we solve benchmark time series prediction and classification problems from different domains. Three datasets are used for time series prediction: (1) household power consumption, (2) flight passenger prediction and (3) stock price prediction. As an instance of time series classification, we select the indoor movement classification problem. FFNN performance is compared with that of RNNs consisting of long short-term memory (LSTM) units. On all the problems, the FFNNs show either superior or nearly equal performance compared to LSTM. Flip flops may also potentially be used for harder sequential problems, such as action recognition and video understanding.
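
The paper's exact neuron equations are not reproduced on this page. As a rough illustration of the idea only, the sketch below implements a recurrent cell whose state update follows the classical JK flip flop rule Q_next = J(1 - Q) + (1 - K)Q, with the J (set) and K (reset) inputs produced by learned sigmoid gates. All names here (FlipFlopCell, W_j, W_k) and the specific gating are assumptions for illustration, not the authors' definition.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class FlipFlopCell:
        # Hypothetical layer of flip flop neurons: the hidden state is
        # latched via J (set) and K (reset) gates computed from the
        # current input and the previous state, mirroring the digital
        # JK update Q_next = J*(1 - Q) + (1 - K)*Q.
        def __init__(self, input_dim, hidden_dim, seed=0):
            rng = np.random.default_rng(seed)
            scale = 1.0 / np.sqrt(input_dim + hidden_dim)
            self.W_j = rng.normal(0, scale, (hidden_dim, input_dim + hidden_dim))
            self.W_k = rng.normal(0, scale, (hidden_dim, input_dim + hidden_dim))
            self.b_j = np.zeros(hidden_dim)
            self.b_k = np.zeros(hidden_dim)

        def step(self, x, h_prev):
            z = np.concatenate([x, h_prev])
            j = sigmoid(self.W_j @ z + self.b_j)  # set gate
            k = sigmoid(self.W_k @ z + self.b_k)  # reset gate
            # JK update: j drives the state toward 1, k drives it toward 0,
            # and with j = k = 0 the previous state is held (memory).
            return j * (1.0 - h_prev) + (1.0 - k) * h_prev

    # Example: unroll the cell over a toy univariate series.
    cell = FlipFlopCell(input_dim=1, hidden_dim=8)
    h = np.zeros(8)
    for x_t in np.sin(np.linspace(0.0, 6.28, 50)).reshape(-1, 1):
        h = cell.step(x_t, h)

Because the state stays in [0, 1] and is held when both gates are off, such a cell can retain information across time steps, which is the memory property the abstract attributes to bi-stable flip flop dynamics.
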


References

  1. Sherstinsky A (2020) Fundamentals of recurrent neural network (RNN) and long short-term memory (LSTM) network. Phys D Nonlinear Phenomena 404:132306

  2. Santhanam S (2020) Context based text-generation using LSTM networks. arXiv preprint arXiv:2005.00048

  3. Wu W et al (2019) Using gated recurrent unit network to forecast short-term load considering impact of electricity price. Energy Procedia 158:3369–3374

  4. Chakrabarty R et al (2018) A novel design of flip-flop circuits using quantum dot cellular automata (QCA). In: 2018 IEEE 8th annual computing and communication workshop and conference (CCWC). IEEE

  5. Holla P, Chakravarthy S (2016) Decision making with long delays using networks of flip-flop neurons. In: 2016 International joint conference on neural networks (IJCNN), pp 2767–2773

  6. UCI Machine Learning (2016) Household electric power consumption, Version 1, Aug 2016. Retrieved from www.kaggle.com/uciml/electric-power-consumption-data-set/metadata

  7. Andreazzini D (2017) International airline passengers, Version 1, June 2017. Retrieved from www.kaggle.com/andreazzini/international-airline-passengers/metadata

  8. Nandakumar R, Uttamraj KR, Vishal R, Lokeshwari YV (2018) Stock price prediction using long short-term memory. Int Res J Eng Technol (IRJET) 3362–338

  9. Bacciu D, Barsocchi P, Chessa S et al (2014) An experimental characterization of reservoir computing in ambient assisted living applications. Neural Comput Appl 24:1451–1464. https://doi.org/10.1007/s00521-013-1364-4

Author information

Correspondence to V. Srinivasa Chakravarthy.


Copyright information

© 2021 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Sujith Kumar, S., Vigneswaran, C., Srinivasa Chakravarthy, V. (2021). Flip Flop Neural Networks: Modelling Memory for Efficient Forecasting. In: Gopi, E.S. (eds) Machine Learning, Deep Learning and Computational Intelligence for Wireless Communication. Lecture Notes in Electrical Engineering, vol 749. Springer, Singapore. https://doi.org/10.1007/978-981-16-0289-4_13
