Long Short-Term Attention

  • Conference paper
  • First Online:
Advances in Brain Inspired Cognitive Systems (BICS 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11691)

Abstract

Attention is an important cognitive process in humans, helping them concentrate on critical information during perception and learning. However, although many machine learning models can remember information from data, they lack an attention mechanism. For example, the long short-term memory (LSTM) network can remember sequential information, but it cannot pay special attention to parts of a sequence. In this paper, we present a novel model called long short-term attention (LSTA), which seamlessly integrates the attention mechanism into the inner cell of LSTM. Beyond capturing long short-term dependencies, LSTA can focus on the important parts of a sequence through its attention mechanism. Extensive experiments demonstrate that LSTA outperforms LSTM and related models on sequence learning tasks.
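
The abstract describes LSTA as embedding attention inside the LSTM cell itself, rather than stacking an attention layer on top of the LSTM outputs. The paper's actual LSTA equations are not reproduced on this page, so the following is only a minimal sketch of that general idea, assuming a standard LSTM cell whose gates are conditioned on a soft-attention summary of the hidden states produced so far. The PyTorch framework, the class name AttentiveLSTMCell, and the particular way the attended context enters the gates are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' LSTA): an LSTM-style cell whose update
# attends over previously produced hidden states before gating.
# Assumes PyTorch; all names here are illustrative.
import torch
import torch.nn as nn


class AttentiveLSTMCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Joint projection for the four LSTM gates (input, forget, cell, output).
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        # Bilinear-style scoring of past hidden states against the current one.
        self.attn_score = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(self, x, h_prev, c_prev, history):
        # history: (batch, t, hidden) hidden states seen so far; soft attention
        # summarises them into a single context vector.
        if history.size(1) > 0:
            scores = torch.bmm(self.attn_score(history),
                               h_prev.unsqueeze(2)).squeeze(2)   # (batch, t)
            alpha = torch.softmax(scores, dim=1)                 # attention weights
            context = torch.bmm(alpha.unsqueeze(1), history).squeeze(1)
        else:
            context = torch.zeros_like(h_prev)

        # Standard LSTM gating, but conditioned on the attended context
        # in addition to the raw previous hidden state.
        z = self.gates(torch.cat([x, h_prev + context], dim=1))
        i, f, g, o = z.chunk(4, dim=1)
        c = torch.sigmoid(f) * c_prev + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c


# Tiny usage example over a random sequence.
if __name__ == "__main__":
    batch, seq_len, d_in, d_hid = 2, 5, 8, 16
    cell = AttentiveLSTMCell(d_in, d_hid)
    xs = torch.randn(batch, seq_len, d_in)
    h = torch.zeros(batch, d_hid)
    c = torch.zeros(batch, d_hid)
    history = torch.zeros(batch, 0, d_hid)
    for t in range(seq_len):
        h, c = cell(xs[:, t], h, c, history)
        history = torch.cat([history, h.unsqueeze(1)], dim=1)
    print(h.shape)  # torch.Size([2, 16])
```

In this sketch the attention weights are recomputed at every time step over all earlier hidden states, which is what distinguishes the cell from a plain LSTM; the authors' LSTA may couple attention to the cell state differently.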

Acknowledgment

This work was supported by the National Key R&D Program of China under Grant No. 2016YFC1401004, the National Natural Science Foundation of China (NSFC) under Grant Nos. 41706010 and 61876155, the Science and Technology Program of Qingdao under Grant No. 17-3-3-20-nsh, the CERNET Innovation Project under Grant No. NGII20170416, and the Fundamental Research Funds for the Central Universities of China.

Author information

Corresponding author

Correspondence to Guoqiang Zhong.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Zhong, G., Lin, X., Chen, K., Li, Q., Huang, K. (2020). Long Short-Term Attention. In: Ren, J., et al. (eds.) Advances in Brain Inspired Cognitive Systems. BICS 2019. Lecture Notes in Computer Science, vol. 11691. Springer, Cham. https://doi.org/10.1007/978-3-030-39431-8_5

  • DOI: https://doi.org/10.1007/978-3-030-39431-8_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-39430-1

  • Online ISBN: 978-3-030-39431-8

  • eBook Packages: Computer Science, Computer Science (R0)
