
An Attention Arousal Space for Mapping Twitter Data

  • Conference paper
  • First Online:
Innovations in Electrical and Electronic Engineering

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 661)


Abstract

Not every trending tweet commands your attention. The overflow of social media data will soon require mechanisms that identify the severity of a situation and decide whether a tweet deserves attention. News about bomb blasts, floods, or epidemic outbreaks affects several sectors and impacts the local economy, and such news is covered by media and government sources, which form meta-information around the tweets. Most algorithms, both traditional and deep learning, can decipher whether an opinion is positive, negative, or neutral, but they cannot allocate attention to that opinion. Going beyond the binary classification of positive and negative sentiments, this article proposes a GloVe Text-CNN algorithm that attaches meta-information to predict the attention accuracy of sentence-level data. Correlations are established among the various parameters of a trending hashtag, and the most important parameter is reported. The primary contribution is a modified neural algorithm that defines an Attention Arousal Space designed to capture five major insights of sentence-level documents by attaching meta-information, namely media score, time decay, retweet and favorite scores, and government source, to the original algorithm. The sentiment classifier used in this research is a Text-CNN with GloVe word embeddings, trained on 2,000,000 sentiment-labelled tweets from SemEval datasets; training accuracy was 98.6% and validation accuracy was 87.13%. Our experiments show improved accuracy over Hierarchical Attention Networks (HAN), which demonstrates the effectiveness of the modified algorithm with meta-information for capturing attention.
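
The classifier described in the abstract follows the standard Text-CNN design with pre-trained GloVe word vectors. The sketch below illustrates that architecture in Keras; the hyperparameters, the three-class output, the frozen embeddings, and the glove.twitter.27B.100d.txt file name are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of a Text-CNN sentiment classifier with pre-trained GloVe
# embeddings. Hyperparameters, file paths, and the three-class output are
# illustrative assumptions, not the authors' exact configuration.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

MAX_LEN, VOCAB_SIZE, EMB_DIM = 50, 20000, 100  # assumed values

def load_glove_matrix(path, word_index, emb_dim=EMB_DIM):
    """Build an embedding matrix from a GloVe .txt file (e.g. glove.twitter.27B.100d.txt)."""
    matrix = np.random.normal(scale=0.1, size=(VOCAB_SIZE, emb_dim))
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            word, vec = parts[0], np.asarray(parts[1:], dtype="float32")
            idx = word_index.get(word)
            if idx is not None and idx < VOCAB_SIZE:
                matrix[idx] = vec
    return matrix

def build_text_cnn(embedding_matrix):
    inputs = layers.Input(shape=(MAX_LEN,), dtype="int32")
    x = layers.Embedding(VOCAB_SIZE, EMB_DIM,
                         weights=[embedding_matrix],
                         trainable=False)(inputs)            # frozen GloVe vectors
    # Parallel convolutions over 3-, 4- and 5-gram windows, max-pooled over time.
    pooled = [layers.GlobalMaxPooling1D()(
                  layers.Conv1D(128, k, activation="relu")(x))
              for k in (3, 4, 5)]
    x = layers.Concatenate()(pooled)
    x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(3, activation="softmax")(x)       # positive / negative / neutral
    return models.Model(inputs, outputs)

# Usage (word_index would come from a tokenizer fitted on the tweet corpus):
# model = build_text_cnn(load_glove_matrix("glove.twitter.27B.100d.txt", word_index))
# model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
#               metrics=["accuracy"])
```

In this sketch the embedding layer is frozen so the GloVe semantics are preserved during training; the paper's meta-information (media score, time decay, retweet and favorite scores, government source) would be combined with the classifier's output in a separate downstream step to form the Attention Arousal Space.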



Author information

Corresponding author: Correspondence to Divya Rajput.


Copyright information

© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Rajput, D., Verma, S. (2021). An Attention Arousal Space for Mapping Twitter Data. In: Favorskaya, M.N., Mekhilef, S., Pandey, R.K., Singh, N. (eds) Innovations in Electrical and Electronic Engineering. Lecture Notes in Electrical Engineering, vol 661. Springer, Singapore. https://doi.org/10.1007/978-981-15-4692-1_29


  • DOI: https://doi.org/10.1007/978-981-15-4692-1_29

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-15-4691-4

  • Online ISBN: 978-981-15-4692-1

  • eBook Packages: Energy, Energy (R0)
