
An Efficient Model for Sentiment Analysis of Electronic Product Reviews in Vietnamese

  • Suong N. Hoang
  • Linh V. Nguyen
  • Tai Huynh
  • Vuong T. Pham
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11814)

Abstract

In the past few years, the growth of e-commerce and digital marketing in Vietnam has generated a huge volume of opinionated data. Analyzing these data would provide enterprises with insights for better business decisions. In this work, as part of the Advosights project, we study sentiment analysis of product reviews in Vietnamese. The final solution is based on self-attention neural networks, a flexible architecture for text classification that achieves about \(90.16\%\) accuracy with a very fast inference time of 0.0124 seconds.
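
The paper's full model is not reproduced on this page; as a minimal sketch of the core mechanism the abstract names, the NumPy snippet below shows single-head scaled dot-product self-attention applied to a toy token sequence, with mean pooling for classification. All shapes and names (Wq, Wk, Wv, the 16-dimensional embeddings) are illustrative assumptions, not the authors' implementation.

    # Minimal sketch of scaled dot-product self-attention for illustration only;
    # this is NOT the authors' implementation.
    import numpy as np

    def self_attention(X, Wq, Wk, Wv):
        """Single-head scaled dot-product self-attention.

        X:          (seq_len, d_model) token embeddings for one review
        Wq, Wk, Wv: (d_model, d_k) learned projection matrices
        Returns:    (seq_len, d_k) context-aware token representations.
        """
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        scores = Q @ K.T / np.sqrt(K.shape[-1])            # pairwise token affinities
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
        return weights @ V

    # Toy usage: 5 tokens, 16-dim embeddings, one 8-dim attention head.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 16))
    Wq, Wk, Wv = (rng.normal(size=(16, 8)) for _ in range(3))
    out = self_attention(X, Wq, Wk, Wv)                    # shape (5, 8)
    # For sentiment classification, pool over tokens; a linear layer plus
    # softmax on top of this vector would produce class probabilities.
    pooled = out.mean(axis=0)

Because attention computes all pairwise token interactions in parallel rather than sequentially as in recurrent networks, inference is fast, which is consistent with the 0.0124-second inference time reported in the abstract.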

Keywords

Vietnamese · Sentiment analysis · Electronic product review

Acknowledgment

We thank our teammates, Tran A. Sang, Cao T. Thanh, and Ha H. Huy for helpful discussions and support.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Kyanon Digital, Ho Chi Minh City, Vietnam
  2. Advosights, Ho Chi Minh City, Vietnam
  3. Saigon University, Ho Chi Minh City, Vietnam