Deep Learning in Sentiment Analysis

  • Chapter
  • Part of the book Deep Learning in Natural Language Processing

Abstract

Sentiment analysis (also known as opinion mining) is an active research area in natural language processing. The task aims at identifying, extracting, and organizing sentiments from user-generated texts in social networks, blogs, and product reviews. Over the past two decades, many studies in the literature have exploited machine learning approaches to solve sentiment analysis tasks from different perspectives. Since the performance of a machine learner depends heavily on the choice of data representation, many studies have been devoted to building powerful feature extractors with domain expertise and careful engineering. Recently, deep learning approaches have emerged as powerful computational models that discover intricate semantic representations of texts automatically from data, without feature engineering. These approaches have improved the state of the art in many sentiment analysis tasks, including sentiment classification, opinion extraction, and fine-grained sentiment analysis. In this chapter, we give an overview of successful deep learning approaches for sentiment analysis tasks at different levels.
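To make the kind of model surveyed here concrete, the sketch below is an illustrative toy example rather than any specific system from the chapter; the vocabulary, data, and hyperparameters are assumptions. It builds one of the simplest neural sentiment classifiers: word embeddings are averaged into a sentence vector and fed to a softmax layer, and the whole model is trained end to end without hand-crafted features.

```python
# Minimal sketch: sentence-level sentiment classification with averaged
# word embeddings and a linear softmax layer (toy data, PyTorch).
import torch
import torch.nn as nn

class AvgEmbeddingClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=50, num_classes=2):
        super().__init__()
        # EmbeddingBag with mode="mean" averages the word vectors of a sentence.
        self.embed = nn.EmbeddingBag(vocab_size, embed_dim, mode="mean")
        self.out = nn.Linear(embed_dim, num_classes)

    def forward(self, token_ids, offsets):
        sentence_vec = self.embed(token_ids, offsets)  # one vector per sentence
        return self.out(sentence_vec)                  # class logits

# Toy vocabulary and two labelled sentences (0 = negative, 1 = positive).
vocab = {"<unk>": 0, "this": 1, "movie": 2, "is": 3, "great": 4, "terrible": 5}
sents = [["this", "movie", "is", "great"], ["this", "movie", "is", "terrible"]]
labels = torch.tensor([1, 0])

token_ids = torch.tensor([vocab.get(w, 0) for s in sents for w in s])
offsets = torch.tensor([0, len(sents[0])])  # start index of each sentence

model = AvgEmbeddingClassifier(len(vocab))
optimizer = torch.optim.Adam(model.parameters(), lr=0.05)
loss_fn = nn.CrossEntropyLoss()

for _ in range(50):  # a few gradient steps on the toy data
    optimizer.zero_grad()
    loss = loss_fn(model(token_ids, offsets), labels)
    loss.backward()
    optimizer.step()

print(model(token_ids, offsets).argmax(dim=1))  # predicted labels for the two sentences
```

The systems surveyed in the chapter replace the simple averaging step with convolutional, recurrent, or recursive composition functions, but the end-to-end training pattern is the same.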


Notes

  1. https://twitter.com/.
  2. https://www.facebook.com.
  3. http://www.imdb.com/.
  4. https://www.amazon.com/.
  5. https://www.yelp.com/.
  6. https://code.google.com/p/word2vec/.
  7. In practice, it is time-consuming to obtain document-level sentiment labels via human annotation. Researchers typically leverage review documents from IMDB, Amazon, and Yelp, and regard the associated rating stars as the sentiment labels; a minimal sketch of this convention follows the list.
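As note 7 suggests, rating stars typically stand in for document-level sentiment labels. The sketch below illustrates one common convention; the exact thresholds are an assumption for illustration, not a rule fixed by the chapter: low-star reviews become negative examples, high-star reviews become positive examples, and middle ratings are discarded as ambiguous.

```python
# Hedged sketch of the labelling convention in note 7: map review rating
# stars (1-5) to polarity labels; the cut-off values are an assumption.
def stars_to_label(stars):
    if stars <= 2:
        return "negative"
    if stars >= 4:
        return "positive"
    return None  # 3-star reviews are ambiguous and typically dropped

reviews = [
    ("Loved it, would watch again.", 5),
    ("Average at best.", 3),
    ("A complete waste of time.", 1),
]

labelled = [(text, stars_to_label(stars))
            for text, stars in reviews
            if stars_to_label(stars) is not None]
print(labelled)  # two labelled documents; the 3-star review is filtered out
```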


Author information

Corresponding author

Correspondence to Duyu Tang.



Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this chapter


Cite this chapter

Tang, D., Zhang, M. (2018). Deep Learning in Sentiment Analysis. In: Deng, L., Liu, Y. (eds) Deep Learning in Natural Language Processing. Springer, Singapore. https://doi.org/10.1007/978-981-10-5209-5_8

  • DOI: https://doi.org/10.1007/978-981-10-5209-5_8

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-5208-8

  • Online ISBN: 978-981-10-5209-5

  • eBook Packages: Computer Science, Computer Science (R0)
