
Cognitive Insights into Sentic Spaces Using Principal Paths

Published in Cognitive Computation 11, 656–675 (2019).

Abstract

The availability of an effective embedding to represent textual information is important in commonsense reasoning, yet assessing the quality of an embedding is challenging. In most approaches, embeddings are built from statistical properties of the data that are not directly interpretable by a human user, and purely numerical methods can be inconsistent with the target problem from a cognitive viewpoint. This paper addresses the issue by developing a protocol for evaluating the coherence between an embedding space and a given cognitive model. The protocol uses the recently introduced notion of principal path, which supports the exploration of a high-dimensional space, and provides a qualitative measure of concept distributions in a graphical format that allows the embedding properties to be analyzed. As a consequence, the tool mitigates the black-box effect typical of automatic inference processes. The experimental section characterizes AffectiveSpace, demonstrating that the proposed approach can be used to describe embeddings; the reference cognitive model is the hourglass model of emotions.
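
For readers who want to experiment with the idea, the following sketch traces a principal path between two concept embeddings. It is a minimal, self-contained approximation of the regularized k-means scheme underlying principal paths (Ferrarotti, Rocchia, and Decherchi, "Finding principal paths in data space", IEEE TNNLS, 2018), not the authors' implementation: the function name, the toy data, and all hyperparameter values are illustrative assumptions.

# Minimal sketch of a principal-path search between two fixed concept embeddings.
# Loosely follows the regularized k-means idea behind principal paths; names,
# toy data, and hyperparameters are illustrative assumptions, not the paper's code.
import numpy as np

def principal_path(X, start, end, n_waypoints=10, s=1.0, n_iter=50):
    """Fit a chain of waypoints running from `start` to `end` through the data X.

    X           : (n_samples, n_dims) embedding matrix (e.g. AffectiveSpace vectors)
    start, end  : (n_dims,) fixed boundary concepts
    n_waypoints : number of free waypoints along the path
    s           : regularization weight penalizing path length
    """
    # Initialize the free waypoints on the straight segment between the endpoints.
    t = np.linspace(0.0, 1.0, n_waypoints + 2)[1:-1, None]
    W = (1.0 - t) * start + t * end                       # (n_waypoints, n_dims)

    for _ in range(n_iter):
        # Assignment step: each sample joins its nearest waypoint (plain k-means step).
        d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)

        # Update step: each waypoint moves to its cluster mean, pulled toward its
        # chain neighbours; the pull is what keeps the waypoints ordered along a path.
        chain = np.vstack([start, W, end])
        for i in range(n_waypoints):
            members = X[labels == i]
            neighbours = chain[i] + chain[i + 2]          # previous + next waypoint
            W[i] = (members.sum(axis=0) + s * neighbours) / (len(members) + 2.0 * s)
    return W

# Toy usage: a path between two "concepts" in a random 100-dimensional space.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 100))
path = principal_path(X, X[0], X[1])
print(path.shape)                                         # (10, 100): one row per waypoint

In the setting of the paper, X would be the AffectiveSpace matrix and the two endpoints would be affective concepts of interest; the chain of waypoints returned by such a procedure can then be inspected against the levels of a cognitive model such as the hourglass of emotions.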





Author information

Correspondence to Edoardo Ragusa.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Informed Consent

Informed consent was not required as no human or animal subjects were involved.

Human and Animal Rights

This article does not contain any studies with human or animal subjects performed by any of the authors.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Ragusa, E., Gastaldo, P., Zunino, R. et al. Cognitive Insights into Sentic Spaces Using Principal Paths. Cogn Comput 11, 656–675 (2019). https://doi.org/10.1007/s12559-019-09651-1

