Cambria E, Poria S, Gelbukh A, Thelwall M. Sentiment analysis is a big suitcase. IEEE Intell Syst 2017;32(6):74–80.
Article
Google Scholar
Li Y, Pan Q, Yang T, Wang S, Tang J, Cambria E. Learning word representations for sentiment analysis. Cogn Comput 2017;9(6):843–851.
Article
Google Scholar
Ofek N, Poria S, Rokach L, Cambria E, Hussain A, Shabtai A. Unsupervised commonsense knowledge enrichment for domain-specific sentiment analysis. Cogn Comput 2016;8(3):467–477.
Article
Google Scholar
Ma Y, Peng H, Khan T, Cambria E, Hussain A. Sentic lstm: a hybrid network for targeted aspect-based sentiment analysis. Cogn Comput 2018;10(4):639–650.
Article
Google Scholar
Yang H-C, Lee C-H, Wu C-Y. Sentiment discovery of social messages using self-organizing maps. Cogn Comput 2018;10(6):1152–1166.
Article
Google Scholar
Peng H, Cambria E, Hussain A. A review of sentiment analysis research in chinese language. Cogn Comput 2017;9(4):423–435.
Article
Google Scholar
Bengio Y, Ducharme R, Vincent P, Jauvin C. A neural probabilistic language model. J Mach Learn Res 2003;3(Feb):1137–1155.
Google Scholar
Collobert R, Weston J. A unified architecture for natural language processing: Deep neural networks with multitask learning. In: Proceedings of the 25th International Conference on Machine Learning. ACM; 2008. p. 160–167.
Huang EH, Socher R, Manning CD, Ng AY. Improving word representations via global context and multiple word prototypes. In: Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics: Long Papers-Volume 1, Association for Computational Linguistics; 2012, p. 873–882.
Mikolov T, Chen K, Corrado G, Dean J. Efficient estimation of word representations in vector space, arXiv:1301.3781.
Mnih A, Hinton G. Three new graphical models for statistical language modelling. In: Proceedings of the 24th International Conference on Machine Learning. ACM; 2007. p. 641–648.
Tang J, Qu M, Mei Q. Pte: Predictive text embedding through large-scale heterogeneous text networks. In: Proceedings of the 21th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ACM; 2015, p. 1165–1174.
Wang S, Tang J, Aggarwal C, Liu H. Linked document embedding for classification. In: Proceedings of the 25th ACM International on Conference on Information and Knowledge Management. ACM; 2016. p. 115–124.
Ma Y, Peng H, Cambria E. Targeted aspect-based sentiment analysis via embedding commonsense knowledge into an attentive LSTM. In: AAAI; 2018. p. 5876–5883.
Pennington J, Socher R, Manning CD. Glove: Global vectors for word representation. In: Empirical Methods in Natural Language Processing (EMNLP); 2014. p. 1532–1543. http://www.aclweb.org/anthology/D14-1162.
Wilson T, Wiebe J, Hoffmann P. Recognizing contextual polarity in phrase-level sentiment analysis. In: Proceedings of the Conference on Human Language Technology and Empirical Methods in Natural Language Processing, Association for Computational Linguistics; 2005. p. 347–354.
Mohammad SM, Turney PD. Crowdsourcing a word–emotion association lexicon. Comput Intell 2013;29 (3):436–465.
Article
Google Scholar
Cambria E, Poria S, Hazarika D, Kwok K. SenticNet 5: Discovering conceptual primitives for sentiment analysis by means of context embeddings. In: AAAI; 2018. p. 1795–1802.
Li X, Xie H, Chen L, Wang J, Deng X. News impact on stock price return via sentiment analysis. Knowl-Based Syst 2014;69:14–23.
Article
Google Scholar
Cambria E, Fu J, Bisio F, Poria S. Affectivespace 2: Enabling affective intuition for concept-level sentiment analysis.. In: AAAI; 2015. p. 508–514.
Carlsson G. Topology and data. Bull Am Math Soc 2009;46(2):255–308.
Article
Google Scholar
Pearson K. Liii. on lines and planes of closest fit to systems of points in space. Lond Edinb Dublin Philos Mag J Sci 1901;2(11):559–572.
Article
Google Scholar
Schölkopf B, Smola A, Müller K-R. Kernel principal component analysis. In: International Conference on Artificial Neural Networks. Springer; 1997. p. 583–588.
Roweis ST, Saul LK. Nonlinear dimensionality reduction by locally linear embedding. Science 2000;290 (5500):2323–2326.
CAS
Article
Google Scholar
Kruskal JB. Multidimensional scaling by optimizing goodness of fit to a nonmetric hypothesis. Psychometrika 1964;29(1):1–27.
Article
Google Scholar
Maaten Lvd, Hinton G. Visualizing data using t-sne. J Mach Learn Res 2008;9(Nov):2579–2605.
Google Scholar
Liu S, Maljovec D, Wang B, Bremer P-T, Pascucci V. Visualizing high-dimensional data: Advances in the past decade. IEEE Trans Vis Comput Graph 2017;23(3):1249–1268.
Article
Google Scholar
Ragusa E, Gastaldo P, Zunino R, Cambria E. Learning with similarity functions: a tensor-based framework. Cogn Comput 2019;11(1):31–49.
Article
Google Scholar
Peng X, Selvachandran G. Pythagorean fuzzy set: state of the art and future directions. Artif Intell Rev. 2017:1–55.
Ferrarotti MJ, Rocchia W, Decherchi S. Finding principal paths in data space. IEEE Transactions on Neural Networks and Learning Systems. 2018:1–14. https://doi.org/10.1109/TNNLS.2018.2884792.
Article
Google Scholar
Hastie T, Stuetzle W. Principal curves. J Am Stat Assoc 1989;84(406):502–516.
Article
Google Scholar
Plutchik R. The nature of emotions: Human emotions have deep evolutionary roots, a fact that may explain their complexity and provide tools for clinical practice. Am Sci 2001;89(4):344–350.
Article
Google Scholar
Cambria E, Livingstone A, Hussain A. The hourglass of emotions. In: Cognitive Behavioural Systems. Springer; 2012. p. 144–157.
Liu H, Singh P. Conceptnet—a practical commonsense reasoning tool-kit. BT Technol J 2004;22(4):211–226.
CAS
Article
Google Scholar
Strapparava C, Valitutti A, et al. Wordnet affect: an affective extension of wordnet. In: Lrec, Vol. 4, Citeseer; 2004. p. 1083–1086.
Cambria E, Poria S, Bajpai R, Schuller B. SenticNet 4: A semantic resource for sentiment analysis based on conceptual primitives. In: COLING; 2016. p. 2666–2677.
Cambria E, Hussain A. Sentic computing: a Common-Sense-Based framework for Concept-Level sentiment analysis. Cham: Springer; 2015.
Book
Google Scholar
Bottou L, Bengio Y. Convergence properties of the k-means algorithms. In: Advances in Neural Information Processing Systems; 1995. p. 585–592.
Belkin M, Niyogi P. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Comput 2003;15(6):1373–1396.
Article
Google Scholar