Assessment of Table Pruning and Semantic Interpretation for Sentiment Analysis Using BRAE Algorithm

  • G. V. Shilpa
  • D. R. Shashi Kumar
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1040)

Abstract

We propose bilingually constrained recursive auto-encoders (BRAE) to learn semantic phrase embeddings (compact vector representations for phrases) that can distinguish phrases with different semantic meanings. The BRAE is trained to minimize the semantic distance between translation equivalents while simultaneously maximizing the semantic distance between non-translation pairs. The model learns how to embed each phrase semantically in two languages and how to transform the semantic embedding space of one language into that of the other. We evaluate the proposed method on two end-to-end SMT tasks (phrase table pruning and decoding with phrasal semantic similarities), both of which need to measure the semantic similarity between a source phrase and its translation. Detailed experiments show that the BRAE is remarkably effective in both tasks.
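The core ideas in the abstract — composing word vectors into a phrase vector with a recursive auto-encoder, mapping the source embedding into the target space, and training with a margin so translation pairs end up closer than non-translation pairs — can be sketched minimally as follows. This is an illustrative NumPy sketch, not the paper's implementation: the composition function, the transformation matrix `M`, the greedy left-to-right phrase composition, and the margin value are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # embedding dimension (illustrative; real systems use far larger d)

# Recursive auto-encoder composition: two child vectors -> one parent vector.
W = rng.normal(scale=0.1, size=(d, 2 * d))
b = np.zeros(d)

def compose(c1, c2):
    """Parent embedding p = tanh(W [c1; c2] + b)."""
    return np.tanh(W @ np.concatenate([c1, c2]) + b)

def embed_phrase(word_vecs):
    """Greedy left-to-right composition of word vectors into one phrase vector
    (a stand-in for the tree structure a recursive auto-encoder would induce)."""
    p = word_vecs[0]
    for w in word_vecs[1:]:
        p = compose(p, w)
    return p

def semantic_distance(src, tgt, M):
    """Squared distance after mapping the source embedding into target space via M."""
    return float(np.sum((M @ src - tgt) ** 2))

def max_margin_loss(src, tgt_pos, tgt_neg, M, margin=1.0):
    """Hinge loss: a translation pair should be closer than a
    non-translation pair by at least the margin."""
    return max(0.0,
               margin
               + semantic_distance(src, tgt_pos, M)
               - semantic_distance(src, tgt_neg, M))
```

Minimizing this loss over many (source, translation, non-translation) triples pushes translation equivalents together and non-translation pairs apart, which is the bilingual constraint the abstract describes.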

Keywords

BRAE algorithm · Deep neural networks (DNN) · Recursive auto-encoder · SMT · Semantic analysis


Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  1. Department of Computer Science and Engineering, Vemana Institute of Technology, Bangalore, India
  2. Department of Computer Science and Engineering, Cambridge Institute of Technology, Bangalore, India
