
An Algorithm for Text Prediction Using Neural Networks

  • Bindu K. R.
  • Aakash C.
  • Bernett Orlando
  • Latha Parameswaran
Conference paper
Part of the Lecture Notes in Computational Vision and Biomechanics book series (LNCVB, volume 28)

Abstract

Neural networks have become increasingly popular for the task of language modeling. Whereas feed-forward networks exploit only a fixed context length to predict the next word of a sequence, standard recurrent neural networks can, conceptually, take all of the preceding words into account. This paper proposes a machine learning algorithm which, given a dataset of conversations, trains a neural network that can then suggest replies for any given input sentence. Smart suggestions of this kind have already been deployed in chat applications; Google uses similar techniques to provide smart replies in Gmail, letting the user answer a particular email with a single tap.
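The recurrent language-modeling idea the abstract describes can be sketched as a minimal next-word predictor: a recurrent network is trained on a token sequence to assign a probability to each possible next word, and the most probable word serves as a suggestion. The toy corpus, hyperparameters, and the use of a plain tanh RNN cell (rather than the LSTM used in the paper) are illustrative assumptions for brevity, not the authors' implementation.

```python
import numpy as np

# Toy "conversation" corpus; <eos> marks the end of each utterance.
corpus = "how are you <eos> i am fine <eos> how are things <eos>".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
V, H = len(vocab), 16          # vocabulary size, hidden size (assumed)

rng = np.random.default_rng(0)
Wxh = rng.normal(0, 0.1, (H, V))   # input-to-hidden weights
Whh = rng.normal(0, 0.1, (H, H))   # hidden-to-hidden (recurrent) weights
Why = rng.normal(0, 0.1, (V, H))   # hidden-to-output weights
bh, by = np.zeros(H), np.zeros(V)

def onehot(i):
    v = np.zeros(V)
    v[i] = 1.0
    return v

def forward(inputs):
    """Run the RNN over token ids; return hidden states and softmax outputs."""
    hs, ps = [np.zeros(H)], []
    for i in inputs:
        h = np.tanh(Wxh @ onehot(i) + Whh @ hs[-1] + bh)
        logits = Why @ h + by
        e = np.exp(logits - logits.max())   # stable softmax
        ps.append(e / e.sum())
        hs.append(h)
    return hs, ps

# Train with backpropagation through time on next-word prediction.
ids = [w2i[w] for w in corpus]
lr = 0.05
for epoch in range(300):
    hs, ps = forward(ids[:-1])             # targets are ids shifted by one
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dbh, dby = np.zeros_like(bh), np.zeros_like(by)
    dhnext = np.zeros(H)
    for t in reversed(range(len(ids) - 1)):
        dy = ps[t].copy()
        dy[ids[t + 1]] -= 1.0              # grad of cross-entropy w.r.t. logits
        dWhy += np.outer(dy, hs[t + 1])
        dby += dy
        dh = Why.T @ dy + dhnext
        draw = (1 - hs[t + 1] ** 2) * dh   # backprop through tanh
        dWxh += np.outer(draw, onehot(ids[t]))
        dWhh += np.outer(draw, hs[t])
        dbh += draw
        dhnext = Whh.T @ draw
    for p, g in [(Wxh, dWxh), (Whh, dWhh), (Why, dWhy), (bh, dbh), (by, dby)]:
        p -= lr * np.clip(g, -5, 5)        # clip to avoid exploding gradients

def suggest_next(prefix):
    """Suggest the most probable next word for a space-separated prefix."""
    _, ps = forward([w2i[w] for w in prefix.split()])
    return vocab[int(np.argmax(ps[-1]))]

print(suggest_next("i am"))
```

An LSTM cell would replace the single tanh update with input, forget, and output gates, which is what lets the paper's model retain longer conversational context; the training loop and prediction step stay the same in shape.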

Keywords

Recurrent neural networks · Text suggestions · LSTM · Machine learning

References

  1. Sundermeyer, M., Schlüter, R., Ney, H.: LSTM neural networks for language modeling. In: Interspeech, pp. 194–197 (2012)
  2. Mikolov, T., et al.: Recurrent neural network based language model. In: Interspeech, vol. 2, pp. 1045–1048 (2010)
  3. Anlauf, J.K., Biehl, M.: The AdaTron: an adaptive perceptron algorithm. EPL (Europhysics Letters) 10(7), 687 (1989)
  4. Bindu, K.R., Parameswaran, L., Soumya, K.V.: Performance evaluation of topic modelling algorithms with an application of Q & A dataset. Int. J. Appl. Eng. Res. 10, 23–27 (2015)
  5. Bindu, K.R., Parameswaran, L., Nambiar, S.R., Chandran, J.: Performance evaluation of algorithms for expert finding on an open email dataset. Int. J. Appl. Eng. Res. 10, 71–75 (2015)
  6. Bebis, G., Georgiopoulos, M.: Feed-forward neural networks. IEEE Potentials 13(4), 27–31 (1994)
  7. Ilonen, J., Kamarainen, J.-K., Lampinen, J.: Differential evolution training algorithm for feed-forward neural networks. Neural Process. Lett. 17(1), 93–105 (2003)
  8. Baddeley, A.D., Thomson, N., Buchanan, M.: Word length and the structure of short-term memory. J. Verbal Learn. Verbal Behav. 14(6), 575–589 (1975)
  9. Funahashi, K.-I., Nakamura, Y.: Approximation of dynamical systems by continuous time recurrent neural networks. Neural Netw. 6(6), 801–806 (1993)
  10. Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9(8), 1735–1780 (1997)
  11. Huang, J., Zhou, M., Yang, D.: Extracting chatbot knowledge from online discussion forums. In: IJCAI, vol. 7, pp. 423–428 (2007)
  12. Jia, J.: The study of the application of a web-based chatbot system on the teaching of foreign languages. In: Proceedings of SITE, vol. 4, pp. 1201–1207 (2004)

Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  • Bindu K. R. (1, 2)
  • Aakash C. (1, 2)
  • Bernett Orlando (1, 2)
  • Latha Parameswaran (1, 2)
  1. Department of Computer Science and Engineering, Amrita School of Engineering, Coimbatore, India
  2. Amrita Vishwa Vidyapeetham, Coimbatore, India
