Abstract
Natural Language Processing (NLP) has evolved significantly over the last decade. This paper highlights the most important milestones of this period, while trying to pinpoint the contribution of each individual model and algorithm to the overall progress. Furthermore, it focuses on issues that still remain to be solved, emphasizing the groundbreaking proposals of Transformers, BERT, and similar attention-based models.
Acknowledgements
This work was supported by the MPhil program “Advanced Technologies in Informatics and Computers”, hosted by the Department of Computer Science, International Hellenic University, Kavala, Greece.
Copyright information
© 2021 IFIP International Federation for Information Processing
Cite this paper
Galanis, N.I., Vafiadis, P., Mirzaev, K.G., Papakostas, G.A. (2021). Machine Learning Meets Natural Language Processing - The Story so Far. In: Maglogiannis, I., Macintyre, J., Iliadis, L. (eds) Artificial Intelligence Applications and Innovations. AIAI 2021. IFIP Advances in Information and Communication Technology, vol. 627. Springer, Cham. https://doi.org/10.1007/978-3-030-79150-6_53