Abstract
In the last chapter, you used a seq2seq model with attention to perform language translation. In this chapter, I will show you a more sophisticated technique for natural language processing: the Transformer, the latest innovation in natural language modeling. The Transformer eliminates the need for LSTMs and produces far better results than the LSTM-based seq2seq model. So, let us understand what a Transformer model is.
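As a brief preview of what replaces the LSTM's recurrence, the operation at the heart of the Transformer is scaled dot-product attention. The sketch below is a minimal NumPy illustration (not code from this chapter); the function name and the toy shapes are chosen purely for demonstration:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core Transformer operation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 3 query positions, 4 key/value positions, d_k = 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

Because every query attends to every key in one matrix product, there is no sequential dependence between positions, which is why the Transformer can dispense with recurrence entirely.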
Copyright information
© 2021 Poornachandra Sarang
About this chapter
Cite this chapter
Sarang, P. (2021). Natural Language Understanding. In: Artificial Neural Networks with TensorFlow 2. Apress, Berkeley, CA. https://doi.org/10.1007/978-1-4842-6150-7_9
Publisher Name: Apress, Berkeley, CA
Print ISBN: 978-1-4842-6149-1
Online ISBN: 978-1-4842-6150-7
eBook Packages: Professional and Applied Computing, Apress Access Books, Professional and Applied Computing (R0)