Abstract
This chapter provides an introduction to contextualized word embeddings, which can be considered the new generation of word (and sense) embeddings. The distinguishing factor here is the sensitivity of a word's representation to its context: a target word's embedding can change depending on the context in which it appears. These dynamic embeddings alleviate many of the issues associated with static word embeddings and provide a reliable means of capturing the semantic and syntactic properties of word usage in context. Despite their recent introduction, contextualized word embeddings have yielded significant gains in almost every downstream NLP task to which they have been applied.
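The context sensitivity described above can be illustrated with a minimal sketch. The example below is a toy, not the chapter's own method: it uses hypothetical random 4-dimensional "static" embeddings and a single, simplified self-attention pass (dot-product scores, softmax, weighted sum) to show that the same static vector for "bank" yields different contextualized vectors in "the river bank" versus "the money bank".

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical static embeddings: one fixed 4-dim vector per word type.
vocab = ["the", "river", "money", "bank"]
E = {w: rng.normal(size=4) for w in vocab}

def self_attention(vectors):
    """Simplified single-head self-attention: each output vector is a
    softmax-weighted mixture of all token vectors in the sentence."""
    X = np.stack(vectors)                      # (seq_len, dim)
    scores = X @ X.T / np.sqrt(X.shape[1])     # scaled dot-product scores
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X                         # contextualized vectors

sent1 = ["the", "river", "bank"]
sent2 = ["the", "money", "bank"]
ctx1 = self_attention([E[w] for w in sent1])
ctx2 = self_attention([E[w] for w in sent2])

# The static embedding of "bank" is identical in both sentences,
# but its contextualized vector depends on the surrounding words.
same_static = np.allclose(E["bank"], E["bank"])
same_contextual = np.allclose(ctx1[2], ctx2[2])
print(same_static, same_contextual)  # True False
```

Real contextualized models (e.g., ELMo or BERT) stack many such layers with learned parameters, but the mechanism sketched here is why the representation of a token is dynamic rather than fixed.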
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this chapter
Pilehvar, M.T., Camacho-Collados, J. (2021). Contextualized Embeddings. In: Embeddings in Natural Language Processing. Synthesis Lectures on Human Language Technologies. Springer, Cham. https://doi.org/10.1007/978-3-031-02177-0_6
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-01049-1
Online ISBN: 978-3-031-02177-0