Abstract
Section 1.2 briefly discussed the Vector Space Model (VSM). We saw in that section how objects can be represented using continuous vectors in a multidimensional space and how distances in this space can denote the similarities between objects. However, we did not discuss how these spaces are constructed. In other words, the following question remained unanswered: how can we place hundreds of thousands of words in a space such that their positioning corresponds to their semantic properties? In this chapter, we will talk about the foundations behind constructing semantic spaces, particularly for words.
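The abstract's core idea, that distances between continuous vectors can stand in for semantic similarity, can be illustrated with a minimal sketch. The vectors and words below are hypothetical toy values, not from the chapter; cosine similarity is one common choice of distance measure in a vector space model.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: values near 1.0
    indicate the vectors point in nearly the same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-dimensional "semantic" vectors (illustrative values only).
cat = [0.9, 0.8, 0.1]
dog = [0.8, 0.9, 0.2]
car = [0.1, 0.2, 0.9]

# Words with related meanings should sit closer in the space.
print(cosine_similarity(cat, dog))  # high: semantically similar
print(cosine_similarity(cat, car))  # lower: less similar
```

How such vectors are actually learned, so that this geometric property emerges for a vocabulary of hundreds of thousands of words, is precisely the question the chapter addresses.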
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this chapter
Pilehvar, M.T., Camacho-Collados, J. (2021). Word Embeddings. In: Embeddings in Natural Language Processing. Synthesis Lectures on Human Language Technologies. Springer, Cham. https://doi.org/10.1007/978-3-031-02177-0_3
Print ISBN: 978-3-031-01049-1
Online ISBN: 978-3-031-02177-0