Word embeddings are a way of mathematically representing natural-language words that preserves the semantic and syntactic similarities between them. Each word is represented as a high-dimensional vector, and the spatial relationships between these vectors then encode the relationships between the words. For example, the representations