Word2vec: a framework for learning word vectors (Mikolov et al. 2013)
- Every word in a fixed vocabulary is represented by a vector
- Go through each position t in the text, which has a center word c and context words o (outer)
- Use the similarity of the word vectors for c and o to calculate the probability of o given c
- Keep adjusting the word vectors to maximize this probability (see the sketch after this list)
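A minimal sketch of this idea, under assumed names and toy sizes (not the reference implementation): each word gets a "center" vector v and an "outside" vector u, and P(o | c) is a softmax over the dot products u_w . v_c.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["the", "cat", "sat", "on", "mat"]   # toy vocabulary (assumption)
V, d = len(vocab), 8                         # vocab size, embedding dimension

# Two vector tables, as in word2vec: v for center words, u for outside (context) words
v_center = rng.normal(scale=0.1, size=(V, d))
u_outside = rng.normal(scale=0.1, size=(V, d))

def prob_o_given_c(o: int, c: int) -> float:
    """P(o | c) = exp(u_o . v_c) / sum_w exp(u_w . v_c)."""
    scores = u_outside @ v_center[c]         # similarity of c with every vocabulary word
    scores -= scores.max()                   # for numerical stability
    exp_scores = np.exp(scores)
    return float(exp_scores[o] / exp_scores.sum())

# Example: probability of seeing "sat" as a context word for center word "cat"
c, o = vocab.index("cat"), vocab.index("sat")
print(f"P({vocab[o]} | {vocab[c]}) = {prob_o_given_c(o, c):.4f}")

# Training would adjust v_center and u_outside (e.g. by gradient steps on the
# log-probability of observed (center, context) pairs) so that words which
# actually co-occur get high probability.
```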