Word2Vec

Creator: Seonglae Cho
Created: 2023 Mar 10 10:33
Editor: Seonglae Cho
Edited: 2025 Mar 12 12:52
Refs: Distributional semantics, Glove Embedding

A framework for learning word vectors (Mikolov et al. 2013)

  • Every word in a fixed vocabulary is represented by a vector
  • Go through each position t in the text, which has a center word c and context words o (outer)
  • Use the similarity of the word vectors for c and o to calculate the probability of o given c
  • Keep adjusting the word vectors to maximize this probability (see the sketch after this list)
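The sketch below illustrates the skip-gram probability P(o | c): the dot product between the outside vector for o and the center vector for c, normalized with a softmax over the vocabulary. The vocabulary and vectors here are toy values chosen for illustration, not part of the original model.

```python
import numpy as np

# Toy illustration of P(o | c) = exp(u_o . v_c) / sum_w exp(u_w . v_c).
# Vocabulary and vectors are made up; real models learn them from a corpus.
rng = np.random.default_rng(0)
vocab = ["the", "cat", "sat", "on", "mat"]
dim = 8
V = {w: rng.normal(size=dim) for w in vocab}   # center-word vectors v_w
U = {w: rng.normal(size=dim) for w in vocab}   # outside-word vectors u_w

def p_outside_given_center(o: str, c: str) -> float:
    """Probability of seeing context word o given center word c."""
    scores = np.array([U[w] @ V[c] for w in vocab])
    probs = np.exp(scores - scores.max())       # numerically stable softmax
    probs /= probs.sum()
    return float(probs[vocab.index(o)])

print(p_outside_given_center("cat", "the"))
```

Training adjusts U and V by gradient ascent on the log-likelihood of observed (center, context) pairs, in practice approximated with negative sampling or hierarchical softmax rather than the full softmax shown here.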
Word2Vec Variants
Skip-gram
FastText
Doc2Vec
CBOW
UK Twitter word embeddings (II)
Word embeddings trained on UK Twitter content. Approximately 1.1 billion tweets were used, covering 2012 through 2016. Settings: skip-gram with negative sampling (10 noise words), a window of 9 words, dimensionality of 512, and 10 epochs of training. After filtering out words with fewer than 100 occurrences, an embedding vocabulary of 470,194 unigrams was obtained (see embd_voc); the corresponding 512-dimensional embeddings are held in embd_vec.bz2.
https://figshare.com/articles/dataset/UK_Twitter_word_embeddings_II_/5791650
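A training run with the same hyperparameters could look roughly like the sketch below, assuming the gensim library; `tweets` is a hypothetical iterable of tokenised tweets standing in for the ~1.1 billion-tweet corpus.

```python
from gensim.models import Word2Vec

def train_uk_twitter_embeddings(tweets):
    """Hypothetical skip-gram training matching the dataset's settings.

    `tweets` should be an iterable of token lists,
    e.g. [["rainy", "day", "in", "london"], ...].
    """
    return Word2Vec(
        sentences=tweets,
        sg=1,             # skip-gram
        negative=10,      # 10 negative-sampling noise words
        window=9,         # context window of 9 words
        vector_size=512,  # embedding dimensionality
        min_count=100,    # drop words with fewer than 100 occurrences
        epochs=10,
        workers=8,
    )
```

This is a sketch under those assumptions, not the pipeline used to produce the published embeddings.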
word2vec | TensorFlow Core
https://www.tensorflow.org/tutorials/text/word2vec?hl=ko
Copyright Seonglae Cho