Hi @fchollet, something intrigues me about the Embedding layer: does it use word2vec, fastText, GloVe...? I have looked on GitHub but found nothing. Can you give a hint?
-
OK, but what I mean is: when doing embeddings like word2vec skip-gram in TF, we need to prepare a training dataset of target and context words within a window size, and an embedding is learned with a softmax classifier, for example. But how does the Embedding layer learn the embedding?
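For reference, the data preparation the question describes can be sketched like this: for each target word, pair it with every context word inside the window. (This is an illustrative sketch of skip-gram pair generation, not code from any particular library.)

```python
# Sketch of skip-gram training-pair generation: for each target word,
# collect the context words within `window` positions on either side.
def skipgram_pairs(tokens, window=2):
    pairs = []
    for i, target in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

print(skipgram_pairs(["the", "cat", "sat", "on", "mat"], window=1))
# [('the', 'cat'), ('cat', 'the'), ('cat', 'sat'), ('sat', 'cat'),
#  ('sat', 'on'), ('on', 'sat'), ('on', 'mat'), ('mat', 'on')]
```

These (target, context) pairs would then be fed to a classifier (e.g. softmax or negative sampling) whose hidden weights become the word vectors.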

-
Through backpropagation. No different than any other layer.
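To illustrate the answer: the Embedding layer is just a trainable lookup table of shape (vocab_size, embed_dim), and its weights are updated by the gradients of whatever loss the model is trained on. A minimal sketch (toy random data, arbitrary sizes, assuming TensorFlow/Keras is installed):

```python
import numpy as np
import tensorflow as tf

# Toy data: 4 "sentences" of 5 token ids each, with binary labels.
vocab_size, embed_dim, seq_len = 50, 8, 5
x = np.random.randint(0, vocab_size, size=(4, seq_len))
y = np.array([0, 1, 0, 1])

model = tf.keras.Sequential([
    # A trainable (vocab_size, embed_dim) lookup table; its weights
    # receive gradients through backprop like any other layer's.
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

before = model.layers[0].get_weights()[0].copy()
model.fit(x, y, epochs=3, verbose=0)
after = model.layers[0].get_weights()[0]

# The embedding table changed: it was updated by backpropagation,
# not initialized from word2vec/GloVe/fastText.
print(np.allclose(before, after))
```

No pretrained vectors are involved unless you load them into the layer's weights yourself; by default the table starts from a random initialization and is learned end to end.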