Hi @fchollet, something intrigues me about the Embedding layer: does it use word2vec, fastText, GloVe...? I have looked on GitHub but found nothing. Can you give a hint?
-
Ah ok, I think I got it: it uses the final output of the network to update the embedding weights. So we cannot use the Embedding layer on its own to embed sequences the way skip-gram does; the embeddings have to be learned during training.
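A minimal sketch of that idea in pure Python (toy sizes, made-up learning rate, no Keras): an embedding layer is just a trainable lookup table, and the gradient flowing back from the downstream loss updates only the rows that were looked up.

```python
import random

random.seed(0)
vocab_size, dim = 5, 3  # toy sizes, for illustration only

# The "embedding layer": a vocab_size x dim table of trainable weights.
emb = [[random.uniform(-0.1, 0.1) for _ in range(dim)] for _ in range(vocab_size)]

def forward(token_id):
    # Forward pass is just a row lookup.
    return emb[token_id]

def backward(token_id, grad, lr=0.1):
    # Only the looked-up row receives a gradient update;
    # `grad` stands in for the gradient coming from the rest of the network.
    for j in range(dim):
        emb[token_id][j] -= lr * grad[j]

before = list(emb[2])
vec = forward(2)
backward(2, [0.5, -0.5, 0.5])  # pretend downstream gradient
```

So without a downstream loss there is nothing to drive the updates, which is why the layer alone does not give you word2vec-style vectors.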
-
Check out Andrew Ng's YouTube series on recurrent neural nets and watch the embedding chapters. Best explanation I've come across. The embedding layer is actually the weights of a collection of softmax classifiers.
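That "weights of softmax classifiers" view is roughly the skip-gram setup. A hedged pure-Python sketch (toy vocabulary and learning rate, all sizes made up): the embedding is the input weight matrix of a softmax classifier that predicts a context word from a center word, and both matrices are updated by the cross-entropy gradient.

```python
import math
import random

random.seed(1)
V, D = 4, 2  # toy vocab size and embedding dim

# Input weights (the embeddings) and output softmax-classifier weights.
W_in = [[random.uniform(-0.5, 0.5) for _ in range(D)] for _ in range(V)]
W_out = [[random.uniform(-0.5, 0.5) for _ in range(D)] for _ in range(V)]

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def train_step(center, context, lr=0.5):
    h = W_in[center]  # embedding lookup for the center word
    logits = [sum(h[d] * W_out[w][d] for d in range(D)) for w in range(V)]
    p = softmax(logits)
    # Cross-entropy gradient on the logits: p - one_hot(context).
    err = [p[w] - (1.0 if w == context else 0.0) for w in range(V)]
    # Gradient w.r.t. the embedding row, using the pre-update W_out.
    grad_h = [sum(err[w] * W_out[w][d] for w in range(V)) for d in range(D)]
    for w in range(V):
        for d in range(D):
            W_out[w][d] -= lr * err[w] * h[d]
    for d in range(D):
        W_in[center][d] -= lr * grad_h[d]
    return p[context]

# Repeatedly training on the pair (center=0, context=3)
# should raise the predicted probability p(3 | 0).
p_first = train_step(0, 3)
for _ in range(50):
    p_last = train_step(0, 3)
```

The trained `W_in` rows are what word2vec keeps as word vectors, while Keras's Embedding layer learns the same kind of table against whatever downstream loss you give it.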
