@fchollet Pick a random node and do a random walk of N steps through the tree. Each node is a token, each path a sentence; do word2vec.
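A minimal sketch of that suggestion: generate random walks over a toy tree and treat each walk as a "sentence" of node tokens. The tree, node names, and walk parameters here are hypothetical; the resulting corpus would then be fed to a word2vec implementation (e.g. gensim's `Word2Vec`), which is not shown.

```python
import random

random.seed(0)

# Hypothetical toy tree as a symmetric adjacency list (children plus
# parent), so a walk can move in any direction.
TREE = {
    "root": ["a", "b"],
    "a": ["root", "c", "d"],
    "b": ["root", "e"],
    "c": ["a"], "d": ["a"], "e": ["b"],
}

def random_walk(tree, start, n_steps, rng=random):
    """Walk n_steps from start, picking a uniform random neighbor each step."""
    path = [start]
    for _ in range(n_steps):
        path.append(rng.choice(tree[path[-1]]))
    return path

# Each walk is one "sentence" of node tokens; this corpus is what you
# would hand to word2vec to get node embeddings.
corpus = [random_walk(TREE, random.choice(list(TREE)), 5) for _ in range(100)]
```

This is essentially the DeepWalk recipe applied to a tree, which is what the reply below points out: you embed random paths, not the tree itself.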
@jmcorgan Then you are not embedding the tree, you are embedding random paths in the tree...
New conversation
@fchollet Shift-reduce? Works for neural net learning of dependency trees.
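The shift-reduce idea turns a tree into a flat action sequence that a neural net can learn to predict. Below is a hedged sketch of an arc-standard oracle for a projective dependency tree: given each token's head index, it derives the SHIFT / LEFT-ARC / RIGHT-ARC sequence that rebuilds the tree. The function name and input format are illustrative, not from the thread.

```python
def oracle_transitions(heads):
    """heads[i] is the index of token i's head; the root's head is -1.
    Returns the arc-standard transition sequence (assumes projectivity)."""
    n = len(heads)
    n_deps = [0] * n                 # unattached dependents left per head
    for h in heads:
        if h >= 0:
            n_deps[h] += 1
    stack, buffer, actions = [], list(range(n)), []
    while buffer or len(stack) > 1:
        if len(stack) >= 2:
            s1, s2 = stack[-1], stack[-2]
            if heads[s2] == s1:      # second-from-top depends on top
                actions.append("LEFT-ARC")
                stack.pop(-2)
                n_deps[s1] -= 1
                continue
            # top depends on second-from-top, and all its own
            # dependents are already attached
            if heads[s1] == s2 and n_deps[s1] == 0:
                actions.append("RIGHT-ARC")
                stack.pop()
                n_deps[s2] -= 1
                continue
        actions.append("SHIFT")
        stack.append(buffer.pop(0))
    return actions
```

For "she eats fish" with heads `[1, -1, 1]`, this yields SHIFT, SHIFT, LEFT-ARC, SHIFT, RIGHT-ARC; a classifier trained to predict the next action from the stack/buffer state is the standard neural shift-reduce parser setup.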
@fchollet What do you want the embedding to capture? Is it a natural-language syntax tree where you want sentence "semantics", or some tree similarity?
@fchollet (also, is it an arbitrary tree? binary tree?)
New conversation
@fchollet Autoencoder on individual nodes / word2vec, then a tree-based convolutional neural net such as the one in the paper by Lili Mou.
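A rough sketch of the tree-based convolution step, in the spirit of Mou et al.'s TBCNN: assume node vectors already exist (from the autoencoder/word2vec step), slide a filter over each node-plus-children window with separate parent and child weights, then max-pool over the tree. All embeddings, weights, and the tree here are randomly initialized placeholders, not a trained model.

```python
import math
import random

random.seed(0)
DIM = 4

# Placeholder pretrained node embeddings (would come from the
# autoencoder / word2vec step suggested above).
emb = {t: [random.gauss(0, 1) for _ in range(DIM)] for t in "abcde"}

# One convolution filter: distinct weights for the parent and the
# child positions in the window.
W_parent = [[random.gauss(0, 0.1) for _ in range(DIM)] for _ in range(DIM)]
W_child = [[random.gauss(0, 0.1) for _ in range(DIM)] for _ in range(DIM)]

def matvec(W, v):
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def conv_node(node, children):
    """Convolve one window: a node plus its direct children."""
    out = matvec(W_parent, emb[node])
    for c in children:
        cv = matvec(W_child, emb[c])
        out = [o + x / len(children) for o, x in zip(out, cv)]
    return [math.tanh(x) for x in out]

def tbcnn(tree, root):
    """Apply the filter at every node, then max-pool over the tree."""
    nodes, stack = [], [root]
    while stack:
        n = stack.pop()
        nodes.append(n)
        stack.extend(tree.get(n, []))
    feats = [conv_node(n, tree.get(n, [])) for n in nodes]
    return [max(f[i] for f in feats) for i in range(DIM)]  # pooled vector

vec = tbcnn({"a": ["b", "c"], "b": ["d", "e"]}, "a")
```

The max-pooling is what gives a fixed-size vector for a variable-shaped tree, which is the point of the approach for tree embedding.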