Perhaps a 150-dimensional vector space for representing others, with eigenbasis defined by stereotypes. https://twitter.com/grognor/status/798220350299377665
I'd actually be interested in the details, the word2vec idea just seemed like a really convenient but sloppy shortcut
it's not sloppy, it's actually quite clean; it just seems unlikely to be analogous to the way humans actually encode words
word2vec just tries to predict a word from the immediately surrounding words; this rep makes them small enough to put in an LSTM
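To make the "predict a word from its neighbors" point concrete, here is a minimal sketch of the CBOW flavor of word2vec in NumPy. The vocabulary, dimensions, and weight names are all illustrative (a real setup would use ~150 dimensions, as in the tweet above, and train the weights); the point is that the averaged context embedding `h` is the small dense representation that could be fed onward, e.g. into an LSTM.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary; all names and sizes here are illustrative.
vocab = ["the", "cat", "sat", "on", "mat"]
V, D = len(vocab), 8            # D would be ~150 in the thread's example
idx = {w: i for i, w in enumerate(vocab)}

W_in = rng.normal(scale=0.1, size=(V, D))   # input embeddings (untrained)
W_out = rng.normal(scale=0.1, size=(D, V))  # output projection (untrained)

def predict_center(context):
    """CBOW: average the context embeddings, then score every vocab word."""
    h = W_in[[idx[w] for w in context]].mean(axis=0)  # small dense D-dim rep
    scores = h @ W_out
    p = np.exp(scores - scores.max())                 # stable softmax
    return p / p.sum()

probs = predict_center(["the", "sat"])  # guess the word between them
```

With untrained weights the distribution is near-uniform; training adjusts `W_in` so contexts that co-occur with a word push its probability up, which is what gives the embeddings their geometry.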
New conversation
Not word2vec, but world2vec
@The_Lagrangian
now all of us have come up with that one at least once :) https://twitter.com/allgebrah/status/759901117631168516
End of conversation