Perhaps a 150-dimensional vector space for representing others, with eigenbasis defined by stereotypes
actually, while you will see me rant about the limitations of language, I think there's not that much of a difference
wearing the boring hat for a second, the dimensionality of word2vec comes from optimizing for a particular encoding architecture
I'd actually be interested in the details, the word2vec idea just seemed like a really convenient but sloppy shortcut
it's not sloppy, it's actually quite clean; it just seems unlikely to be analogous to the way humans actually encode words
word2vec just tries to predict a word from the immediately surrounding words; this representation makes words small enough to put in an LSTM
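A minimal sketch of the "predict a word from surrounding words" idea mentioned above, in the CBOW style (context predicts target). The function name and toy window size are illustrative assumptions, not anything from the thread; real word2vec then learns the vectors as weights of a shallow network trained on pairs like these.

```python
def cbow_pairs(tokens, window=2):
    """Yield (context_words, target_word) pairs, CBOW-word2vec style.

    For each position, the context is up to `window` words on each side;
    the training objective is to predict the target from that context.
    """
    pairs = []
    for i, target in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        pairs.append((context, target))
    return pairs

for ctx, tgt in cbow_pairs("holographic carriers of meaning".split()):
    print(ctx, "->", tgt)
```

The vector dimensionality (e.g. 100 or 300) is simply the width of the hidden layer in that predictor, which is the sense in which it "comes from optimizing for a particular encoding architecture" rather than from anything about human cognition.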
now all of us have come up with that one at least once :)
Quote Tweet
world words, holographic carriers of meaning from which you can reconstruct the entire book, language, society, universe