How many dimensions does our world representation have?
Replying to @The_Lagrangian @Plinz
actually, while you will see me rant on the limitations of language, I think there's not that much of a difference
some thoughts from the alt, epistemic status explorative pic.twitter.com/NJ7uHUo7Ne
Replying to @allgebrah @Plinz
wearing the boring hat for a second, the dimensionality of word2vec comes from optimizing for a particular encoding architecture
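(For context: a minimal sketch of what "the dimensionality comes from the encoding architecture" looks like in practice. This assumes gensim >= 4.0; the toy corpus and settings are illustrative, not from the thread. The vector size is just a hyperparameter chosen to fit the downstream model, not anything intrinsic to language.)

```python
# Minimal word2vec sketch with gensim (>= 4.0 assumed).
# vector_size is a free design choice, picked for the architecture.
from gensim.models import Word2Vec

sentences = [["the", "cat", "sat"], ["the", "dog", "barked"]]  # toy corpus

model = Word2Vec(
    sentences,
    vector_size=100,  # chosen dimensionality of the learned vectors
    window=5,         # how many surrounding words count as context
    min_count=1,      # keep every word in this tiny corpus
    sg=0,             # 0 = CBOW: predict a word from its neighbours
)

print(model.wv["cat"].shape)  # (100,) -- whatever dimension we asked for
```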
Replying to @The_Lagrangian @Plinz
I'd actually be interested in the details, the word2vec idea just seemed like a really convenient but sloppy shortcut
Replying to @allgebrah
it's not sloppy, it's actually quite clean; it just seems unlikely to be analogous to the way humans actually encode words
Replying to @The_Lagrangian @allgebrah
word2vec just tries to predict a word from the immediately surrounding words; this rep makes them small enough to put in an LSTM
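(For context: a rough sketch of the "predict a word from its surrounding words" objective, in the CBOW flavour. PyTorch is assumed here purely for illustration; the vocabulary size, window, and indices are placeholders, not anything from the thread.)

```python
# Minimal CBOW sketch: average the context-word embeddings and
# score every vocabulary word as the candidate center word.
import torch
import torch.nn as nn

class CBOW(nn.Module):
    def __init__(self, vocab_size, embedding_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embedding_dim)  # the word vectors
        self.out = nn.Linear(embedding_dim, vocab_size)       # logits over vocab

    def forward(self, context_ids):
        # context_ids: (batch, 2 * window) indices of surrounding words
        ctx = self.embed(context_ids).mean(dim=1)  # average the context vectors
        return self.out(ctx)                       # predict the center word

model = CBOW(vocab_size=10_000, embedding_dim=100)
loss_fn = nn.CrossEntropyLoss()

# one (context, center) training pair with placeholder indices
context = torch.randint(0, 10_000, (1, 4))  # two words either side of the center
center = torch.randint(0, 10_000, (1,))
loss = loss_fn(model(context), center)
loss.backward()
```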
Replying to @The_Lagrangian
I have a few idea generation techniques that rely on subtracting one vector and adding another https://twitter.com/allgebrah/status/709522167017443328 …
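(For context: the specific technique lives behind the linked tweet and isn't reproduced here. The standard word2vec-style vector arithmetic it alludes to looks like the sketch below; the pretrained model name is an assumption, and any set of KeyedVectors would do.)

```python
# Hedged sketch of analogy-style vector arithmetic, e.g. king - man + woman.
# Assumes gensim with the gensim-data downloader available.
import gensim.downloader as api

kv = api.load("glove-wiki-gigaword-100")  # placeholder pretrained vectors

# most_similar adds the "positive" vectors and subtracts the "negative" ones,
# then returns the nearest words to the result.
print(kv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
```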