Strikes me I’ve never plotted all my writing on a 2x2. Wonder what the non-obvious interesting axes would be 🤔
Amount of abuse of language vs degree of attempted pill-iness?
2x2 axes are folk eigenvectors
I mean... word embeddings mean you don't need to pick the axes; principal components are nice. Want help?
You mean literally do a brute-force analysis on a giant 1+ million-word vector?
You pass all the words in each article into the RNN and it outputs a 1024-dimensional vector for each article.
Then PCA that.
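The pipeline being proposed can be sketched in a few lines. This is a minimal illustration, not the actual setup from the thread: the 1024-dim article vectors are faked with random numbers (a real run would get them from an RNN or any sentence-embedding model), and PCA is done by hand with an SVD so the sketch has no dependencies beyond NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the per-article embeddings: the thread's setup would
# encode each article with an RNN into a 1024-dim vector; here we
# fabricate 50 articles x 1024 dims so the sketch is runnable.
articles = rng.normal(size=(50, 1024))

# PCA by hand: center the matrix, then SVD. The top right-singular
# vectors are the principal axes -- the "folk eigenvectors" found
# automatically instead of being hand-picked.
centered = articles - articles.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# Project every article onto the top two components: each article's
# position on the resulting 2x2.
coords = centered @ Vt[:2].T

print(coords.shape)  # (50, 2)
```

Interpreting the axes is the hard part: you'd look at which articles land at the extremes of each component and name the axis after whatever they have in common.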
A while back I made this internal-linking graph, which is a different approach to detecting structure, I guess
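The graph approach can also be sketched briefly. The article slugs and links below are made up for illustration; the idea is just that treating internal links as undirected edges and finding connected components reveals clusters of related articles without any embeddings at all.

```python
# Hypothetical internal links: article -> articles it links to.
links = {
    "pill-iness": ["embeddings", "folk-eigenvectors"],
    "embeddings": ["pca"],
    "folk-eigenvectors": ["pca"],
    "pca": [],
    "orphan": [],
}

# Symmetrise into an undirected adjacency map.
neighbours = {a: set() for a in links}
for a, outs in links.items():
    for b in outs:
        neighbours[a].add(b)
        neighbours[b].add(a)

def component(start):
    """Depth-first flood fill: all articles reachable from `start`."""
    seen, stack = {start}, [start]
    while stack:
        for nxt in neighbours[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

print(sorted(component("pca")))     # the linked cluster
print(sorted(component("orphan")))  # ['orphan']
```

Unlike PCA, this gives discrete clusters rather than continuous axes, so the two approaches surface different kinds of structure.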

