If you believe in IQ, you want the highest IQ you can have without attendant mental health issues elsewhere.
But YouQ should ideally be 100. Exactly smart enough for your own good. No more, no less.
This is why I really like space exploration. Nobody has ever come up with a non-bullshit objective external reason to do it. The only reason to do it is internal. You can’t be too clever for your own good in space exploration because it’s worth nothing externally.
I suspect this line of thought leads many smart people to religion if they add one fatalist leap of faith: that the universe has arranged the perfect set of life challenges for your maximal spiritual development and trying to be clever with risk taking is cosmic foolishness.
This I don’t buy. The universe is pretty random. The challenges life throws at you have no intrinsic meaning. Unfortunately, you actually have to choose when to take the easy-courageous way around and when to take the hard-submissive way through.
Many tragicomic paths begin with the hope that life is meaningful enough that surrendering agency and doubt is a metaphysically smart thing to do. Hence the focus on surrender and submission in all religions.
Not that it can’t lead to brilliantly wrong metaphysics. Leibniz was a risk-taking, royal-ass-kissing, solipsistic hustler, but he also seemed to be genuinely religious in exactly this sense. He believed the universe was fractally optimal at every instant down to the last monad.
Leibniz was possibly the ultimate “too clever for his own good” guy. So clever he invented calculus and computers, and for an encore fooled himself into believing Spinoza was wrong with an intricately wonderful bullshit-vitalist metaphysics (monadology).
Probably a good bit of commencement-speech-type advice for kids would be: pick one useless thing you’re neither going to get clever with nor let others dictate how you pursue, and design your life around reserving your peak hours for it.
[Quoted Tweet is from an account that no longer exists.]
I think Newport’s fatal mistake is trying to define deep work in terms of external valuation references. He tries to do the most societally important things he’s capable of instead of the most free things, and so he lands on inevitably silly waldenponding solutions.
