The thing most worth calling sperg disease or rationalist disease, the most harmful cognitive attractor common to the type, is a profound emotional attachment to the idea of oneself as a single-agent system, thickly armored, of course, in performative unemotionality.
-
-
The risky AIs are presumed capable of sophisticated subgoals and meta-goals (e.g., they can use their vast resources to persuade people to do things). Yet their own top-level goal remains fixed, singular, and stupid. This seems... unworkable? Wrong at some fundamental level?
-
When an AI keeps hauling water like a broom, man, it's usually the fault of a crappy mage who wanted power without having to first overcome fear. Bret Weinstein coined the term Autonomous Economic Strategy. AESs are the most dangerous AIs because we forgot we made them.