Conversation

I say we compromise. Next asteroid is due in around 100 million years so waiting 1 million years should incur minimal risks. In 1 million years I'm fine with you building AGI. Just hold off until then, okay?
You're right. I'm reading Superintelligence right now, and I'm convinced that creating an AGI that doesn't want to destroy humanity is one of the biggest problems no one is talking or thinking about.
I like to imagine you're capable of a better argument than this. The asteroid is nowhere in sight, and humanity's current problems are solvable with our own rationality - what's stopping us is largely politics.