either we figure out how to make AGI go well or we wait for the asteroid to hit
i am optimistic we can do the former, and i do not believe we can colonize space without AGI
I say we compromise. Next asteroid is due in around 100 million years so waiting 1 million years should incur minimal risks. In 1 million years I'm fine with you building AGI. Just hold off until then, okay?
you do not want OpenAI to cede its current advantage to actors that don't care about alignment
When you say "wait for the asteroid to hit," is bad AGI a possible "asteroid"? I largely agree with you on this statement.
You're right.
I'm reading Superintelligence right now, and I'm convinced that creating an AGI that doesn't want to destroy humanity is one of the biggest problems no one is talking or thinking about
Yeah, this could be the biggest problem facing humanity right now.
AI is like global warming but more urgent and more impactful.
we might implode before any external threat hits us if current vibes continue for long...
I like to imagine you're capable of a better argument than this. An asteroid is nowhere in sight, and humanity's current problems are solvable with our own rationality; what's stopping us is largely politics.