Conversation

Have you spent 3 or more hours total researching potential risks of AGI (artificial general intelligence)? || Would you estimate the chances of AGI causing the end of human civilization in the next 30 years to be above or below 1%?
  • no || above
    11.9%
  • no || below
    37.8%
  • yes || above
    20.7%
  • yes || below
    29.6%
3,175 votes · Final results
Replying to
Having spent way too many hours worrying about AI risk, I think a lot of people are seriously underestimating how dangerous it is and how close we are to building something that could easily wipe us out.
Replying to
AGI is sci-fi. Connecting WMDs or similar weapons to an AI is just stupid, as WarGames (1983) showed. More likely, a conventional WW3 triggered by resource conflicts or climate change is what will impact humanity most in the coming decades.
Replying to
I read most of an advanced text on AI. Precisely because it can outwit us, no one knows how it works. We are using AI for weapons systems, and we compete for energy as a common resource. If we allow AI to build AI systems, both The Terminator and The Matrix are on the horizon.