If AI Risk is priority now, why? Foom: 1 AI takes over world, Value Drift: default future has bad values, or Collapse: property rights fail
Replying to @robinhanson
Hard to answer. Suspect 1 AI fooms; >1 AI could foom near-simultaneously, and I would expect them all to have non-human-like values.
Replying to @ModelOfTheory
Then I'd say your main concern is that change makes for bad values. You aren't very concerned about 1 vs >1 AI rules world.
Replying to @robinhanson
Ok, I think I better understand how you intended the question now, and have voted accordingly.
8:25 AM - 4 Aug 2017