If AI risk is a priority now, why? Foom: one AI takes over the world; Value Drift: the default future has bad values; or Collapse: property rights fail
-
-
Then I'd say your main concern is that change makes for bad values. You aren't very concerned about whether one AI or more than one AI rules the world.
-
Ok, I think I better understand how you intended the question now, and have voted accordingly.
End of conversation
New conversation -
-
-
Both of these scenarios would likely involve an AI or AIs not respecting property rights. So, all three?
-
You can be concerned about a lack of respect for property rights even if you don't think one AI takes over, and even if value drift doesn't bother you.
End of conversation