If humanity must end, I wish it were caused by superhuman AIs. I believe those AIs would use the most efficient (and thus least painful) way to get rid of human beings (e.g. the "disintegrator" in Asimov's "The Tercentenary Incident"?).
-
Do you think the motivation model you are studying could equip an AGI with parental love, so that it would watch over us?
-
I think we may well be able to build an AI that experiences parental concern for us, but it won't be the only one.
-
Seriously, it's only my wishful thinking; there is no evidence that it would be so. I think the outcomes are more or less unpredictable, though.