We live in a Society.
sort of
End of conversation

New conversation
when (or if) it (she? he?) has self-awareness and intentions
my current notion of an AGI implies that it has at least the intelligence of the thing that created it, i.e. at least a little more than ours, and it is probably scalable
End of conversation

New conversation
How do you regard Coherent Extrapolated Volition, Joscha?
There is no alternative. If one side does not share Coherent Extrapolated Volition, the coherent side will inevitably be forced to fight whenever the expected benefits exceed the costs.
New conversation
AGI will be my equal, and will be my companion if we have complementary functions, just as naturally occurring generally intelligent beings can be my collaborators if we need each other's skills and interests.
There might be room for constrained self-determination, which is close to a terminal value. This is a strong theme in Iain Banks's model of how Minds interact with humans.
It's hard for me to see how you could figure out, from first principles, what it means to be 'best for you'.