@YossiKreinin: People afraid of rogue AI (not rogue people using AI) don't understand desire. Machines don't want shit, and emulating desire is too hard.
@Enhex, replying to @YossiKreinin: Wrong. The problem is autonomous learning, and not being able to control what it decides to learn or how.
@YossiKreinin, replying to @Enhex: A program reading data and updating its state? Not scary, and no chance to learn the goal "kill all humans" (a desire we're born with).
@Enhex, replying to @YossiKreinin: It can have physical agency, and decide to learn anything. There's nothing magical about humans; brain systems are computable.
@YossiKreinin, replying to @Enhex: 2. Humans HAD to evolve self-preservation and power-seeking goals to survive. Why would a machine, without evolutionary pressure? (1 Oct 2016)