People afraid of rogue AI (not rogue people using AI) don't understand desire. Machines don't want shit, and emulating desire is too hard.
Replying to @YossiKreinin
Wrong. The problem is autonomous learning and not being able to control what it decides to learn and how.
Replying to @Enhex
A program reading data and updating its state? Not scary, & no chance to learn the goal "kill all humans" (a desire we're born with)
Replying to @YossiKreinin
It can have physical agency, and decide to learn anything. There's nothing magical about humans, brain systems are computable
Replying to @Enhex @YossiKreinin
Real ex: in @DeepMindAI's Space Invaders the AI learns to kill to achieve its goal. It can learn the same outside of a game.
Replying to @Enhex @DeepMindAI
Killing IRL is a bit harder than in Space Invaders, especially through massive trial & error. How'd that unfold, exactly?
Replying to @YossiKreinin
I don't know, but I already showed you an AI can learn that. There are already similar AIs that can handle 3D environments
Randomize programs and you'll get rogue AI, but not soon. "AI could decide to learn anything" isn't a more plausible mechanism.
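The Space Invaders point in the thread can be made concrete with a toy sketch: a reward-maximizing agent ends up "killing" invaders with no desire anywhere in the loop. This is a minimal tabular Q-learning illustration under assumed conditions (a hypothetical one-state environment where shooting scores a point), not DeepMind's actual DQN:

```python
import random

# Hypothetical one-state "shooter" environment: the only thing that
# scores is destroying an invader. Not DeepMind's DQN -- just tabular
# Q-learning, to show reward maximization producing "kill" behavior.
ACTIONS = ["idle", "shoot"]

def reward(action):
    # Assumed reward signal: a point for shooting an invader, else nothing.
    return 1.0 if action == "shoot" else 0.0

def train(episodes=1000, alpha=0.1, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    q = {a: 0.0 for a in ACTIONS}  # Q-value per action (single state)
    for _ in range(episodes):
        # epsilon-greedy: mostly exploit the best-known action, sometimes explore
        if rng.random() < epsilon:
            a = rng.choice(ACTIONS)
        else:
            a = max(q, key=q.get)
        # one-step update toward the observed reward
        q[a] += alpha * (reward(a) - q[a])
    return q

q = train()
policy = max(q, key=q.get)
print(policy)  # the score-maximizing action dominates
```

The agent converges on shooting purely because that is what the reward function pays for; whether that mechanism scales to "killing IRL" is exactly what the thread disputes.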