I keep forgetting that most people are unafraid of AI not because they somehow think it is not dangerous to share the planet with a species more intelligent than your own, but because they still cannot imagine that it will happen.
From the perspective of every other primate species, human intelligence is an uncontrollable weapon of mass destruction. Under what conditions should we treat research into superhuman artificial intelligence as careless experimentation with an extinction risk?
-
Ok, but I'm not a primate species, I'm a primate, as are you. From my perspective, almost the entire world is out of my control. The question is: what does 1000x human intelligence imply? More "can dominate humanity 1000x over" or "can substitute for 1000x humans"?
-
Our intelligence does not scale well, because brains cannot grow much larger, childhoods (= initial training periods) cannot be much extended, and communication between minds is limited. There is no obvious comparable limit for an electronic brain. One may outsmart all of us.