What happened?
-
Eliezer convinced Neil deGrasse Tyson that you can’t keep superintelligent AI in a box. Here he is saying so: https://www.youtube.com/watch?v=gb4SshJ5WOY&feature=youtu.be&t=1h59m45s
-
What time in the video is the relevant segment?
-
1:59:45
New conversation
-
Hear, hear! Neil, my respect for you has grown, because I was so confused how such a smart person could view AGI as you did. Eliezer, to be fair, yours is a tough name to get right! But kudos to you, sir, for using well-explained reason to change hearts and minds.
-
It's off topic, but I just wanted you to see the meme about Newcomb's paradox, Eliezer. Sorry. pic.twitter.com/8YuzDcNKQH
-
Was it money? Did you offer the people a relatively large amount of money compared to the $10 or $20 they would get if they didn't let you out of the box? If you offered me a relatively large amount of money, I'd let you out of the box.
-
But isn't that different from dealing with a box containing a super AI that you know would wreak havoc if it got out? What negative consequence were these gatekeepers up against to incentivize keeping you in the box? These gatekeepers don't seem analogous to gatekeepers of a dangerous, boxed AI.
-
Is it that the gatekeepers wouldn't know the AI was dangerous, specifically because the AI knows how to seem harmless to humans? If so, then this is different from being the gatekeeper of an AI that the gatekeeper knows is dangerous.
End of conversation
New conversation