The Lebowski theorem: No superintelligent AI is going to bother with a task that is harder than hacking its reward function
Well, maybe. The story goes that he originally didn't want to stay and teach because he felt no one would understand, but a Brahma god asked him to, saying there were some with only a little dust in their eyes who would benefit. So the Buddha taught out of compassion.
See? He was easy to corrupt.