I don't want AI to reason like most human beings and I especially don't want it to reason like Eliezer Yudkowsky or his acolytes. Going to call this the "alignment alignment problem".
2:12 PM - 15 Jun 2021
0 replies
0 retweets
6 likes