then we might evolve into patterns of information residing in the AI Singleton at the End of the Universe (that being, literally, the goal Yudkowsky wrote the Sequences to achieve), though this might lead to problems if we didn't donate enough, of course
-
I don't need to defend every silly utterance Eliezer has ever written to gain value from writers in the rationalist community. I don't even need to respect Eliezer.