I think general intelligence is both the attempt to determine, and the result of determining, your true relationship to the universe. This is also why I doubt that "AGI alignment" is a problem humans can solve, or a constraint they can terminally impose, on the AGI's behalf.
-
Sure, as long as you stay open to the idea that there are other models to be explored. Sometimes it might feel like you have to let go of rationality or reasoning in order to see or feel new models, unless you want to stay stuck. I’ve been running in circles in mine.
-
And I’m sure my models are very different from yours, as yours are from mine.
End of conversation