"It's not that I haven't made any friends, per se. It's just that I haven't yet solved the Human Alignment Problem."
The Artificial Intelligence "Alignment Problem" (the fear that powerful AI will not share human goals) is just Dark Forest Theory in time instead of space.
The AI Alignment Problem rests on the assumptions that "intelligence" is hierarchical and that more intelligence automatically means more motivation for expansion, replication, and extraction. All flawed.
Once again, just a bunch of nerds scared that someone more powerful than them is going to show up and take all their stuff. AI Alignment fear and Dark Forest Theory are absolute projection.
People who are scared about advanced AI taking over and using all available computational resources are the same people who gave us the Adobe Suite, lol.
Reminder that almost all of AI alignment theory was created by a bunch of sociopaths projecting their own blunted understanding of intelligence onto some future robot frenemy.