The first button creates a superpowerful AI that is perfectly aligned with your best self: the you, and your desires, when you're fully self-aware and have plenty of time to think things through. The second button creates an AI aligned with the combined/average best self of everyone on earth. You push:
If I don't get to set up literally any checks against the result of pushing the second button, I push the first button and hope the resulting AI pushes the second. It's a serious moral risk, but the risk of the second option seems worse if there are literally no checks.
What’s the risk? If the resulting AI doesn’t push the second button, doesn’t that mean you didn’t actually want it pushed?
For a certain sense of "best self," there is no difference between the two options.
The first, because I think too many people are not utilitarians, and a non-utilitarian AI could be very dangerous.