I think if I were to optimise for my values correctly, then the vast majority of the time I wouldn't be doing anything that looks like optimising, so that tracks
maybe it’s the interplay between the different agents in the system, with their distinct values and needs? Optimizing sounds a lot like forcing your will upon the world; what I’m looking for is more of an alignment