Conversation

In this framework, I wonder if humans build systems that are capable of more evil than they are, partly to reap the rewards of evil acts while maintaining plausible deniability and escaping culpability as individuals. We wouldn't need to be explicit about this, or even aware of it, to do it.
Quote Tweet
A system can be more evil than the sum of the evil of its human parts. If you don’t account for emergent evil, you’ll end up with a useless morality where you can’t distinguish between people within human range of good/evil at all. It’s like adding a big constant to your y-axis.
We do this all the time in relationships, where our actions are really just economic variables we use to negotiate. And while not necessarily done with evil intent, they can range from the innocent to the malicious.