A system can be more evil than the sum of the evil of its human parts. If you don’t account for this emergent evil, you end up with a useless morality in which you can’t distinguish between people within the human range of good and evil at all. It’s like adding a big constant to your y-axis.
Of course, if intent is an illusion, a by-product in a deterministic universe, then unintended consequences can, I suppose, be proper evil. Interestingly, determinism shows up (covertly) in the “just following orders” defense against war crimes: “These events were bigger than me.”