Conversation

Another way to look at it: maximizing EV(wealth) at the expense of log wealth results, in the long run, in some EXTREMELY HAPPY people in some possible worlds, but you are vanishingly unlikely to be living in one of those worlds. It's like utility monsters in ethics: a probability monster.
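A toy simulation makes the point concrete (the bet parameters and stake sizes here are illustrative assumptions, not from the conversation): repeatedly staking your whole bankroll on a favorable double-or-nothing bet maximizes EV per round, yet almost every world ends broke, while a log-wealth (Kelly-style) fractional stake grows in the typical world.

```python
import random
import statistics

def simulate(bet_fraction, rounds=100, worlds=10_000, p=0.6, seed=0):
    """Final wealth after `rounds` double-or-nothing bets with win
    probability p, staking `bet_fraction` of current wealth each round,
    sampled across many independent possible worlds."""
    rng = random.Random(seed)
    results = []
    for _ in range(worlds):
        w = 1.0
        for _ in range(rounds):
            stake = bet_fraction * w
            w += stake if rng.random() < p else -stake
        results.append(w)
    return results

# EV-maximizing strategy: stake everything (EV grows 1.2x per round,
# but one loss zeroes you out forever).
all_in = simulate(1.0)

# Log-wealth (Kelly) strategy for this bet: stake 2p - 1 = 0.2 of wealth.
kelly = simulate(0.2)

# The all-in strategy has the higher expected wealth, but the typical
# world is broke; the Kelly bettor's typical world grows.
print("median all-in wealth:", statistics.median(all_in))
print("median Kelly wealth: ", statistics.median(kelly))
print("all-in worlds still solvent:",
      sum(w > 0 for w in all_in) / len(all_in))
```

The "probability monster" is visible directly: the all-in strategy's astronomical expected value is carried by worlds whose total probability (0.6^100) is so small that none of the 10,000 simulated worlds realizes one.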
ok so what do you mean by "utility" here? I guess, if we want, we can sidestep this and blacklist the word 'utility'. I'm going to define 'qwer' to be "the thing I'm trying to maximize the EV of". If qwer is linear in wealth, then the paper doesn't apply.