If your utility function is such that it isn't affected until then, sure. But it's kind of irrelevant to the point, right?
Sure, but only because human morals may be inherently incoherent. (Otherwise, that isn't possible.)
It's also an epistemology issue if calculating likelihoods and multiplying them out leads to Pascal's mugging.
I find @robinhanson's argument for weighting priors in Pascal-like problems, based on claims, convincing.