@Meaningness @asilentsky If one is really committed to MWI, shouldn't one disregard extinction event risks?
@MemberOfSpecies If that *was* clear, I’m not sure how what I attributed to singularitarianism differs from the actual argument.
@MemberOfSpecies (Does anyone ever use actually infinite utilities? That would be formally interesting…)
@MemberOfSpecies The problem with “U×P, U huge, P not super small” is that it could be anything, so no meaningful way to use decision theory
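An illustrative sketch of the point above (not part of the thread; the bounds on U and P are hypothetical): when both the utility U and the probability P are uncertain over many orders of magnitude, their product U×P is effectively unconstrained, so expected-value comparisons between actions carry no information.

```python
import math

# Hypothetical uncertainty ranges, each spanning many orders of magnitude.
u_range = (1e10, 1e40)   # plausible bounds on the utility U (assumed)
p_range = (1e-30, 1e-3)  # plausible bounds on the probability P (assumed)

# The expected value U*P can land anywhere in this interval.
ev_low = u_range[0] * p_range[0]
ev_high = u_range[1] * p_range[1]

spread_orders = math.log10(ev_high / ev_low)
print(f"U*P ranges over {spread_orders:.0f} orders of magnitude")
```

With these (made-up) bounds the expected value spans 57 orders of magnitude, which is the sense in which “it could be anything.”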
New conversation
@Meaningness The difference is in the ε, which I take to refer only to probabilities so small they make max-EU heuristics problematic.
@MemberOfSpecies The problem is more that U and P are both so uncertain that U×P could be anything; was subject of my aborted LW sequence.
New conversation