so it seems as though god is either a consequentialist or sometimes unjust
Replying to @GapOfGods
@tipsfromkatee What if the first lie gets 1 unit of punishment, the next gets 1/ω, then 1/ω², and so on?
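(A sketch of the arithmetic behind this schedule, treating the punishments as surreal numbers: lie k costs ω^(−(k−1)), and the finite geometric sum is bounded.)

$$\sum_{k=1}^{n} \omega^{-(k-1)} \;=\; \frac{1-\omega^{-n}}{1-\omega^{-1}} \;<\; \frac{1}{1-\omega^{-1}} \;=\; \frac{\omega}{\omega-1} \;=\; 1 + \frac{1}{\omega-1},$$

so the total punishment for any finite number of lies exceeds 1 only by an infinitesimal.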
Replying to @InstanceOfClass
@InstanceOfClass (or about making yourself not have lied)
Replying to @GapOfGods
@tipsfromkatee Suppose the agent has a fixed history, like AIXI
Replying to @InstanceOfClass
@InstanceOfClass what do you mean by fixed history?
Replying to @GapOfGods
@tipsfromkatee I mean the reward tape is write-once
Replying to @InstanceOfClass
@InstanceOfClass i guess it's just not the sort of thing the agent reasons about, like a bayesian expected utility maximiser would?
Replying to @GapOfGods
@tipsfromkatee Yes, it is not reflective in that way. Like AIXI.
Replying to @InstanceOfClass
@InstanceOfClass seems pretty unjust to punish it by an expected ω^ω for not being reflective in that way
@tipsfromkatee whatever it takes to create an honest society