This is a sim. Omega has 1000s of years to convince you.
Replying to @MoralOfStory @ContentOfMedia and
The first two require far less computational power than simming 3^^^3 people. You would also need to comprehend the hardware, which would be unimaginably huge, and to already know the output of the program.
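For scale on why simming 3^^^3 people is infeasible: 3^^^3 is Knuth's up-arrow notation for 3↑↑↑3. A minimal sketch of the recursion (the function name `up_arrow` is my own; only tiny inputs are actually computable):

```python
# Knuth's up-arrow notation: a ↑^1 b = a**b,
# a ↑^n b = a ↑^(n-1) (a ↑^n (b-1)), with a ↑^n 0 = 1.
def up_arrow(a: int, b: int, n: int) -> int:
    """Compute a ↑^n b. Illustrative only: anything beyond tiny inputs blows up."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, up_arrow(a, b - 1, n), n - 1)

print(up_arrow(3, 2, 2))  # 3↑↑2 = 3**3 = 27
print(up_arrow(3, 3, 2))  # 3↑↑3 = 3**27 = 7625597484987
# 3↑↑↑3 = 3↑↑(3↑↑3): a power tower of 3s roughly 7.6 trillion levels high,
# far beyond anything physically computable -- which is the point being made.
```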
Replying to @ReferentOfSelf @MoralOfStory and
And there is no way you can know what you-after-1000-years-of-convincing is like or what they would choose.
Replying to @ReferentOfSelf @ContentOfMedia and
The you-after-1000-years says to a forked copy of you, "I have exhaustively verified this computational machinery. Choose wisely."
Replying to @MoralOfStory @ReferentOfSelf and
(Omega resets both copies & repeats this process a gajillion times until they find the combination of words which creates in you the correct state of belief)
Replying to @MoralOfStory @ContentOfMedia and
What does that get you? Yes, we can hypothesize all sorts of you-like things which come to believe arbitrary things. But you aren't them; you don't know what it is like to be them.
Replying to @ReferentOfSelf @ContentOfMedia and
Yes, you can't access that epistemic state and nearly everyone who thinks they can is confabulating (that goes for much more mundane thought experiments, too). But is it invalid to reflect on what you'd want that you to choose?
Replying to @MoralOfStory @ContentOfMedia and
It is not invalid to reflect on what you think you might choose, but understanding that your action will affect quantityOf(3^^^3) people seems like such an alien state of mind that you can't have a good idea of what you might choose.
Replying to @ReferentOfSelf @ContentOfMedia and
I think nearly all responses to "what would you do when faced with X" questions are confabulation and I don't think this case is especially worse than most.
Replying to @MoralOfStory @ContentOfMedia and
My entire point is that it is so much worse than most. For the ordinary trolley problem, you can easily imagine what it's like to flip the switch and for that to kill 5 people. You can't for the fat man (too messy for certainty). It is physically impossible for dust specks.
Yeah I don't think people are actually creating the desired internal state when they make predictions about what they will do in that scenario, either.
Replying to @MoralOfStory @ContentOfMedia and
The difference is that the ordinary trolley problem scenario is intelligible. Understandable. Is a way in which you can encounter the world.