I'm thinking of drafting a "Humanity United for the Singularity" contract for people to sign.
It would specify the following deal amongst all humans in the multiverse:
- We will all work together to create a human-positive singularity in which…
So you disagree with the tit-for-tat strategy on an iterated prisoner's dilemma?
In an iterated game it works even when defection is costly.
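The tit-for-tat point can be made concrete with a small simulation. This is my own sketch, not something from the thread, using the standard prisoner's-dilemma payoffs (T=5, R=3, P=1, S=0):

```python
# (my move, their move) -> my payoff; 'C' cooperate, 'D' defect
PAYOFF = {
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}

def tit_for_tat(history):
    """Cooperate first, then copy the opponent's previous move."""
    return 'C' if not history else history[-1]

def always_defect(history):
    return 'D'

def play(strat_a, strat_b, rounds=100):
    """Run an iterated game; each strategy sees the opponent's past moves."""
    hist_a, hist_b = [], []   # opponent moves seen by a and by b
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strat_a(hist_a), strat_b(hist_b)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(b)
        hist_b.append(a)
    return score_a, score_b
```

Two tit-for-tat players lock into mutual cooperation (300 points each over 100 rounds), while against always-defect, tit-for-tat only pays the cost of defection once, in the first round.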
There are game-theoretic folk theorems. Interestingly, folk theorems permit extremely bad equilibria.
Consider a situation where everyone is in hell and each person has a dial they can use to turn up the temperature *for everyone*; the temperature of hell is set by the average of their choices...
Folk theorems state that there's an equilibrium where everyone puts the temperature close to the highest value (assume a higher temperature in hell is always bad). Because they're afraid that if they set it lower, others compensate in future rounds by increasing it.
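A toy calculation shows how that fear sustains the hot equilibrium. All the numbers here are my own illustrative choices, not from the thread: ten players each pick a dial in [0, 100], the stage payoff is minus the average temperature, everyone dials 85 on the equilibrium path, and any deviation is punished forever by the others maxing their dials (against which the deviator's best reply is 0, leaving the temperature at 90):

```python
n = 10
delta = 0.9                          # discount factor on future rounds
t_eq = 85.0                          # everyone dials 85 -> temperature 85
t_dev = t_eq * (n - 1) / n           # one player dials 0 for a round: 76.5
t_pun = 100.0 * (n - 1) / n          # punishment phase: others max, deviator 0

def discounted(first, rest, delta):
    """Payoff of `first` this round, then `rest` every round thereafter."""
    return first + delta * rest / (1 - delta)

conform = discounted(-t_eq, -t_eq, delta)    # stay at 85 forever: -850
deviate = discounted(-t_dev, -t_pun, delta)  # one cool round, then 90 forever
```

Conforming beats deviating (-850 vs. -886.5), so nobody turns the dial down, even though everyone would prefer a cooler hell.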
> Because they're afraid that if they set it lower, others compensate in future rounds by increasing it.
And what do they gain by doing it themselves?
They set the temperature high because they expect that if they set it low, others will set it high in future rounds. Others set it high in future rounds because they expect that if they don't, it gets set even higher later on. Folk theorems show this is a Nash equilibrium.
See the "individual rationality" condition: en.wikipedia.org/wiki/Folk_theo
What the individual-rationality condition requires of an equilibrium is that each player does at least as well as their minmax value: the best payoff they can guarantee when everyone else pessimizes their utility. Here, that minmax value is what you get by setting your own dial to the minimum while everyone else sets theirs to the maximum.
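The individual-rationality bound for the dial game can be worked out directly. This is my own construction, consistent with the setup above (ten players, dials in [0, 100], stage payoff is minus the average temperature):

```python
n = 10

# Minmax: the others max their dials to hurt you; your best reply is 0.
minmax_temp = (0 + 100 * (n - 1)) / n    # 90.0
minmax_payoff = -minmax_temp

def individually_rational(t):
    """A constant temperature t can be an equilibrium payoff only if
    each player's stage payoff -t is at least their minmax payoff."""
    return -t >= minmax_payoff
```

Any constant temperature up to 90 clears the bar, so folk theorems admit equilibria arbitrarily close to the worst temperature a player can be held to.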



