Some failures are containable. If you schedule an hour to debug a program and fail, you can say “I’ll take another shot at this next week” and go on to other things. But it’s hard even when it’s possible. Failure drains energy and you want to ward it off immediately.
If you continue debugging for another hour, you don’t lose the sunk cost of building situation awareness during that coding session. If you kick it to next week, you can’t just pick up where you left off. You have to reboot and pay the situation-awareness cost again.
So regulating uncertainty is about containing failure forks. Adding slack helps with this. Parallel uncertainty is worse but rare in personal decision-making.
Hmm. Alternative formulation: “keep the number of forks/futures you have to consider in time period T non-trivially below n.” The branching factor of a time interval.
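The branching-factor framing can be made concrete with a toy sketch (the function name and numbers below are illustrative assumptions, not from the thread): if every open decision fork can split into a few outcomes each time period, the futures you have to hold in mind grow exponentially, which is why capping forks per period is what keeps the count below n.

```python
# Toy illustration: futures to consider grow as branching_factor ** periods,
# so containing forks per period is what keeps the total "non-trivially below n".

def futures_to_consider(branching_factor: int, periods: int) -> int:
    """Distinct futures after `periods` intervals, if every open fork
    splits into `branching_factor` outcomes each interval."""
    return branching_factor ** periods

# Two forks per week, four weeks out: 16 futures to track.
print(futures_to_consider(2, 4))  # 16
# Three forks per week explodes much faster: 81 futures.
print(futures_to_consider(3, 4))  # 81
```

Nothing deep is happening here; the point is just that small reductions in the per-period branching factor buy large reductions in how many futures you must consider.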
One way to manage uncertainty is to expend a lot of energy to control all sources of uncertainty in the environment. This is the rapid monopoly growth strategy of unicorn companies. Commoditize your complements, acquire all competition, buy up supply and distribution.
For individuals, this translates to getting rich / having FU money. Money is energy in modernity. Two more accessible (and in many ways more interesting) strategies that can work with limited budgets are robustness (simplify/minimalize life) and adaptability (reorient your OODA loop faster).
Robustness and antifragility (the distinction is irrelevant here) both attack the uncertainty potential of modernity by striving for locally Lindy simplicity. Waldenponding and other defensive postures are uncritical special cases of this. All sacrifice performance for robustness.
Adaptability is the most interesting. You just try to think harder and faster: get inside the OODA loop of the environment to pwn it, rather than either dominate it or retreat from it. Those are the three broad strategies for uncertainty regulation: spend more, do less, think harder.
Each strategy has merits, but only the last one can make life steadily more interesting and continue the infinite game. The first two define winnable games and try to win them. If you fail, you’re destroyed. If you succeed, you’re in an arrested-development cul-de-sac.
Replying to @vgr
In other words, the first two are convergent and the latter is divergent?
That’s a good way to think of it
Replying to @vgr
Reminds me of Dune: there is only death in the convergence of the visions of the future. Uncertainty is a requirement for survival. The only certainty is death.
Replying to @rafathebuilder @vgr
To be transparent, that is a core principle I live by. If uncertainty is low, I increase entropy by making large changes, for example network changes.