I’m always curious about how people turn complex utility functions into simple behavioral heuristics. For example, catching a ball is a sort of argmax_(running trajectory) P(catch | running trajectory).
The way people solve it is “run to keep ball at constant angle”
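To make the contrast concrete, here’s a toy sketch (the Gaussian catch model and all numbers are made up for illustration) of the explicit argmax view next to the constant-angle feedback rule, sometimes called the gaze heuristic:

```python
import math

# Toy model (hypothetical numbers): the utility-function view scores candidate
# running trajectories by catch probability and picks the argmax.
def p_catch(speed, landing_x=30.0, t_flight=3.0):
    # Catch is likely if the fielder arrives near the landing point in time
    # (toy Gaussian score, not real ballistics).
    miss = abs(speed * t_flight - landing_x)
    return math.exp(-miss ** 2 / 20.0)

# Explicit optimization: brute-force the trajectory (here, running speed) space.
best_speed = max((s / 10 for s in range(201)), key=p_catch)  # -> 10.0

# The heuristic never computes p_catch at all. Each step it just nudges speed
# so the ball's angle of elevation stays constant:
def gaze_step(speed, angle, prev_angle, gain=0.5):
    # Angle rising -> ball will land behind you: speed up. Falling -> slow down.
    return speed + gain * (angle - prev_angle)
```

The heuristic replaces a global search with a cheap local feedback loop, which is the whole point of the thread.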
The solution to “regulate uncertainty within this band” seems to be “don’t make a plan with more than n high-uncertainty actions in sequence”
Padding with delays breaks up fragile action sequences.
So if n=2
“Drive to airport” —> “Get through security” —> “Board plane”
turns into
“Drive to airport” —> “Get through security” —> “Wait around 10-60 minutes” —> “Board plane”
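The padding rule can be sketched mechanically (the action names and buffer string are just the example above; the `(name, high_uncertainty)` encoding is hypothetical):

```python
# After every n consecutive high-uncertainty actions, insert a slack buffer
# to break up the fragile run.
def pad_plan(actions, n=2, buffer="wait around 10-60 minutes"):
    padded, streak = [], 0
    for name, uncertain in actions:
        padded.append(name)
        streak = streak + 1 if uncertain else 0
        if streak >= n:
            padded.append(buffer)
            streak = 0   # the buffer resets the fragile streak
    return padded

plan = [("drive to airport", True),
        ("get through security", True),
        ("board plane", True)]
# pad_plan(plan, n=2) inserts the buffer between security and boarding.
```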
There’s actually a whole set of uncertainty-reducing behaviors. One of mine is “don’t drive unless you must,” since I don’t enjoy driving much and tend to make time-stress mistakes like missing exits. Taking transit or an Uber turns a high-uncertainty action into a low-uncertainty one.
Thinking more about this, the reason uncertainty regulation is so important is that its flip side, failure, can have immediate consequences. If you have an accident driving to the airport you have to deal with it. You can’t defer the consequences. Failures are potential forks.
Some failures are containable. If you schedule an hour to debug a program and fail, you can say “I’ll take another shot at this next week” and go on to other things. But it’s hard even when it’s possible. Failure drains energy and you want to win it back immediately.
If you continue debugging for another hour, you don’t lose the sunk cost of getting situation awareness for that coding session. If you kick it to next week you can’t just pick up where you left off. You have to reboot. Pay the situation awareness cost again.
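The fork’s arithmetic, with made-up numbers:

```python
# Toy comparison (all numbers hypothetical): continuing the debug session costs
# only the remaining work, because situation awareness (SA) is already loaded.
# Deferring to next week forces a reboot: you re-pay the SA ramp-up cost.
SA_RAMP = 0.75   # hours to rebuild context for this coding session
WORK    = 1.0    # hours of actual debugging remaining

cost_continue = WORK             # context is hot, no re-ramp
cost_defer    = SA_RAMP + WORK   # must pay the SA cost again
# Deferring only pays off if the other branch of the fork (rest, other work)
# is worth more than the SA_RAMP hours you will re-spend.
```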
So regulating uncertainty is about containing failure forks. Adding slack helps with this.
Parallel uncertainty is worse but rare in personal decision-making.
Hmm. Alt formulation: “keep the number of forks/futures you have to consider in time period T non-trivially below n”
The branching factor of a time interval
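A minimal sketch of that framing, assuming a hypothetical encoding of actions as `(start_time, branching_factor)` pairs, where the branching factor is how many outcome forks an action can produce:

```python
from math import prod  # Python 3.8+

# Check that the number of futures opened inside any window of length T
# stays below the budget n. Futures multiply across forks in a window.
def within_fork_budget(actions, T, n):
    for t0, _ in actions:
        window = [b for t, b in actions if t0 <= t < t0 + T]
        if prod(window) >= n:
            return False
    return True

# Two forks at t=0 and t=1 open 2*3 = 6 futures in a 3-hour window:
# fine under a budget of 8, over under a budget of 6.
```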
One way to manage uncertainty is to expend a lot of energy to control all sources of uncertainty in the environment. This is the rapid monopoly-growth strategy of unicorn companies. Commoditize your complements, acquire all competition, buy up supply and distribution.
Robustness and antifragility (the distinction is irrelevant here) both attack the uncertainty potential of modernity by striving for locally Lindy simplicity.
Waldenponding and other defensive postures are uncritical special cases of this.
All sacrifice performance for robustness.
Adaptability is the most interesting. You just try to think harder and faster. Get inside the OODA loop of the environment to pwn it rather than either dominate it or retreat from it.
Those are the 3 broad strategies for uncertainty regulation: spend more, do less, think harder
Each strategy has merits, but only the last one can make life steadily more interesting and continue the infinite game. The first two define winnable games and try to win them. If you fail, you’re destroyed. If you succeed, you’re in an arrested-development cul-de-sac.
In the last one, you view life as an asymmetric guerrilla-warfare challenge.
Kissinger’s “The conventional army loses if it does not win. The guerrilla wins if he does not lose” principle.
Big upside of this is that you stay interested and interesting in the world. Not win+exit.
