(More technically, every reduction to a size band between 0 (riced) and 1 (whole cauliflower) either introduces new “fake” primitives, like weird “stem junction cubes” that are artifacts of the decomposition technique, or violates the lower size bound.)
Because the fox “knows many things” (the florets in their analogical variety, across sizes and parametric shape variations), *any* chopping is too destructive and they freeze into inaction. This is one failure mode induced by fractal realities.
Because the hedgehog “knows one big thing” that won’t tolerate fractal variety, they destroy the whole thing and tame it with brute force. This is another failure mode induced by fractal realities. Both are driven by fear of incompleteness of existing knowledge.
In both cases the solution is to accept that *any* action will change reality in a way that makes your previous knowledge incomplete. So you either have to destroy the reality entirely or accept that creative-destructive action creates bits that don’t fit what you know.
Basically the only error-free map is the entire territory. Hedgehogs forcefully tame it into “rice”; foxes leave it untouched, letting it stand as the “floret gestalt”. Either destroy the territory through reductionism, or treat it as a holistic map that leaves you no agency.
Non-fractal realities don’t have this property. Animals, for example, are not fractal, so a simpler level-by-level deconstruction works. This is Zhuangzi’s butcher, elegantly taking reality apart at the joints. Disassembly without fractal chopping. Our normal thinking is like this.
Under normal conditions, we tend to use “stack” thinking: a set of single-level, mutually exclusive, collectively exhaustive (MECE) mental models. We ignore fractal error. We move between abstraction levels as necessary, trusting the state to stay well-behaved.
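As a concrete gloss on MECE (a minimal sketch; the helper and the cauliflower-flavored categories are mine, not from the thread): a single abstraction level is MECE over a domain exactly when its categories are pairwise disjoint and jointly cover the whole domain.

```python
# Minimal sketch (helper and categories are illustrative, not from the thread):
# a single level of categories is MECE over a domain iff the categories are
# pairwise disjoint (mutually exclusive) and their union is the whole domain
# (collectively exhaustive).
from itertools import combinations
from typing import Iterable, Set


def is_mece(categories: Iterable[Set[str]], domain: Set[str]) -> bool:
    cats = list(categories)
    mutually_exclusive = all(a.isdisjoint(b) for a, b in combinations(cats, 2))
    collectively_exhaustive = set().union(*cats) == domain
    return mutually_exclusive and collectively_exhaustive


domain = {"stem", "floret", "leaf"}
print(is_mece([{"stem"}, {"floret"}, {"leaf"}], domain))   # True: a clean stack level
print(is_mece([{"stem", "floret"}, {"floret"}], domain))   # False: overlap, and "leaf" is missed
```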
Stack-like structures can be navigated with lightweight stacks in the computing sense, because they follow a finite ontogeny recipe. But large-scale *growing* realities tend to be fractal in macro structure. When you try to navigate them with stack logic, things fall apart.
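A minimal sketch of that failure mode (Python; part names invented for illustration, not the author’s code): the same explicit-stack traversal that cleanly bottoms out on a structure with a finite parts recipe never bottoms out when every part keeps generating self-similar sub-parts.

```python
# Minimal sketch (part names invented for illustration) of why stack logic
# works on finite-recipe structures but falls apart on self-similar ones.
from typing import Callable, Iterable, List


def visit_with_stack(
    root: str,
    children_of: Callable[[str], Iterable[str]],
    max_steps: int = 10_000,
) -> List[str]:
    """Depth-first disassembly with an explicit stack.

    Terminates with a complete parts list when the structure follows a finite
    recipe; raises when the structure keeps revealing new parts at every scale.
    """
    seen: List[str] = []
    stack = [root]
    steps = 0
    while stack:
        steps += 1
        if steps > max_steps:
            raise RuntimeError("stack logic never bottoms out on this structure")
        part = stack.pop()
        seen.append(part)
        stack.extend(children_of(part))
    return seen


# Non-fractal: a finite parts recipe, like disassembling an animal at its joints.
ANIMAL = {
    "body": ["limb", "torso"],
    "limb": ["joint", "bone"],
    "torso": ["rib", "spine"],
}
print(visit_with_stack("body", lambda part: ANIMAL.get(part, [])))
# -> ['body', 'torso', 'spine', 'rib', 'limb', 'bone', 'joint']  (terminates)


def florets(part: str) -> List[str]:
    """Self-similar rule: every part reveals three smaller parts, at every scale."""
    return [f"{part}.{i}" for i in range(3)]


# visit_with_stack("cauliflower", florets)  # raises RuntimeError: no bottom to reach
```

The step-count guard is just a stand-in for the moment “things fall apart”: on the self-similar rule the stack grows faster than it drains, so the traversal never completes.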
My gloss on that: under normal conditions, stack thinking in a fractal world causes fractal map-territory errors that self-correct, with foxes and hedgehogs serving as checks and balances on each other. That’s complex-systems homeostasis.
But in weird conditions, the errors compound fractally, collapsing at all levels. The widening-gyre effect. Hedgehogs become part of the problem by getting destructive. Foxes give up in frozen inaction. The system becomes ungovernable. The self-correcting homeostasis breaks down, and everything unravels.
The solution is to “think entangled, act spooky,” as I recently argued. To arrest and reverse a fractal collapse, you have to think in fractal-native, scale-free ways. How do you do that in practice? I don’t know yet. Working on it.