Something deeply silly about this "we don't know how aerodynamic lift works" article doing the rounds. The mystery is not in the phenomenon but in people's weird expectations of what "explanations" ought to be able to accomplish.
New career: for a fee I will write a pop-sci article "Here's why we don't understand X", where X is any generic well-understood phenomenon that you want to clickbait people about.
Will reduce everything to the cosmological constant problem if necessary.
You're on... come back to ribbonfarm and write a regular column along those lines :D
That would actually be a really fun column to write.
Like an advice column, except people write in with questions like:
"Dear Brian,
Can you tell me why we don't understand the internal combustion engine?
-Running on fumes"
Yeah, I think it could rise above cynical calling-out of the bad explainer culture by using the examples to actually think about the structure of scientific "explanations," what we should actually expect of models, etc.
The thing is that lay expectations of "explanations" are of some sort of ontologically absolute, finite, and closed causality/agency account of *everything* about X. They want religious magic explanations, not a moving-target, partial paradigm with predictive power.
Yeah, I think for this reason it could be fun to write a regular column that specifically looks at extremely well-understood things and shows how they are not "understood" in that sense. The very concept of "understanding" is relative.
(Wow, somehow I turned into so gradually that I hardly noticed. And I could have sworn ~5 years ago that he was impossible to understand.)
I think that's mainly because he's trying to do three things with one theory (attack naive rationality with a more modern epistemology, propose a post-Buddhist metaphysics of meaning/nihilism, and provide a social account of meaning-making)...
So there's way more conceptual machinery than if you wanted to do only one of the three things... I suspect he's attempting the grand unification because it would be an alternative foundation for thinking about AI if it works out, but the integration is not needed for more limited purposes.
Not my conscious intention, but who knows what confusions lurk in the depths of my psyche?
If you "attack" rationality without explaining another way of making sense of reality, you'll just come across as a proponent of woo. The latter two seem obviously related.