Consider a caricature: two bad-faith researchers studying “racial bias in police shootings,” one intent on finding “yes” and the other “no,” by any means necessary. Are they really equally risky? When should we demand deeper rigor?
There is a theory in complex-systems research called “normal accidents”: complex systems like nuclear reactors inevitably (i.e. “normally”) suffer failures through multiple small failures interacting in unanticipated ways. This means some things are fundamentally riskier than others.
The author, Charles Perrow, in fact takes the conservative view that sufficiently complex technology with sufficiently high failure costs, like nuclear reactors, should not be used at all. I’m not that radical, but he has a point. And the point applies to “complex thinking” too.
If you’re thinking about sufficiently complex topics full of tricky interactions (“Oh, Roe v. Wade led to the crime wave ending 20 years later... oh wait, no, it was taking the lead out of pipes!”), *you WILL make unexpected normal errors*.
You may, moreover, be thinking under the deep moral hazard of being nowhere near the reactor meltdown zones. In social research this might mean: policing, criminal justice, public schooling, nutrition, education, war-making. Entire communities could be deeply screwed by your errors.
And this shouldn’t need saying, but apparently it does: the more powerful you are, the more extreme care you need to take, because your casual speculative tweeting could cascade into ill-considered action a few degrees away. Think longer per tweet the more powerful you are.
I’m a random D-list blogger. If I tweet speculative dumb shit, very little happens, but there’s more potential for damage than with someone with no following. If you’re a famous academic who has the ear of impulsive CEOs more can happen. If you’re president, wars might start.
If you transpose Perrow’s conclusions about nuclear reactors to social science, you would in fact conclude that some subjects should not be studied at all, because the only people with the methodological competence to study them might be under unacceptably high moral hazard.
Hm - should existential risk be studied at all? Is it better to walk blindly into it without thinking, or to think erratic, half-true thoughts about it first?
A case could be made that it shouldn't be, but it's not straightforward and does not immediately follow from my narrow argument, since x-risks by definition are the ones that would destroy everybody.
Soft x-risks are actually the more interesting case: those with more resources are more likely to be among the minority of survivors, but also have more ability to study, predict, and act in advance. They’re under the moral hazard of putting the burden of action on the less-resourced.
This is arguably what happened with the French yellow-vest riots. Wonks in Paris decided climate action was a priority and imposed a tax that hit everyone equally but fell hardest on the poorest, who would be most exposed to the risks yet had no part in the decision to act.

