Since we think in the language of fairy tale, we overestimate threats we can call “evil” (crime, terrorism, tyranny) and underestimate threats we can’t (accidents, viruses, climate change), even though the latter are far more likely to lead to our extinction.
-
I think that’s descriptively true, but I also think it’s an adaptive consequence of our prior (and present) inability to localize a hypothetical cause and actually do something about it beyond a certain scale. All else was placed in God’s column, as inexplicable “acts”.
-
But each person is still running software that knows about non-local realities and processes on hardware that simply isn't evolved to care. That's why we try to trick the system into caring more about abstract causes, but that's a dangerous game (i.e. ideology).