A ladder of assertions about the abilities of your rival X
1. X can’t do it
2. X can do it, but not as well as us
3. X can do it as well, maybe better, but not innovate
4. X can innovate, but only incrementally, not with bold leaps
5. X can make bold leaps, but has no moral compass, so will fail
The interesting thing is that X = China and X = AI are basically identical arguments. By the time you get to Stage 5, you’re essentially arguing “the gods are on our side”.
There’s a Stage 6 after you’ve already lost:
6. The indomitable _________ spirit
Blank = [human, western] etc
Americans were shocked commies could figure out nukes. Liberal democracy not necessary.
Europeans were shocked when China, India went nuclear. White+Christian culture/beliefs not necessary.
Anti-Islamist world shocked when Iran and Pakistan did it... “good” religion not necessary.
Examine the actual necessity of elements of the path you took to get somewhere. There are many roads to the top of many important mountains. And the nukes work like nukes no matter how you got to them.
Facts don’t care about the “right” agents using them for the “right” reasons.
Big blind spot of west is complacent assumption that many path-dependent elements of western journey to postmodernity are necessary when they are in fact irrelevant artifacts of overfitting path noise. They have less to do with getting to X and beyond than natives imagine.
Ironically white supremacists seem to recognize this most clearly. The battle cry of “we will not be replaced” is acknowledgement of the reality that they themselves are no more necessary or essential to the story than the “necessary” elements of their history.
You know what’s hilarious about humanist fears of AI? It exactly parallels the “we will not be replaced” line of wishful thinking.
*Humans* may not be necessary/essential to whatever story is actually about (Strong AI people are attached to a particular idea of “actual story”)
Both are what I call self-essentializing world-views.
“Essentialist” is usually an adjective applied to analysis of others, based on a mix of contempt and fundamental attribution error.
Self-essentializing is based on self-congratulation and fundamental attribution error.
We think of fundamental attribution errors mainly with reference to failure:
“You failed because you are a flawed character”
“I failed because of such and such circumstances”
But with success we flip the script:
“I won because of character strengths”
“You won by luck.”
Self-essentializing gets you to inflated “big” identities for yourself, deflated “small” identities for others. In the extreme case, your identity explains all of reality and others seem like NPCs. A sort of perverse “I am the universe” false-enlightenment sense of unstoppable agency.
This self-essentializing can only develop in leaders while they’re on top, and for long enough.
In the past, other cultures have gone through such hubris relative to both other humans and improving tech.
It is hard to evolve past “peak identity”, the point of greatest success.
Often identity cannot evolve past the historical peak because to do so it would have to deflate and reinflate. Often through a full death-and-resurrection moment of individual and/or collective ego death.
I like to think being an immigrant twice over has helped me get rid of both kinds of blindspots (cultural self-essentialist and humanist).
India —> US
STEM interests —> liberal arts/humanities interests
A tweet or two about each...
I was never at risk of self-essentializing a “big” Indian identity. India in the 80s when I was growing up was a sad joke. Big-identity Indians were (and largely still are) insecure buffoons with entire identity built on a somewhat overstated claim to having invented the zero.
(exaggerating a bit for effect there...)
I was briefly at risk of becoming fully “westernized”: a gone-native western thinker with a wog epistemology.
I do think 90% in English, but I’d say only ~30-50% of my thought patterns are classical western ones.
I was *very* briefly at risk of becoming a STEM supremacist (for about 3 months in the summer of 1993, between getting accepted at IIT, a big hubris booster, and discovering upon getting there how ordinary my STEM talents were in that cohort).
When I got interested in non-STEM topics, I was coming off 15y in the STEM world, exiting with 3 mediocre degrees, mediocre track record (including a deadpooled product, 6 meh patents, and a dozen meh publications), and a small but secure identity/confidence. No Elon Musk but ok.
I was never a STEM supremacist because I didn’t win enough to get an inflated sense of myself that way. But I won enough to be immunized against the intimidation and contempt defenses against dirty barbarian STEMmie attention on lofty humanist questions.
Interestingly there is nothing like wog epistemology (“brown sahib” Indian —> European going native) for STEM —> HSS. I’ve never met a STEMMie so in awe of critical theory that they accept access to HSS discourses with deferential gratitude and abandonment of subversive impulses.
So, tldr of my 2 immigration stories: I never self-essentialized as Indian or wog-Westerner, or as STEMMie doerist or HSS critical-theory supremacist.
This has been the reward for a spectacularly mediocre 22-year adult career along all conventional vectors of accomplishment.
The consolation of mediocre success at life’s games around big prizes is an identity too small to obscure your view of where and how your feet are on the ground.
Immigration story #1 makes me incapable of ethnonationalist sentiment on *any* side that might accept me as a member.
Immigration story #2 makes me incapable of being either a Singularitarian worshipping at the altar of Roko’s basilisk OR a desperate humanist worshipping at the altar of the “indomitable human spirit” and writing really bad takes on AI illustrated by Terminator stock photos.
But to bring it back full circle (I went from a general idea about views of rivals, to nation/species-level examples, down to the personal): the point of my original “ladder of assertions” tweet is that “success epistemologies” are “big frozen identity” epistemologies, and are unsatisfying.
Everybody I know operating with a success epistemology based on attachment to a big, self-essentialized identity seems to be unhappy about it, even as they strut about performing confidence and satisfaction in how they view the world.
Their world view is more wall than window.
Underneath the performance: something between frantic denial and deep depression. My reaction to the performance peppered with obvious tells that it is not the felt reality underneath is usually some flavor of “methinks the derper doth protest too much”.
I have my own dissatisfactions, denials, and depressive responses to life, but thankfully attachment to the ladder of assertions about others’ abilities is not one of the sources. Which is good, because it allows my worldview to be more window than wall, and I like a good view.