
Been sleeping on this thought. I’m surprised by how much people are willing to indulge SBF’s weird EA ideology/funky math ideas as some sort of meaningful mitigating factor that moves him from scammer bucket to good/well-intentioned. To me it puts him into the terrorist bucket.
When fedayeen do suicide bombings you don’t say “oh look, this is how this decision works out by the non-Kelly-criterion logic of ISIS.” You don’t look at the legitimacy of the actions within the terrorist’s moral calculus. You apply your own.
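(Aside for readers unfamiliar with the reference: the Kelly criterion sizes repeated bets to maximize long-run growth of a bankroll, whereas the "funky math" being alluded to treats any positive-expected-value wager as worth taking at full size. The sketch below is purely illustrative, with made-up odds; `kelly_fraction` and `simulate` are hypothetical helpers, not anyone's actual model.)

```python
# Illustrative sketch: Kelly sizing vs. "bet the whole bankroll on any
# positive-EV wager" (the non-Kelly logic the tweet gestures at).
# Odds and round count are made up for the example.
import random


def kelly_fraction(p: float, b: float) -> float:
    """Kelly stake fraction for a bet paying b per unit staked, won with probability p."""
    return max(0.0, (b * p - (1 - p)) / b)


def simulate(fraction: float, p: float = 0.6, b: float = 1.0,
             rounds: int = 1000, seed: int = 0) -> float:
    """Stake `fraction` of the bankroll each round on the same positive-EV bet."""
    rng = random.Random(seed)
    bankroll = 1.0
    for _ in range(rounds):
        stake = fraction * bankroll
        bankroll += stake * b if rng.random() < p else -stake
        if bankroll <= 0:  # an all-in loss wipes the bankroll out permanently
            return 0.0
    return bankroll


if __name__ == "__main__":
    p, b = 0.6, 1.0                    # 60% chance to double the stake
    f = kelly_fraction(p, b)           # 0.2 of the bankroll per round
    print("Kelly fraction:", f)
    print("Fractional (Kelly) bettor:", simulate(f, p, b))
    print("All-in EV maximizer:", simulate(1.0, p, b))
```

Run long enough, the fractional bettor compounds while the all-in bettor almost surely goes to zero, which is roughly what "non-Kelly-criterion logic" is meant to flag.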
Building on this thought, EA is strongly adjacent not just to one but *two* of the major technologies of the day: crypto and AI, and it brings a similar strain of religious thinking to each.
Within the EA sphere of influence (which is a small but non-trivial subset of each), the two technologies are less like libertarian vs. communist, and more like sunni vs. shia. Or protestant vs. catholic. Yesterday, I heard FTXgate described as the "crypto 9/11"...
I was chatting with about what an AI version of this kind of terror attack might look like and I was struggling to come up with plausible scenarios, but now one occurs to me: somebody who takes AI risk/alignment bs too seriously and physically attacks AI infrastructure
Once any religion becomes large enough to develop an extremist fringe, you have to take the risk seriously. I can see some nutjob hopped up on a weird "nootropic stack" bombing an AI datacenter or worse. It's the blow-up-Skynet plot of Terminator, except there is no Skynet.
sadly, looks that way... I was initially leaning towards him being a conscious sociopath scammer like madoff, but it increasingly looks like he was serious (in a bullshit-but-not-really way) about all the weird extreme utilitarianism in the multiverse shit
Quote Tweet (replying to @vgr): "so yet another example of esoteric internet culture radicalizing an extremely online young man?"
It would be hugely ironic and tragic if the biggest risk of AI were in fact people paranoid about "unfriendly AI" doing decidedly unfriendly things to currently living humans who don't share their paranoia about future metaverse torture and such.
Yep... this is true across the whole broad memeplex, not just the explicitly EAish-longtermist crowd.
Quote Tweet (replying to @vgr): "+1 If you dig into the ideology of longtermism (into which most of EA mutated) you'll unveil psychopath ‘ethics’ with a nasty messiah complex. People who feel primarily accountable to hallucinated billions of far-future humans stop caring for the ones currently living."
If you're seriously invested in/work in AI, crypto, or more wholesome flavors of long-termism like Long Now, you're now in the position of all the normie muslims being constantly asked to explain/justify/apologize for what fringe terrorists do
The 9/11 comparison is apt in another way... as someone who has been invested in the crypto scene (though not very visibly until recently), I'm starting to get what normie muslims endured in the wake of 9/11
You've probably seen some of the take-no-prisoners, burn-all-crypto-down kinds of hostile takes. Ecoterrorism and animal-rights terrorism are recent non-trad-religious examples with similar dynamics