Ugh major typos above
The philosophical principle is sound at $1000, sketchy at $1m, sketchy at $100m, and relish at $1b.
should have read:
The philosophical principle is sound at $1000, sketchy at $1m, very sketchy at $100m, and religion at $1b.
I’ve supported people in making big money decisions but have not myself ever bought anything bigger than a car. That’s borderline between object-level and theorized. We’ve been shopping for a house for the first time and it feels clearly like “buying a full-stack theory of life”
I’ve seen singularitarians express an astonishing sort of worry: that “obviously” the highest-leverage kind of future-utility-maxxing EA giving is to AI risk. That seems a little too easy (afaict this is why this crowd loves EA like PB&J)
Really? Ya think?
Fun math problem of the sort they’re actually geniuses at but never seem to do. If your theory of “Spend $X on Y” rests on 7 layers of abstraction, and you’re 90% sure your thinking at each level is sound, what are the chances you’ll reach the right conclusion?
0.9^7 ≈ 0.48.
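(A minimal sketch of that compounding arithmetic, in Python; the function name is just illustrative, not anything from the thread.)

# Chance of a sound conclusion when a spending theory rests on n
# stacked layers of abstraction, each sound with probability p.
def chance_right(p, n):
    return p ** n

print(chance_right(0.9, 7))  # ~0.478, i.e. roughly a coin flip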
This sort of thing has long been my main critique of wealth inequality. It’s not really a critique of EA in particular, but of *any* single theory that an org proportionate in size to log(wealth) must embody to deploy wealth.
Large wealth concentrations produce stupidity at scale, *whatever* the theory and purpose of deployment. The most “effective” thing you can do is fragment it to the point it’s not quite as dumb. Unless the thing itself requires concentration, like a space program.
When people say they want “market-based” solutions to problems instead of “massive” state programs, the underlying intuition is not about markets so much as it’s about the maximum scale of deployment an individual or closed org gets to undertake without orders from “above”
A “market-based” solution which leads to a huge corporation spending a $1b government order via internal hierarchical decision-making is actually worse than a $1b government program that’s deployed as 40 $250k grants to smaller agencies. The latter is actually more market-like.
Of course this is not always possible. Not all problems can be partitioned this way. If you want to allocate $1b to a space program, giving 40 cities $250m to start 40 space programs is dumb. The problem requires concentration. But within physics constraints, unbundle the spend.
Heh, sorry, but this ironically illustrates the point about errors creeping in with abstraction. 1b/250k is 4000, not 40. Plus I typoed it elsewhere as 250m (which would be 4)
Quote Tweet (replying to @vgr):
this thread is genuinely great, but in consecutive tweets i think you asserted that 1B/40 = 250m and 250k. Unfortunately neither of those is right
(unless I'm missing that being part of the commentary)
I promise if someone gives me 1b to deploy, I’ll use an excel spreadsheet to do the arithmetic properly and hire an intern to crosscheck it for decimal point and order of magnitude errors
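(A minimal sketch of that crosscheck, in Python rather than a spreadsheet; it just redoes the division from the earlier tweets.)

# Crosscheck the grant-splitting arithmetic from the thread.
total = 1_000_000_000            # the $1b to deploy

print(total / 250_000)           # $250k grants -> 4000.0, not 40
print(total / 250_000_000)       # $250m grants -> 4.0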

