Circular like the anthropic principle. You notice that Earth is optimized to sustain human life. Your first thought is that a God created this place just for us. Then you have the more sophisticated thought that if it weren’t Goldilocks-optimal, we wouldn’t be around to wonder why…
But notice that the first thought posits a specific kind of extrapolation: an egocentric extrapolation. “God” is not a random construct but an extrapolation of an egocentric self-image as the “cause” of the Goldilocks zone.
The second thought makes it unnecessary to posit that.
Flip it around to be teleological. In this case, a certain class of people does well in a pattern of civilization. If you assume that pattern is eternal, that class of people suggests evolution toward an alluring, god-like omega point, and a worry that machines will get there first.
But as a skeptic, you wonder… if this civilization didn’t have this pattern, these people wouldn’t be around worrying about superintelligence. Some other group would be. Top dogs always fight imaginary gods.
No accident that the same crowd is also most interested in living forever. A self-perpetuation drive shapes this entire thought space.
This world is your oyster; you’re winning unreasonably easily and feel special. You want it to continue. You imagine going from temporarily successful human to permanently successful superhuman. Attribution bias helps you pick out variables to extrapolate and competitors to fear.
The alt explanation is less flattering. You’re a specialized being adapted to a specialized situation, one that is transient on a cosmic scale but longer than your lifespan. But it is easy and tempting to mistake a steady local co-evolution gradient (Flynn effect, anyone?) for Destiny.
I’m frankly waiting for a different kind of Singularity: one comparable to chemistry forking off from alchemy because it no longer needed the psychospiritual scaffolding of transmutation into gold or the elixir of life to think about chemical reactions.
I’m glad this subculture inspired a few talented people to build interesting bleeding-edge systems at OpenAI and DeepMind. But the alchemy is starting to obscure the chemistry now.
My own recent attempt to develop a “chemistry, not alchemy” perspective
As usual, I’ve been too wordy. This is the essential point: clever people overestimating the importance of cleverness in the grand scheme of things.
The many sad or unimpressive life stories of Guinness-record IQ types illustrate that intelligence has diminishing returns even in our own environment. If you think you’d be 2x more successful if you were 2x smarter, you might be disappointed.
It’s a bit like me being good at 2x2s and worrying that somebody will discover the ultimate world-destroying 2x2. Except most strengths don’t tempt you into such conceits or narcissistic projections.
Intelligence, physical strength, adversarial cunning, and beauty are among the few that do tempt people this way. Because they are totalizing aesthetic lenses on the world. When you have one of these hammers in your hand, everything looks like a nail.