But as a skeptic, you wonder… if this civilization didn’t have this pattern, these people wouldn’t be around worrying about superintelligence. Some other group would be. Top dogs always fight imaginary gods.
No accident that the same crowd is also most interested in living forever. A self-perpetuation drive shapes this entire thought space.
This world is your oyster, you’re winning unreasonably easily and feel special. You want it to continue. You imagine going from temporarily successful human to permanently successful superhuman. Attribution bias helps pick out variables to extrapolate and competitors to fear.
The alt explanation is less flattering. You’re a specialized being adapted to a specialized situation that is transient on a cosmic scale but longer than your lifespan. But it is easy and tempting to mistake a steady local co-evolution gradient (Flynn effect, anyone?) for Destiny.
I’m frankly waiting for a different kind of Singularity. One comparable to chemistry forking off from alchemy because it no longer needed the psychospiritual scaffolding of transmutation to gold or elixir of life to think about chemical reactions.
I’m glad this subculture inspired a few talented people to build interesting bleeding-edge systems at OpenAI and DeepMind. But the alchemy is starting to obscure the chemistry now.
My own recent attempt to develop a “chemistry, not alchemy” perspective
As usual I’ve been too wordy. This is the essential point: clever people overestimating the importance of cleverness in the grand scheme of things.
The many sad or unimpressive life stories of Guinness-record IQ types illustrate that intelligence has diminishing returns even in our own environment. If you think you’d be 2x more successful if you were 2x smarter, you might be disappointed.
It’s a bit like me being good at 2x2s and worrying that somebody will discover the ultimate world-destroying 2x2. Except most strengths don’t tempt you into such conceits or narcissistic projections.
I think that AGI is a useful concept for thinking about AI as long as human intelligence is evidently more G than any AI is. Which it still is.
The 'G' part is not just having skill at a lot of different things, but also being able to fluidly decide which skill to use right now.


