The alternative explanation is less flattering. You’re a specialized being adapted to a specialized situation that is transient on a cosmic scale but longer than your lifespan. But it is easy and tempting to mistake a steady local co-evolution gradient (Flynn effect anyone?) for Destiny.
I’m frankly waiting for a different kind of Singularity. One comparable to chemistry forking off from alchemy because it no longer needed the psychospiritual scaffolding of transmutation to gold or elixir of life to think about chemical reactions.
I’m glad this subculture inspired a few talented people to build interesting bleeding-edge systems at OpenAI and DeepMind. But the alchemy is starting to obscure the chemistry now.
My own recent attempt to develop a “chemistry, not alchemy” perspective
As usual I’ve been too wordy. This is the essential point. Clever people overestimating the importance of cleverness in the grand scheme of things.
The many sad or unimpressive life stories of Guinness-record IQ types illustrate that intelligence has diminishing returns even in our own environment. If you think you’d be 2x more successful if you were 2x smarter, you might be disappointed.
It’s a bit like me being good at 2x2s and worrying that somebody will discover the ultimate world-destroying 2x2. Except most strengths don’t tempt you into such conceits or narcissistic projections.
Intelligence, physical strength, adversarial cunning, and beauty are among the few that do tempt people this way. Because they are totalizing aesthetic lenses on the world. When you have one of these hammers in your hand, everything looks like a nail.
I think that AGI is a useful concept for thinking about AI as long as human intelligence is evidently more G than any AI is. Which it still is.
The 'G' part is not just having skill at a lot of different things, but also being able to fluidly decide which skill to use right now.
Humans kick the ass of every single AI in being G, in this sense. Truly, humans are OGs.
This is why "AGI" is an aspiration for AI research.
Once AIs are comparably G, it becomes a less interesting distinction, since it will then be a case of better at some things, worse at others.
Leave aside congratulations and valorization. There are capabilities that it would be extremely *useful* for an AI or robot to have, and a lot of those seem to require more G than we can figure out how to program into our AIs.
E.g. the robot butler I can't yet have.