My own recent attempt to develop a “chemistry, not alchemy” perspective
As usual I’ve been too wordy. This is the essential point. Clever people overestimating the importance of cleverness in the grand scheme of things.
The many sad or unimpressive life stories of Guinness-record IQ types illustrate that intelligence has diminishing returns even in our own environment. If you think you'd be 2x more successful if you were 2x smarter, you might be disappointed.
It's a bit like me being good at 2x2s and worrying that somebody will discover the ultimate world-destroying 2x2. Except most strengths don't tempt you into such conceits or narcissistic projections.
Intelligence, physical strength, adversarial cunning, and beauty are among the few that do tempt people this way. Because they are totalizing aesthetic lenses on the world. When you have one of these hammers in your hand, everything looks like a nail.
I think that AGI is a useful concept for thinking about AI as long as human intelligence is evidently more G than any AI's. Which it still is.
The 'G' part is not just having skill at a lot of different things, but also being able to fluidly decide which skill to use right now.
Humans kick the ass of every single AI in being G, in this sense. Truly, humans are OGs.
This is why "AGI" is an aspiration for AI research
Once AIs are comparably G, the distinction becomes less interesting, since it will then be a case of better at some things, worse at others.
semantic ape uses Turing complete ape programming language to force other ape minds to emulate doubt