I almost tweeted this last night, but putting AI on a list of failed predictions seems not-even-wrong to me. blog.samaltman.com/technology-pre
Replying to
When people said things like "heavier-than-air flight is impossible," that was the negation of a well-posed, falsifiable proposition.
Replying to
So treating anti-Scary-AI positions as equivalent to denying heavier-than-air flight is a category error. You can't negate an ill-posed idea.
Replying to
The right list on which to put "Scary AI" is a list of failed apocalyptic doomsday cult predictions, not failed tech predictions.
Replying to
Read part of Greer's "Apocalypse Not" the other day: a good catalog of that type of error, w/ continuity from the medieval period to 2012.
Replying to
That's Matt Ridley; Greer is on the other side of such things, I believe. He is skeptical of apocalyptic stuff for other reasons.
Replying to
Oh, just saw Ridley's Wired article now. I meant the same-title book, ISBN 978-1-936740-00-0.
Replying to
I, OTOH, had only read Greer's blog post, so I didn't realize there was a book and a namespace collision.

