I almost tweeted this last night, but putting AI on a list of failed predictions seems not-even-wrong to me. blog.samaltman.com/technology-pre
When people said things like "heavier than air flight is impossible," that was the negation of a well-posed, falsifiable proposition.
So treating anti-Scary-AI positions as equivalent to denying heavier-than-air flight is a category error. You can't negate an ill-posed idea.
The right list on which to put "Scary AI" is a list of failed apocalyptic doomsday cult predictions, not failed tech predictions.
Read part of Greer's "Apocalypse Not" the other day: a good catalog of that type of error, with continuity from the medieval period to 2012.
That's Matt Ridley; Greer is on the other side of such things, I believe. He is skeptical of apocalyptic stuff for other reasons.
For now anyway. Plus, as in his recent NYT piece, we don't even know what real AI looks like. mobile.nytimes.com/blogs/opiniona
Which list it goes on is less important than what it would mean if it succeeded. There would be no more lists.



