Conversation

The nominal materialism of strong AI types obscures a backdoor dualism introduced through semantic games. Because "general intelligence" is a sneaky, ineffable "engineering feature" of matter rather than an honest ontic primitive like "mind" or "soul," we're supposed to buy in.
If you take away the mystical claims of "generality" (a motte-and-bailey move away from more technical notions of generalizability), you have a powerful machine that can do many things well and sometimes exploit cross-problem mathematizable regularities.
It's more than a Swiss Army knife, and Turing completeness is a genuine categorical level up, but "intelligence" is a terrible way to frame that level up.
For a while, it looked like "machine learning" (a far superior term) was going to take over, but "AI" is making a comeback, with "AGI" riding its coat-tails. I prefer Karpathy's "software 2.0," which has its own problems, but less serious ones.
Replying to
Machine learning made sense when every model was tasked with a narrow problem to solve. Once the problems get more and more general, it's a poorer fit. We now have models where bounding the generality of what they can do is nearly impossible. What do you call that?