The problem with #machinelearning in a nutshell, via extensional vs intensional logic:
A: {2, 4, 6, 8, 10, ... X}
B: {2x: x ∈ N}
There is no way for a statistical, asymbolic machine to arrive at B from A, no matter how large you choose X @GaryMarcus @filippie509 #ai
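The underdetermination point can be sketched concretely (a minimal illustration, not from the thread; all names are hypothetical): for any finite prefix of A, infinitely many rules agree with 2x on the observed elements yet diverge immediately after, so the sample alone cannot single out B.

```python
# Sketch: any finite sample of A = {2, 4, ..., 2n} is matched by rules
# other than doubling, so the sample underdetermines B = {2x : x in N}.

def doubling(x):
    """The intended rule behind B."""
    return 2 * x

def make_impostor(n, c=1):
    """A rule that equals doubling on x = 1..n but differs at x = n + 1."""
    def impostor(x):
        bump = c
        for i in range(1, n + 1):
            bump *= (x - i)  # zero at every observed point x = 1..n
        return 2 * x + bump
    return impostor

n = 5
g = make_impostor(n)
print([doubling(x) for x in range(1, n + 2)])  # [2, 4, 6, 8, 10, 12]
print([g(x) for x in range(1, n + 2)])         # [2, 4, 6, 8, 10, 132]
```

Choosing a different `c` gives a different impostor for the same sample, which is the sense in which no finite X settles the question.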
-
IIRC your paper points out that such arguments dilute the meaning of learning. I don't agree with this*, but I do agree that arguments that evolution learned it are non sequiturs and completely irrelevant. Humans don't start from blank slates; it doesn't matter what installed those biases
-
The question was whether ML can address the problem above. If evolution was able to learn these biases, why wouldn't ML be able to learn them? In any case, humans have them, and if you rule out that a higher being put them there, some form of learning created them.