The problem with #machinelearning in a nutshell, via extensional vs intensional logic:
A: {2, 4, 6, 8, 10, ..., X}
B: {2x: x ∈ N}
There is no way for a statistical, asymbolic machine to arrive at B from A, no matter how large you choose X @GaryMarcus @filippie509 #ai
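To make the contrast concrete, here is a minimal Python sketch (the cutoff X = 1,000,000 is an arbitrary choice for illustration): A is an extensional definition, a finite enumeration that eventually runs out, while B is an intensional definition, a rule over a variable that decides membership for every n.

```python
# Minimal sketch; X is an arbitrary even cutoff chosen for the example.
X = 1_000_000
A = set(range(2, X + 1, 2))   # extensional: the finite list {2, 4, ..., X}

def B(n):
    return n % 2 == 0         # intensional: the rule {2x : x in N}

n = X + 2                     # one step past the enumeration
print(n in A)                 # False -- the extension runs out at X
print(B(n))                   # True  -- the rule still applies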
-
This isn't a problem with machine learning, but rather with low-bias approximators. They fit the data well and can interpolate, but they have no mathematical reason to extrapolate. A lot of work, including in deep learning, deals with achieving better extrapolation by introducing bias.
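A small sketch of the point, assuming numpy, with a 1-nearest-neighbour regressor standing in for a generic low-bias approximator (my choice of example, not @egrefen's): it reproduces the training data perfectly and interpolates reasonably, but off the training range it can only repeat the nearest stored case, while the linearly biased fit, whose hypothesis class happens to contain the true rule, extrapolates correctly.

```python
import numpy as np

# Illustrative sketch only; 1-NN stands in for a generic low-bias approximator.
x_train = np.arange(1.0, 11.0)   # inputs 1..10
y_train = 2.0 * x_train          # targets: the doubling rule

def nn_predict(x):
    # Predict with the single nearest training point (1-nearest-neighbour).
    return y_train[np.argmin(np.abs(x_train - x))]

# The same data seen through a strong bias: the class {a*x + b} contains 2x.
a, b = np.polyfit(x_train, y_train, deg=1)

print(nn_predict(5.4))    # 10.0 -- close to the true 10.8 inside the range
print(nn_predict(100.0))  # 20.0 -- extrapolation collapses to the edge case
print(a * 100.0 + b)      # ~200.0 -- the right bias extrapolates
```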
-
Agree with @egrefen; the intension/extension issue arises with a particular class of machine learning approaches that happens to be very popular, and is not intrinsic to ML in principle. This is why I keep lobbying for hybrid models that start with operations over variables.
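As a toy illustration of what operations over variables buy (my sketch, not the hybrid architecture Marcus is proposing): if the hypotheses are rules over a variable x, then whichever rule is consistent with the finite sample applies unchanged to inputs far beyond it.

```python
# Toy sketch, not Marcus's actual proposal: hypotheses are rules over a
# variable x, so a rule consistent with the sample extrapolates for free.
candidates = {
    "x + 1":  lambda x: x + 1,
    "2 * x":  lambda x: 2 * x,
    "x ** 2": lambda x: x ** 2,
}

examples = [(1, 2), (2, 4), (3, 6), (4, 8)]  # the sequence as (x, 2x) pairs

for name, rule in candidates.items():
    if all(rule(x) == y for x, y in examples):
        print(name, "generalises:", rule(1_000))  # prints: 2 * x generalises: 2000
```
-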
Replying to @GaryMarcus and @egrefen
This was the central point of chapter 3 of The Algebraic Mind.
4:56 AM - 1 Jun 2018