One way to think of a neural network is as a hashtable where the hashing function is locality-sensitive. It memorizes training inputs & targets, and is capable of successfully querying targets for test inputs that are very close to what it has already seen.
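The analogy above can be sketched in a few lines. Below is a hypothetical toy illustration (not actual neural-network code): a table that memorizes (input, target) pairs and answers a query with the target of the nearest stored input, i.e. lookup that is sensitive to locality rather than to exact keys.

```python
import numpy as np

# Toy "locality-sensitive hashtable": memorize (input, target) pairs,
# answer queries with the target of the nearest memorized input.
# A sketch of the analogy only -- names and structure are invented here.

class NearestNeighborTable:
    def __init__(self):
        self.keys, self.values = [], []

    def store(self, x, y):
        # "Training": memorize the pair verbatim.
        self.keys.append(np.asarray(x, dtype=float))
        self.values.append(y)

    def query(self, x):
        # "Inference": local generalization -- return the target of the
        # closest key, so nearby test inputs hit the same "bucket".
        dists = [np.linalg.norm(k - np.asarray(x, dtype=float))
                 for k in self.keys]
        return self.values[int(np.argmin(dists))]

table = NearestNeighborTable()
table.store([0.0, 0.0], "cat")
table.store([1.0, 1.0], "dog")
print(table.query([0.1, -0.05]))  # close to the first key -> "cat"
```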
-
Thinking of it as interpolation also comes to mind in that frame. Thinking of it as a hashtable would imply that the targets are fixed values, whereas NNs can provide continuous targets, wouldn't it?
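The interpolation point can be made concrete with a small tweak to the lookup idea: instead of returning one stored target, weight all stored targets by distance. This is a hypothetical sketch (a softmax-over-distances average, not any specific NN), showing how a "soft" table yields continuous outputs between its keys.

```python
import numpy as np

# Hypothetical "soft lookup": the output is a distance-weighted average of
# stored targets, so queries between keys give continuous values --
# interpolation rather than fixed hashtable buckets.

keys = np.array([[0.0], [1.0]])
targets = np.array([0.0, 10.0])

def soft_query(x, temperature=0.1):
    d = np.linalg.norm(keys - np.asarray(x, dtype=float), axis=1)
    w = np.exp(-d / temperature)   # closer keys get larger weights
    w /= w.sum()
    return float(w @ targets)

print(soft_query([0.5]))  # midway between the keys -> 5.0
```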
-
Clearly... BTW, on abstraction: I was considering the fact that embeddings like word2vec behave like a linear vector space by construction, but I couldn't find a theoretical proof of it... Do you know if there is such theoretical grounding beyond empirical evidence?
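The "linear vector space" behavior being asked about is the analogy-arithmetic property (king - man + woman ≈ queen). A minimal sketch of that property, using invented 2-d vectors rather than real learned embeddings; the linearity is largely an empirical observation about trained word2vec models, with only partial theoretical accounts (e.g. views of word2vec as implicit PMI-matrix factorization).

```python
import numpy as np

# Toy illustration of the analogy-arithmetic property of word2vec-style
# embeddings. These vectors are invented for illustration only:
# dimension 0 ~ "royalty", dimension 1 ~ "maleness".
vecs = {
    "king":  np.array([0.9, 0.8]),
    "queen": np.array([0.9, 0.1]),
    "man":   np.array([0.1, 0.8]),
    "woman": np.array([0.1, 0.1]),
}

# king - man + woman: remove "maleness", keep "royalty".
target = vecs["king"] - vecs["man"] + vecs["woman"]

def nearest(v, exclude):
    # Nearest stored word to v, excluding the query words themselves.
    return min((w for w in vecs if w not in exclude),
               key=lambda w: np.linalg.norm(vecs[w] - v))

print(nearest(target, exclude={"king", "man", "woman"}))  # -> queen
```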
-
And, as always, the question is whether our brains are any better than locally-generalized hashtables.
-
Even though I agree, I wonder if the next frontier hasn't been abstraction and reasoning for a long time already. I did some work on that, but most of the related work was from the 90s and early 2000s, and it drowned because they couldn't do the "hashtable" part right at the time; they just didn't know it.
-
Data is king, but "machine induction" is the prince...
-
Adversarial examples don't seem to fit into the locality-sensitive hashtable idea here.
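The tension raised here can be shown with a minimal sketch: for a linear classifier f(x) = w·x, a perturbation aligned with sign(w) (the direction used by gradient-sign attacks such as FGSM) can flip the prediction while barely moving the input. The nearest "memorized key" hasn't changed, yet the output has. The model and numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical linear "classifier": f(x) = w @ x, sign gives the class.
# In high dimension, a small per-coordinate nudge against the score
# accumulates across coordinates and flips the sign.
rng = np.random.default_rng(0)
d = 10_000
w = rng.choice([-1.0, 1.0], size=d)  # stand-in for learned weights
x = rng.normal(0.0, 1.0, size=d)

score = w @ x                        # |score| is typically ~sqrt(d)
eps = 0.05                           # per-coordinate budget
x_adv = x - eps * np.sign(score) * np.sign(w)  # push against the score

# w @ x_adv = score - eps * d * sign(score): the shift (eps*d = 500)
# dwarfs the original |score|, so the predicted class flips, even though
# x_adv is only a ~5% relative change to x.
print(np.sign(score), np.sign(w @ x_adv))
print(np.linalg.norm(x_adv - x) / np.linalg.norm(x))
```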
-
"In this light, the "intelligence" of the network comes purely from its training data." Brilliant!
-
Here's an example of a binary pattern game with very little training data that NNs just don't handle (and an example of human bias in pattern search): https://github.com/westurner/notebooks/blob/gh-pages/maths/binary-patterns_001.ipynb
-
Don't generative models go in this direction? Can't the generator be seen as a "world" model and used as a kind of simulator that can draw as many samples as you like?
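The "generator as simulator" idea can be sketched very simply. Here a plain Gaussian stands in for a learned generator (a GAN or VAE in practice): fit its parameters to observed data, then treat it as a world model that produces unlimited fresh samples. All data and parameters below are invented for illustration.

```python
import numpy as np

# "Train" a trivial generative model (a Gaussian standing in for a
# GAN/VAE generator) on observed data...
rng = np.random.default_rng(42)
real_data = rng.normal(loc=[2.0, -1.0], scale=0.5, size=(500, 2))

mu = real_data.mean(axis=0)
cov = np.cov(real_data, rowvar=False)

# ...then use it as a simulator: draw as much synthetic "experience"
# from the model as we like.
synthetic = rng.multivariate_normal(mu, cov, size=10_000)

print(mu.round(1))                      # close to the true [2.0, -1.0]
print(synthetic.mean(axis=0).round(1))  # synthetic data matches the fit
```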