The key thing David Marr proposed was a structured hierarchy in classification. It is harder to get working than the simplified form that became Deep Learning, but ultimately far more powerful and efficient.
Exactly what I wanted to say. #AI misses him; he sadly passed at 35 years of age.
I think our best success is going to come from making very good simulations of reality, then training our nets on them, since we'll be able to classify everything in the simulation. And thus the map becomes the territory and the AI is in the box.
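The appeal of training in simulation can be sketched in miniature (everything below is a made-up toy, not anyone's actual pipeline): because we control the simulator's generative process, every sample comes with a perfect label, and a model trained purely on simulated data can still hold up on a mildly shifted "real" distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Simulation": we control the generative process, so every sample
# comes with a perfect label -- the point of training in a simulator.
def simulate(n, shift=0.0):
    labels = rng.integers(0, 2, size=n)
    centers = (2.0 * labels - 1.0 + shift)[:, None]  # per-sample class center
    x = rng.normal(loc=centers, scale=0.7, size=(n, 2))
    return x, labels

# Plain logistic regression trained by gradient descent.
def train_logreg(x, y, lr=0.1, steps=500):
    w = np.zeros(x.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(x @ w + b)))  # sigmoid
        g = p - y                               # gradient of log-loss
        w -= lr * (x.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

# Train entirely inside the "simulation" ...
x_sim, y_sim = simulate(2000)
w, b = train_logreg(x_sim, y_sim)

# ... then evaluate on a mildly shifted "real world".
x_real, y_real = simulate(1000, shift=0.3)
acc = (((x_real @ w + b) > 0).astype(int) == y_real).mean()
print(f"accuracy on shifted 'real' data: {acc:.2f}")
```

The sketch also hints at the catch: the larger the gap between simulator and reality (the `shift` here), the more the "perfectly labeled" training signal misleads the model.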
Brilliant point. This paper shows how NNs perform way worse than humans at classification when the images are degraded. https://arxiv.org/pdf/1705.02498.pdf
This reminded me of a small experience I had while preparing a training session on the MNIST dataset with my 3-year-old son next to me: the NN failed at guessing inverted and 90-degree-rotated images, while my son guessed every single one of them.
By default we store visual representations in a symmetry-invariant way; letter orientation is something we have to learn, typically between ages 5 and 7. It's common for 5-year-olds to confuse d/b, write inverted letters, or even write in full mirror writing.
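The failure mode in that anecdote is easy to demonstrate: a raw-pixel representation is not rotation- or mirror-invariant, so a rotated glyph looks almost nothing like the original to a model that compares pixels. A minimal numpy sketch (the tiny "7"-like glyph is made up for illustration):

```python
import numpy as np

# A tiny asymmetric glyph on an 8x8 grid (made up for illustration):
# a top bar plus a descending stroke, vaguely like a "7".
img = np.zeros((8, 8))
img[1, 1:7] = 1.0   # top bar
img[2:7, 5] = 1.0   # descending stroke

def cosine(a, b):
    """Cosine similarity between two images in raw pixel space."""
    a, b = a.ravel(), b.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rotated = np.rot90(img)    # 90-degree rotation
mirrored = img[:, ::-1]    # left-right mirror

# To any model working on raw pixels (with no rotation augmentation),
# the rotated glyph is nearly a different pattern, while a human child
# reads all three as the same character.
print(f"self:     {cosine(img, img):.2f}")
print(f"rotated:  {cosine(img, rotated):.2f}")
print(f"mirrored: {cosine(img, mirrored):.2f}")
```

This is also why MNIST pipelines that never saw rotated digits in training fall over on them, while augmentation or architectures with built-in symmetry handling do not.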
This is fundamental. Style transfer is always the right answer.
Is this what Jeff Hawkins is espousing? (Hierarchical Temporal Memory)
(Words and opinions from elite engineers like you are, every now and then, inspiring.)