How well can a machine learn to classify digits if it has just a single example of each?
Or, if we're feeling generous, say 5 or 10 examples of each?
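The question can be made concrete with a simple baseline. A minimal sketch (my own illustration, not anything from the thread's linked work): a nearest-centroid classifier, which is about the weakest sensible approach to one-shot or few-shot classification — store the handful of examples per class and assign a query to the class whose mean example is closest.

```python
import numpy as np

def one_shot_classify(support, query):
    """Nearest-centroid one/few-shot classifier.

    support: dict mapping label -> (k, d) array of example vectors
             (k = 1 for true one-shot; 5 or 10 for few-shot).
    query:   (d,) vector to classify.
    Returns the label whose mean support example is closest
    to the query in Euclidean distance.
    """
    best_label, best_dist = None, float("inf")
    for label, examples in support.items():
        centroid = examples.mean(axis=0)
        dist = np.linalg.norm(query - centroid)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Toy demo with 2-D stand-ins for digit images: one example per class.
support = {0: np.array([[0.0, 0.0]]),
           1: np.array([[1.0, 1.0]])}
print(one_shot_classify(support, np.array([0.9, 0.8])))  # → 1
```

On raw pixels this does poorly; the interesting question in the thread is how much better a learned representation (transfer learning, metric learning) makes the same nearest-neighbour step.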
Interesting. I'm not sure I understood what the first video was showing; I should perhaps watch the second!
Was it done using conditional GANs, like pix2pix? Come to think of it, it'd be amusing if it could be done with CycleGANs!
It seems very likely a cGAN would work. A CycleGAN might be more troublesome - it'd be hard to invert the transform. But I'll bet something similar would work.
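The conditioning mechanism the cGAN reply alludes to can be sketched in a few lines (my own minimal illustration of how conditioning works in general, not code from pix2pix): both the generator and the discriminator receive the class label alongside their usual input, most simply by concatenating a one-hot label vector.

```python
import numpy as np

rng = np.random.default_rng(0)

def one_hot(label, n_classes=10):
    """One-hot encoding of a digit label."""
    v = np.zeros(n_classes)
    v[label] = 1.0
    return v

def generator_input(noise_dim, label, n_classes=10):
    """cGAN generator input: random noise concatenated with the class label,
    so the generator can produce a sample of the requested class."""
    z = rng.standard_normal(noise_dim)
    return np.concatenate([z, one_hot(label, n_classes)])

def discriminator_input(image_vec, label, n_classes=10):
    """cGAN discriminator input: image features plus the same label,
    so it judges real/fake *for that class*, not in general."""
    return np.concatenate([image_vec, one_hot(label, n_classes)])

g_in = generator_input(noise_dim=64, label=4)
print(g_in.shape)  # (74,)
```

The CycleGAN worry in the reply is different in kind: CycleGAN learns unpaired mappings constrained by a cycle-consistency loss, so both directions of the transform must be learnable, which is what makes inverting a hard transform troublesome.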
I think there are some benchmarks of human performance on 'alien' alphabets somewhere in the zero-shot transfer learning (e.g. Omniglot) and Bayesian program synthesis literature.
Any keywords that might help me find them? Tried "human learning alien alphabets bayesian program [/ zero-shot]" and a couple of variants in Scholar, got nothing promising.
Interesting! I would guess humans have learned circles O and line orientation | \ — / before learning digits
Yup - there's a section on transfer learning.
If I were to *draw* a "4" for a child, would she learn it faster than if I were to point at one?
Good question - I'll bet the answer is often yes.