I wonder what happens if you train an RL agent with a simple visual system on this task? Might it not be that we get a form of generalisation that produces a "numerosity" sense that signals strongly outside the training range? Might be a good weekend project to test...
I would strongly doubt positive results if you train it without a specific output for 0 (even if you explicitly add a 0 label with no examples), or without an architecture specifically designed to count and trained jointly on these dots, with all the size variations.
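A minimal sketch of what that weekend project might look like, using a small supervised CNN regressor in PyTorch as a stand-in for the RL agent proposed above. Everything here (the stimulus generator, network, hyperparameters, and the choice to hold 0 out of training) is an assumption for illustration, not anything taken from the thread or the original bee study.

```python
# Hypothetical sketch: supervised regression stand-in for the proposed RL setup.
# Train a tiny CNN to estimate the number of dots in synthetic images (counts 1-5),
# then probe it on counts it never saw (0 and 6-8) to see whether any learned
# "numerosity" signal extrapolates outside the training range.
import numpy as np
import torch
import torch.nn as nn

def make_image(n_dots, size=32, rng=np.random):
    """Render n_dots filled squares of random size at random positions."""
    img = np.zeros((size, size), dtype=np.float32)
    for _ in range(n_dots):
        r = rng.randint(1, 4)                      # dot size varies, echoing the stimuli
        y, x = rng.randint(r, size - r, size=2)
        img[y - r:y + r, x - r:x + r] = 1.0
    return img

def make_batch(counts, batch=64):
    """Sample a batch of images with counts drawn from the given set."""
    ns = np.random.choice(counts, size=batch)
    x = torch.from_numpy(np.stack([make_image(n) for n in ns])).unsqueeze(1)
    y = torch.from_numpy(ns.astype(np.float32))
    return x, y

model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

train_counts = [1, 2, 3, 4, 5]                     # note: no 0 (and nothing above 5) in training
for step in range(2000):
    x, y = make_batch(train_counts)
    loss = nn.functional.mse_loss(model(x).squeeze(1), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Probe the learned numerosity estimate outside the training range.
model.eval()
with torch.no_grad():
    for n in range(9):                             # 0 and 6-8 were never seen in training
        x = torch.from_numpy(np.stack([make_image(n) for _ in range(256)])).unsqueeze(1)
        print(f"true count {n}: mean prediction {model(x).squeeze(1).mean().item():.2f}")
```

If the skepticism in the reply above is right, the empty display should yield an arbitrary or unstable output rather than a clean extrapolation toward zero; if the optimistic reading holds, the prediction should keep dropping as the count goes to 0.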
I would rather phrase it as: bees understand the concept of nothing, not "0" per se.
Hmmm ... I’m not sure I understand the concept of zero.
Gary -- Is that even possible? Humans, with their big brains, understood zero only as recently as 300 A.D.