Does it matter that we only have a retrospective theory of 18-month-old inference? What if there's less "sense" than we think; e.g. it's all statistical learning plus physiological priors, and the "common" just falls out of a restricted state space? Would that still be a good enough aspiration for AI?
My strong intuition is that that will never work. We have run a massive multi-billion-dollar test of that perspective over the last decade, with more compute than was ever available before, and it really has not led to machines that can understand even basic sequences of events.
And that's what Tenenbaum is trying to achieve with his inference over programs and game engines. Really cool work, and not DL.
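To make "inference over programs" concrete, here is a minimal sketch (not Tenenbaum's actual code, and the program names and observed sequences are invented for illustration): posit a small hypothesis space of candidate programs, score each by whether it reproduces an observed sequence, and compute a posterior over which program generated the data.

```python
from fractions import Fraction

# Hypothetical candidate programs: each maps a step index n to a value.
PROGRAMS = {
    "constant_0": lambda n: 0,
    "identity":   lambda n: n,
    "double":     lambda n: 2 * n,
    "square":     lambda n: n * n,
}

def posterior(observed):
    """Uniform prior over programs; likelihood 1 if a program reproduces
    every observation exactly, else 0. Returns the normalized posterior."""
    prior = Fraction(1, len(PROGRAMS))
    scores = {}
    for name, prog in PROGRAMS.items():
        fits = all(prog(i) == y for i, y in enumerate(observed))
        scores[name] = prior * (1 if fits else 0)
    total = sum(scores.values())
    if total == 0:
        # No candidate program explains the data.
        return {name: Fraction(0) for name in scores}
    return {name: s / total for name, s in scores.items()}

if __name__ == "__main__":
    # Two observations are ambiguous: "identity" and "square" each get 1/2.
    print(posterior([0, 1]))
    # One more observation singles out "square".
    print(posterior([0, 1, 4]))
```

The same pattern scales up in the program-induction literature by replacing the enumerated lambdas with a generative grammar over programs (or a game-engine simulator) and the 0/1 likelihood with a noisy observation model.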