You can handle arbitrarily complex tasks with large parametric models trained with SGD. The problem is that doing it well requires a *dense sampling* of the input/output space you're learning, because the generalization power of these models is extremely weak. That's expensive. https://t.co/Lsc7zlQBFE
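The "dense sampling" point can be illustrated with a toy stand-in: a model that can only interpolate between the examples it was trained on. All names and numbers below are illustrative, not anything from the thread itself.

```python
import math

def interpolate(xs, ys, x):
    # Piecewise-linear interpolation: a stand-in for a learner that
    # can only interpolate between seen input/output pairs.
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] + t * (ys[i + 1] - ys[i])
    return ys[-1]

def max_error(n_samples):
    # Sample the target function f(x) = sin(x) at n points on [0, 2*pi],
    # then measure worst-case error on a fine test grid.
    f = math.sin
    xs = [2 * math.pi * i / (n_samples - 1) for i in range(n_samples)]
    ys = [f(x) for x in xs]
    test = [2 * math.pi * i / 999 for i in range(1000)]
    return max(abs(interpolate(xs, ys, x) - f(x)) for x in test)

dense_err = max_error(100)  # dense sampling: near-perfect fit
sparse_err = max_error(4)   # sparse sampling: large errors between points
```

With a dense sample the worst-case error is tiny; with a sparse one it is large, because nothing in the model "understands" the function well enough to fill the gaps. That is the expense being described: accuracy comes from coverage, not generalization.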
Schematically, intelligence is skill divided by experience (I = S/E). Deep learning enables arbitrarily high skill levels, but requires insanely high amounts of "experience" (data) to achieve these levels, resulting in an extremely low intelligence factor.
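The schematic ratio can be written out directly. The numbers below are made up purely to illustrate the argument's shape:

```python
def intelligence(skill, experience):
    # Schematic ratio from the thread: I = S / E.
    return skill / experience

# Illustrative (made-up) numbers: a deep learning model reaching very
# high skill from millions of training examples, vs a human reaching
# decent skill from a handful of examples.
dl_model = intelligence(skill=0.95, experience=1_000_000)
human = intelligence(skill=0.80, experience=50)
```

Even with lower absolute skill, the human's ratio comes out orders of magnitude higher, which is the "extremely low intelligence factor" being claimed for deep learning.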
Again, a DL model requires a dense sampling of what it's doing. An intelligent agent (like a human) can do extreme generalization from little data. At this time, no one has any clue how that works. However, it may not necessarily be very complicated. Who knows...
New conversation
You have strong opinions against AGI as it is commonly understood, but I respect your take because you obviously know what you're talking about.

What's the commonly understood AGI? Superhuman performance?
New conversation
I think the answer is a combination of models, where each is specialised at a specific task but is also part of a 'collective' of models and contributes towards other compute jobs.