"this is the least interesting thing to ever happen in AI" The gauntlet is laid down!!
-
-
hmm, looking into those, only recurrent seems at all surprising to me, and that only because I can see that 'only forward' simplifies what might otherwise become a mess-o-perceptrons
-
recurrence smacks of synthetic a priori knowledge but who knows, maybe Kant wasn't such an asshole after all. (NARRATOR: he was.)
-
as I read it recurrence is just letting the net feed back into itself, or am I missing something there?
-
that's the 301-level meaning of it, yeah; i've been wondering whether the people raving about their wonders are talking about anything beyond that or not
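[ed.: a minimal sketch of that "net feeding back into itself" reading, in numpy; all names and sizes here are hypothetical, not any particular library's API. The hidden state h is both an output of step t and an input to step t+1, and that really is the whole trick at this level.]

```python
import numpy as np

rng = np.random.default_rng(0)
W_in = rng.normal(size=(4, 3)) * 0.1   # input -> hidden weights
W_rec = rng.normal(size=(4, 4)) * 0.1  # hidden -> hidden weights (the feedback loop)

def rnn_step(h, x):
    # the new hidden state depends on the previous hidden state: recurrence
    return np.tanh(W_in @ x + W_rec @ h)

h = np.zeros(4)
for x in [np.ones(3), np.zeros(3), np.ones(3)]:  # a toy 3-step sequence
    h = rnn_step(h, x)
print(h.shape)
```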
-
seems to me that, so long as we can't quite resolve that issue, they're on the gravy train for life!
End of conversation
New conversation -
-
-
I can't quite make out what convolution nets are really, the description on WP reads to me like... just neural nets.
-
idk, some kind of choose-your-own-perceptron shit i guess
-
hmm... 'choose your own adventure' books whose players are, unknowingly, nodes in a distributed Chinese room.
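[ed.: for what it's worth, the thing that makes a conv net not "just neural nets" is weight sharing: one small kernel is slid across every position of the input, instead of every output unit getting its own full weight vector. A minimal 1-D sketch, with illustrative names only:]

```python
import numpy as np

def conv1d(signal, kernel):
    # the SAME kernel is reused at every window position (weight sharing);
    # a plain dense layer would have separate weights per output unit
    k = len(kernel)
    return np.array([signal[i:i + k] @ kernel
                     for i in range(len(signal) - k + 1)])

out = conv1d(np.array([1.0, 2.0, 3.0, 4.0, 5.0]), np.array([1.0, -1.0]))
print(out)  # each output is a local difference of neighbours
```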
End of conversation