the Big News there is fucking laughable: "we explored a well-defined search space algorithmically rather than using human habits in it as a baseline" — that is the least interesting thing to ever happen in AI
i won't know until i look into it whether the ~recurrent nets~ that are all the rage mean anything beyond the 101-level shit they sound like
-
-
well, 301, obviously, they don't exactly cover connectionism in 101s
-
hmm, looking into those, only recurrent seems at all surprising to me, and that only because I can see that 'only forward' simplifies what might otherwise become a mess-o-perceptrons
-
recurrence smacks of synthetic a priori knowledge, but who knows, maybe Kant wasn't such an asshole after all. NARRATOR: he was
-
as I read it recurrence is just letting the net feed back into itself, or am I missing something there?
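That reading is right at the basic level: a feed-forward layer only passes activations onward, while a recurrent layer also feeds its own previous output back in as input. A minimal sketch of that loop, in pure Python (all names, weights, and the single-unit simplification are illustrative, not any particular library's API):

```python
import math

def rnn_step(x, h_prev, w_x, w_h, b):
    """One recurrent update for a single hidden unit:
    the new state depends on the current input AND the previous state."""
    return math.tanh(w_x * x + w_h * h_prev + b)

# Feed a short input sequence through the loop; the hidden state h
# carries information from earlier inputs forward in time.
h = 0.0
for x in [1.0, 0.5, -1.0]:
    h = rnn_step(x, h, w_x=0.8, w_h=0.5, b=0.0)
```

Dropping the `w_h * h_prev` term turns this back into an ordinary feed-forward unit, which is the sense in which "only forward" is the simpler special case.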
-
that's the 301-level meaning of it, yeah; i've been wondering whether the people raving about their wonders are talking about anything beyond that or not
-
seems to me that, so long as we can't quite resolve that issue, they're on the gravy train for life!