the Big News there is fucking laughable: "we explored a well-defined search space algorithmically rather than using human habits in it as a baseline" is the least interesting thing to ever happen in AI
-
well, go on then?
-
these "recurrent" and "convolutional" nets seem like they might have some technique-level juice to them
-
i won't know until i find out whether the ~recurrent nets~ that are all the rage mean anything beyond the 101-level shit they sound like
-
well, 301, obviously, they don't exactly cover connectionism in 101s
-
hmm, looking into those, only recurrent seems at all surprising to me, and that only because I can see that 'only forward' simplifies what might otherwise become a mess-o-perceptrons
-
recurrence smacks of synthetic a priori knowledge, but who knows, maybe Kant wasn't such an asshole after all (NARRATOR: he was)
-
as I read it, recurrence is just letting the net feed back into itself, or am I missing something there?
-
that's the 301-level meaning of it, yeah; i've been wondering whether people raving about their wonders are talking about anything beyond that or not
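-

for the record, here's that 301-level meaning as one minimal sketch: a hypothetical, textbook Elman-style recurrent step in numpy. the names and shapes (`W_in`, `W_rec`, `n_hidden`) are illustrative assumptions, not anything from the thread; the only point is the hidden state feeding back into the net at every step.

```python
import numpy as np

# hypothetical shapes/names; a textbook Elman-style recurrence,
# not any particular paper's architecture
rng = np.random.default_rng(0)
n_in, n_hidden = 4, 8

W_in = rng.normal(size=(n_hidden, n_in)) * 0.1       # input -> hidden
W_rec = rng.normal(size=(n_hidden, n_hidden)) * 0.1  # hidden -> hidden: the feedback loop
b = np.zeros(n_hidden)

def rnn_step(x, h_prev):
    """one timestep: the new hidden state depends on the previous one"""
    return np.tanh(W_in @ x + W_rec @ h_prev + b)

h = np.zeros(n_hidden)                # initial state
for x in rng.normal(size=(5, n_in)):  # a 5-step input sequence
    h = rnn_step(x, h)                # h is fed back in at every step

print(h.shape)  # (8,) -- one state vector summarizing the whole sequence
```

drop the `W_rec @ h_prev` term and you're back to the plain 'only forward' case from upthread.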