The Big News there is fucking laughable: "we explored a well-defined search space algorithmically rather than using human play in it as a baseline." That is the least interesting thing to ever happen in AI.
But graph search is really powerful technology, so why not. Too bad about all the damn memory reallocation.
-
I mean, is it really radical to say "we've got a machine that can look more levels into chess/go because we've got faster hardware and more memory to hold a search tree"?
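(A minimal sketch of the machinery in question: depth-limited minimax over a game tree, where faster hardware and more memory buy you a bigger depth. The children and evaluate functions are hypothetical stand-ins for a real engine's move generator and static position scorer.)

    # Depth-limited minimax: "looking N levels ahead" is this with a bigger
    # depth, which costs exponentially more time and tree memory.
    def minimax(state, depth, maximizing, children, evaluate):
        kids = children(state)
        if depth == 0 or not kids:
            return evaluate(state)  # score positions at the search horizon
        scores = (minimax(k, depth - 1, not maximizing, children, evaluate)
                  for k in kids)
        return max(scores) if maximizing else min(scores)

    # Toy usage: states are numbers, each move adds 1 or 2, higher is better.
    best = minimax(0, 3, True,
                   children=lambda s: [s + 1, s + 2] if s < 5 else [],
                   evaluate=lambda s: s)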
-
Oh, totally not, but that isn't all that's happening.
-
Well, go on then?
-
these "recurrent" and "convolutional" nets seem like they might have some technique-level juice to them
-
I won't know until I find out whether the ~recurrent nets~ that are all the rage mean anything beyond the 101-level shit they sound like.
-
Well, 301, obviously; they don't exactly cover connectionism in 101s.
-
Hmm, looking into those, only recurrent seems at all surprising to me, and that only because I can see that "only forward" simplifies what might otherwise become a mess-o-perceptrons.
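(A minimal sketch of why "only forward" is simpler, assuming toy numpy weights: a feedforward step maps each input independently, while a recurrent step feeds its own hidden state back in, so order and history start to matter.)

    import numpy as np

    rng = np.random.default_rng(0)
    W_in = rng.normal(size=(4, 3))   # input-to-hidden weights (toy sizes)
    W_rec = rng.normal(size=(4, 4))  # hidden-to-hidden feedback weights

    def forward_step(x):
        # Feedforward: the output depends only on the current input x.
        return np.tanh(W_in @ x)

    def recurrent_step(x, h):
        # Recurrent: the previous state h loops back in, carrying memory.
        return np.tanh(W_in @ x + W_rec @ h)

    h = np.zeros(4)
    for x in rng.normal(size=(5, 3)):  # a short input sequence
        h = recurrent_step(x, h)       # state accumulates across steps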
-
Recurrence smacks of synthetic a priori knowledge, but who knows, maybe Kant wasn't such an asshole after all. NARRATOR: he was.
[3 more replies]
New conversation
-
Well, yeah, but that's not progress in 'AI'; that's progress in memory and CPU throughput.