I meet lots of people who tell me fatalistically (& often despondently) that it's near impossible to do important work on neural nets today, unless you have huge compute and huge data sets.
Back to neural nets: a danger in scaling up your computational power is that you start to focus _only_ on questions that require that computational power. You hire specialists who thrive in that environment, but who aren't so good at playing with basic, fundamental questions...
... and your culture starts to tilt that way, driving out people who do like to play with basic, fundamental questions.
Take all this with a grain of salt. Neural nets are a side interest, not my main interest. Maybe I'm wrong. But I don't think so. This dynamic has played out in genome sequencing, in particle physics, & in many other areas. Big science is attractive, but often small science wins.
Does this mean computational power or big data is useless? No, of course not. There are important questions that can likely only be addressed that way. But if you want to work on AI, it seems to me a mistake to be too focused on the need for lots of data and lots of compute.
#Creativity loves constraint. #genius ...remember, #Einstein did so much with just a #thought #experiment! http://www.businessinsider.com/5-of-albert-einsteins-thought-experiments-that-revolutionized-science-2016-7 … #science #truth #research #grind #inspire pic.twitter.com/7RZPgH3y1C