They were my first introduction to any machine learning techniques, totally on a whim. They're mind-expanding in the way that LISP or calculus are! https://twitter.com/kareem_carr/status/1104422318019887105
Replying to @generativist
The typo is bugging me. I meant *Genetic*! I think they've fallen out of favor because they don't generally use the information from derivatives to find sub-solutions, and their composition of sub-solutions is ad hoc. Neural nets are more elegant in this regard!
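The derivative-free point above can be made concrete with a minimal genetic algorithm sketch. Everything here (the toy fitness function, the `evolve` helper, the population size and mutation scale) is illustrative, not something from the thread:

```python
import random

# Toy objective: maximize f(x) = -(x - 3)^2, so the optimum is x = 3.
def fitness(x):
    return -(x - 3.0) ** 2

def evolve(pop_size=50, generations=100, mutation_scale=0.5, seed=0):
    """Minimal GA: truncation selection + Gaussian mutation, no crossover."""
    rng = random.Random(seed)
    population = [rng.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half of the population.
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        # Reproduction: each child is a mutated copy of a parent.
        # Note that the derivative of `fitness` is never used anywhere --
        # the search relies only on random variation plus selection.
        children = [p + rng.gauss(0.0, mutation_scale) for p in parents]
        population = parents + children
    return max(population, key=fitness)

best = evolve()
```

Because the surviving parents are carried over unchanged (elitism), the best fitness never decreases, and after a hundred generations `best` sits close to the optimum at 3.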
Replying to @kareem_carr
I didn't see the typo until right now ;) And yes, they have. But I hope they're still taught widely!
Replying to @generativist
Also, neural nets enforce sparseness. Each neuron is a sub-solution (some combination of neurons/sub-solutions from the prior layer), and since there is a bound on neurons in the layer, there is a bound on the number of sub-solutions. Less so for genetic algos.
Replying to @kareem_carr
But it's somehow less fun! (I want to replicate this for pleasure sometime :) https://arxiv.org/pdf/1712.06567.pdf
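The linked paper (arXiv:1712.06567, "deep neuroevolution") trains deep RL policies with a simple GA: weight vectors mutated by additive Gaussian noise, no crossover. A hedged, toy-scale sketch of that core loop, where a fixed target vector stands in for episode return (the `TARGET`, `neuroevolve`, and all hyperparameters are assumptions for illustration):

```python
import random

# Stand-in for "good" network weights; in the paper, fitness is an
# RL episode return rather than distance to a known target.
TARGET = [0.5, -1.2, 2.0, 0.0]

def fitness(weights):
    # Negative squared error to the target: higher is better.
    return -sum((w - t) ** 2 for w, t in zip(weights, TARGET))

def neuroevolve(pop_size=40, elites=10, generations=200, sigma=0.1, seed=1):
    """Mutation-only GA over flat weight vectors, as in the paper's setup."""
    rng = random.Random(seed)
    dim = len(TARGET)
    population = [[rng.gauss(0.0, 1.0) for _ in range(dim)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:elites]
        # Each child is a parent's weight vector plus Gaussian noise;
        # no crossover, matching the paper's "simple GA".
        population = parents + [
            [w + rng.gauss(0.0, sigma) for w in rng.choice(parents)]
            for _ in range(pop_size - elites)
        ]
    return max(population, key=fitness)

best = neuroevolve()
```

The paper's actual experiments evolve millions of weights and use a compact encoding of the mutation noise seeds; this sketch keeps only the selection-plus-mutation skeleton.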
Replying to @generativist
I used to replicate CS papers for fun in undergrad. The most important thing I discovered was a lot of CS papers aren't very replicable. But I hear times have changed ...
Ugh. I've tried and failed to replicate so many CSS models over the years. It's depressing how often "code available upon request" means "this model is incorrect."
Replying to @generativist
Sounds like you have the right qualifications for doing Bioinformatics!