Despite how it may seem, I think ML might be suffering from everyone focusing on one local maximum: backprop-trained affine transforms (CNNs/LSTMs/etc.). The space of possible approaches is vast, and although this one has turned out to work okay, many alternatives remain possible.
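For readers who want a concrete picture of what "backprop-trained affine transforms" refers to, here is a minimal sketch (toy data and names are illustrative, not from the thread): a single affine map plus a nonlinearity, fit by gradient descent.

```python
import numpy as np

# Toy illustration of the dominant recipe referred to above:
# an affine transform (Wx + b) followed by a nonlinearity, trained by backprop.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                              # toy inputs
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)     # toy binary targets

W = rng.normal(scale=0.1, size=3)   # weights of the affine map
b = 0.0                             # bias
lr = 0.1                            # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(500):
    p = sigmoid(X @ W + b)          # forward pass: affine map + nonlinearity
    grad_logits = (p - y) / len(y)  # gradient of mean cross-entropy w.r.t. logits
    W -= lr * (X.T @ grad_logits)   # gradient step for the weights
    b -= lr * grad_logits.sum()     # gradient step for the bias

print("training accuracy:", ((sigmoid(X @ W + b) > 0.5) == y).mean())
```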
-
I think science moves faster when many different approaches are competing, not when everyone agrees on what works best.
-
Any suggestions for readings on different paradigms?
-
Bayesian statistics for data analysis and probabilistic programming.
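As one minimal sketch of the Bayesian workflow suggested here (the coin-flip data and uniform prior are illustrative assumptions, not from the thread): exact conjugate Beta-Binomial inference, the kind of computation probabilistic programming tools generalize to models without closed-form posteriors.

```python
import numpy as np

# Infer a coin's bias theta from observed flips using a conjugate Beta prior.
flips = np.array([1, 0, 1, 1, 0, 1, 1, 1])   # toy data: 1 = heads
a_prior, b_prior = 1.0, 1.0                   # Beta(1, 1) = uniform prior

heads = flips.sum()
tails = len(flips) - heads
a_post, b_post = a_prior + heads, b_prior + tails   # Beta posterior parameters

posterior_mean = a_post / (a_post + b_post)
samples = np.random.default_rng(0).beta(a_post, b_post, size=10_000)
print("posterior mean:", posterior_mean)
print("95% credible interval:", np.percentile(samples, [2.5, 97.5]))
```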
-
Some of us are doing very different things. I'm sure many others are as well. Your point, though, is a good one.
-
Go out and breathe!
-
Perhaps the paradigm shift has been so rapid that some key underlying factors were missed.