2/ That is, having a JIT like HotSpot let you modularize your software into composable abstractions, even compile them separately, while knowing that the runtime would inline and optimize across those boundaries when it could.
-
3/ There's something analogous going on with the ability of reverse-mode automatic differentiation to optimize, end to end and at runtime, a loss function for a model composed of many separately defined layers. Both techs change the tradeoff between modularity and performance.
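To make the analogy concrete, here is a minimal sketch of reverse-mode autodiff in the spirit of scalar-autograd demos (not any particular framework's API; the `Value` class and the two layer functions are illustrative assumptions). Two layers are defined separately, yet the chain rule flows gradients through their composition end to end:

```python
# Toy reverse-mode autodiff sketch; Value, layer1, layer2 are illustrative.
class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

# Two "layers" defined separately, composed only at call time.
def layer1(x, w1):
    return x * w1

def layer2(h, w2):
    return h * w2

x, w1, w2 = Value(3.0), Value(2.0), Value(0.5)
loss = layer2(layer1(x, w1), w2)   # end-to-end composition
loss.backward()                    # gradients flow through both layers
# d(loss)/d(w1) = x.data * w2.data = 1.5
# d(loss)/d(w2) = x.data * w1.data = 6.0
```

The layer boundary is a pure software-engineering convenience; the backward pass sees one flat computation graph, much as the JIT sees one flat instruction stream after inlining.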
-
Software automation has been happening for 70 years already: languages, frameworks, declarative languages. Automation tools improve, but there is no & likely will not be a silver bullet. Machine learning in general & deep learning in particular have tons of issues unlikely to be solved.
-
@petewarden says "teacher"; I think "baker" and "gardener" are worthwhile analogies too…
-
I've been thinking a lot about how these ideas are changing software from a product perspective. E.g., the idea @pacoid discusses at the end of this post: https://synecdoche.liber118.com/conference-summaries-oct-2017-part-1-91ccad4880e9?source=linkShare-a3ff22e97484-1510607066
-
"maintaining intricate, layered tangles of logic..." as contrasted with maintaining intricate, layered tangles of neural net weights?
-