2/ Add your current ML model near the end of your Dockerfile. Add your data at the very end (if feasible).
3/ Why? You get fully self-contained, tagged models.
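The step above might look like the following minimal sketch. The image name, file paths, and training/prediction scripts are illustrative assumptions, not from the thread:

```dockerfile
# Base image and dependencies change rarely: keep them early so Docker's
# layer cache can reuse them across builds.
FROM python:3.10-slim
RUN pip install scikit-learn pandas

# Code changes more often than dependencies.
COPY src/ /app/src/

# Hypothetical paths: the serialized model and the data go last, as the
# thread suggests, so a new training run only invalidates the final layers.
COPY models/model.pkl /app/models/model.pkl
COPY data/train.csv /app/data/train.csv

WORKDIR /app
CMD ["python", "src/predict.py"]
```

Tagging each build (e.g. `docker build -t mymodel:2019-03-01 .`) then lets you later run an old model against a new instance with something like `docker run mymodel:2019-03-01 python src/predict.py new_instance.json` — the tag and script names here are hypothetical.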
4/ I can't count the number of times I've thought, "I wonder how my old model would have performed on this instance?"

5/ And relying on serialized models is so damn fragile.

6/ With this architecture, you don't have to worry about any intervening changes to your code.

7/ I don't care much about deployment. All my models run locally or briefly on EC2.

8/ But I *really* care about reproducibility for understanding model progress.