1/ I had a question about successfully handling high traffic on computationally expensive deep learning models (like DeOldify). I'm pretty new to this myself, so I may be very wrong, but I do have a take and I'm wondering what others think. Here's what I said:
-
Very thought-provoking. You're saying to take advantage of designing with constraints in mind from the beginning (make what you're making work on an iPhone 6s) instead of defaulting to the cloud's unlimited offerings?
-
In our case we were forced to think that way because we were 100% bootstrapped. I really do think that defining constraints like that ahead of time helps spur creativity, though. Cloud ML offerings can get very expensive very quickly.
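The "make it work on a 6s" constraint above boils down to a memory and compute budget you check before committing to an architecture. A minimal sketch of such a check, where the byte sizes, overhead factor, and budget are all illustrative assumptions rather than real device numbers:

```python
def fits_on_device(n_params: int, bytes_per_param: int = 4,
                   activation_overhead: float = 1.5,
                   device_budget_mb: int = 1024) -> bool:
    """Rough feasibility check for running a model on-device.

    n_params: total parameter count of the candidate network.
    bytes_per_param: 4 for float32 weights (2 if you ship float16).
    activation_overhead: assumed multiplier for activation memory.
    device_budget_mb: assumed memory you can realistically claim.
    """
    weights_mb = n_params * bytes_per_param / (1024 ** 2)
    return weights_mb * activation_overhead <= device_budget_mb
```

For example, a ~25M-parameter network passes this budget while a ~500M-parameter one does not, which is exactly the kind of constraint that pushes you toward mobile-friendly architectures early.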
New conversation
-
I agree. Here's an interesting read I found with lots of insights about picking networks for mobile: https://machinethink.net/blog/mobile-architectures/
-
The advantage of being in the cloud is that it doesn't require users to have beefy systems to run these models, and it's also more feasible because you can spin servers up and down as you watch usage.
-
But cloud GPU memory is also very expensive, so it makes sense to outsource the infrastructure and just work on perfecting your algorithms and models.
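The "spin servers up and down as you watch usage" point above is usually driven by a scaling rule. A minimal sketch of one such rule, based on queue depth and arrival rate; the function name, throughput numbers, and 60-second drain target are all assumptions for illustration, not any particular cloud provider's API:

```python
import math

def workers_needed(queue_depth: int, per_worker_rps: float,
                   arrival_rps: float, min_workers: int = 0,
                   max_workers: int = 8) -> int:
    """Decide how many GPU workers to run given current load.

    per_worker_rps: requests/sec one worker sustains (measure this).
    arrival_rps: observed incoming request rate.
    queue_depth: requests currently waiting.
    """
    # Enough workers to keep up with arrivals, plus enough extra
    # capacity to drain the current backlog within ~60 seconds.
    needed = arrival_rps / per_worker_rps + queue_depth / (per_worker_rps * 60)
    return max(min_workers, min(max_workers, math.ceil(needed)))
```

With `min_workers=0` the fleet scales to zero when idle, which is where the cost savings over always-on hardware come from; the `max_workers` cap is what keeps a traffic spike from turning into a surprise bill.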
New conversation
-
Think about it as providing a service. Scheduling your resources across users is more efficient than everyone having idling hardware. You also keep control over your source code, which you can update without anyone noticing. The incoming unlabeled data helps you improve your models.
-
It's very, very expensive to do it as a cloud service, to the point where it completely changes what kind of business model you can offer. Control over code? Do I really need that? I don't think so. And updates to the model will only happen every few months.
End of conversation