1/ I got a question about successfully handling high traffic on computationally expensive deep learning models (like DeOldify). I'm pretty new to this myself, so I may be very wrong, but I do have a take and I'm wondering what others think. Here's what I said:
2/ "As far as scaling goes- this is something that we actually wound up “outsourcing” for DeOldify. In that we’re just licensing the model and letting the licensing companies (like MyHeritage) figure out the hard part of scaling. "
3/ "Because we knew that would be hard and quite frankly not something we’re comfortable with taking on as a two person team. Before we decided to go down this route, we were originally going to go with an app that strictly ran on iPhone hardware (6s+)."
4/ "Yes,this actually worked, and quite well (noticeably better than open source). As you can imagine- getting rid of servers like this solves the scaling problem :) We’ve also considered doing a desktop app (haven’t ruled it out, just not right now). "
5/ "Again- deferring huge, expensive computation to the users, hence avoiding a lot of complication on our end. And we’d argue it’s better for the users."
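To make the on-device idea concrete, here is a minimal sketch of what deferring the computation to the phone can look like: exporting a trained PyTorch model to Core ML so inference runs locally instead of on a server. This is not DeOldify's actual export path; the `TinyColorizer` model, input size, and file names are placeholders chosen only for illustration.

```python
# Illustrative sketch only (not DeOldify's real pipeline): convert a
# PyTorch model to Core ML so the heavy inference runs on the user's phone.
import torch
import coremltools as ct


class TinyColorizer(torch.nn.Module):
    """Placeholder stand-in for a real colorization network."""

    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Conv2d(3, 16, kernel_size=3, padding=1),
            torch.nn.ReLU(),
            torch.nn.Conv2d(16, 3, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return self.net(x)


model = TinyColorizer().eval()

# Trace the model with a fixed example input; a real deployment may need
# flexible input shapes to handle arbitrary photo sizes.
example = torch.rand(1, 3, 256, 256)
traced = torch.jit.trace(model, example)

# Convert the traced graph to a Core ML program. On recent iPhones this runs
# on the GPU/Neural Engine, so no server is involved at inference time.
mlmodel = ct.convert(
    traced,
    inputs=[ct.ImageType(name="input", shape=example.shape)],
    convert_to="mlprogram",
)
mlmodel.save("Colorizer.mlpackage")
```

The resulting `.mlpackage` can then be bundled into an Xcode project and invoked through Core ML, which is one way to get the "no servers to scale" property the thread describes.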
Yeah, I think coming at this from a web development background (as I do) helps give perspective on the possibilities and trade-offs.