1/ I had a question about successfully handling high traffic on computationally expensive deep learning models (like DeOldify). I'm pretty new to this myself, so I may be very wrong, but I do have a take and I'm wondering what others think. Here's what I said:
2/ "As far as scaling goes- this is something that we actually wound up “outsourcing” for DeOldify. In that we’re just licensing the model and letting the licensing companies (like MyHeritage) figure out the hard part of scaling. "
3/ "Because we knew that would be hard and quite frankly not something we’re comfortable with taking on as a two person team. Before we decided to go down this route, we were originally going to go with an app that strictly ran on iPhone hardware (6s+)."
4/ "Yes,this actually worked, and quite well (noticeably better than open source). As you can imagine- getting rid of servers like this solves the scaling problem :) We’ve also considered doing a desktop app (haven’t ruled it out, just not right now). "
5/ "Again- deferring huge, expensive computation to the users, hence avoiding a lot of complication on our end. And we’d argue it’s better for the users."
Replying to @citnaj
Sounds right. People will let you do subscriptions, but a pay-per-use plan sounds annoying, especially with something like this where the result may not always be what they hoped for (no offense). It puts you in a tricky situation of having to guess how much people will use it.
Replying to @citnaj
Couldn't it just be client-side code on a website though? I've barely looked at web tech, but there are whole games that run client-side. The PWA stuff Google was pushing should let clients cache it all too, I thought. If you're already open source, it seems like that'd work.
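On the PWA caching idea, here's a minimal sketch of what that could look like: a service worker that downloads the model weights once and serves them from the local cache on every later visit. The file names (sw.ts, deoldify.onnx) and cache key are made up for illustration; nothing here comes from the actual project.

// sw.ts: hypothetical service worker for caching large model weights.
const CACHE_NAME = 'model-cache-v1';
const MODEL_URL = '/models/deoldify.onnx'; // placeholder path

self.addEventListener('install', (event: any) => {
  // Download and cache the (large) weights file once, at install time.
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.add(MODEL_URL))
  );
});

self.addEventListener('fetch', (event: any) => {
  // Serve from the local cache when possible, so repeat visits
  // never re-download the weights.
  event.respondWith(
    caches.match(event.request).then((hit) => hit ?? fetch(event.request))
  );
});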
On client-side code: there's onnx.js, which looks quite promising for that. Not sure how big of a model you can do with that, etc.
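As a rough illustration of that route, a sketch against onnx.js's documented InferenceSession/Tensor API. The model URL and the 1x3x224x224 input shape are placeholder assumptions, and whether a DeOldify-sized model fits in browser memory is exactly the open question noted above.

import { InferenceSession, Tensor } from 'onnxjs';

async function colorizeInBrowser(): Promise<void> {
  // backendHint: 'webgl' runs the math on the user's GPU where available.
  const session = new InferenceSession({ backendHint: 'webgl' });
  await session.loadModel('/models/deoldify.onnx'); // placeholder URL

  // Dummy 1x3x224x224 float input; a real page would fill this from
  // an <img> or <canvas> element's pixel data.
  const data = new Float32Array(1 * 3 * 224 * 224);
  const input = new Tensor(data, 'float32', [1, 3, 224, 224]);

  // onnx.js takes an array of input tensors and resolves to a
  // ReadonlyMap of output name -> Tensor.
  const outputs = await session.run([input]);
  const first = outputs.values().next().value;
  console.log('output dims:', first.dims);
}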