1/ I had a question about handling high traffic on computationally expensive deep learning models (like DeOldify). I'm pretty new to this myself, so I may well be wrong, but I do have a take and I'm wondering what others think. Here's what I said:
-
I think the ideal solution would be a serverless one: you're not paying for a GPU humming along all the time, and if you get a traffic spike you can spin up multiple nodes to handle it. I'm in the process of trying out Google Cloud AI Platform. Let's see if that works.
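To make the "don't keep a GPU humming" point concrete, here's a back-of-envelope cost sketch. All rates (`ALWAYS_ON_RATE`, `SERVERLESS_RATE`, the per-request latency) are hypothetical placeholders I made up for illustration, not real cloud prices:

```python
# Rough comparison: always-on GPU node vs. serverless per-request billing.
# Every rate below is a made-up placeholder, NOT a real cloud price.
ALWAYS_ON_RATE = 0.75       # $/hour for a dedicated GPU node (assumed)
SERVERLESS_RATE = 0.0004    # $/second of billed GPU time (assumed)
SECONDS_PER_REQUEST = 5     # assumed inference latency per image

def monthly_cost_always_on(hours: float = 730.0) -> float:
    """Cost of keeping one GPU node up all month (~730 hours)."""
    return ALWAYS_ON_RATE * hours

def monthly_cost_serverless(requests_per_month: int) -> float:
    """Cost when you only pay for GPU-seconds actually used."""
    return SERVERLESS_RATE * SECONDS_PER_REQUEST * requests_per_month

# Break-even traffic: below this, serverless is cheaper; above it,
# a dedicated node wins (under these assumed rates).
break_even = monthly_cost_always_on() / (SERVERLESS_RATE * SECONDS_PER_REQUEST)
print(round(break_even))  # requests/month at which the two cost the same
```

Under these made-up numbers the break-even is a few hundred thousand requests a month, which is why spiky or low traffic favors the serverless setup (and autoscaling handles the spikes).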
-
One other learning I'll share: compute time grows steeply with image size in my experience (roughly quadratically, since pixel count scales with the square of the side length), so you can get away with higher traffic on something like DeOldify if you cap images at, say, 256px.
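A minimal sketch of that capping step. The function name `capped_size` and the 256px default are my own choices for illustration; the math just preserves aspect ratio while limiting the longest side:

```python
def capped_size(w: int, h: int, max_side: int = 256) -> tuple[int, int]:
    """Return (w, h) scaled down so the longest side is at most max_side,
    preserving aspect ratio. Images already under the cap pass through."""
    scale = min(1.0, max_side / max(w, h))
    return max(1, round(w * scale)), max(1, round(h * scale))

# With Pillow you'd apply it before inference, e.g.:
#   img = img.resize(capped_size(*img.size), Image.LANCZOS)

# Pixel count (a rough proxy for conv-net compute) scales with the
# square of the side length, so halving the cap quarters the work:
print((512 * 512) // (256 * 256))  # 4
```

Capping at 256px instead of 512px cuts per-image compute roughly 4x, which is the headroom that lets you absorb more traffic per node.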