Computing went from mainframe + terminals, to standalone microcomputers, to a hybrid model where data / models / heavy logic are centralized in a datacenter while your terminal still runs a lot of stateless application logic. https://twitter.com/fchollet/status/1423702456790257666
-
Of course if you're just rendering web pages, cost-cutting isn't a factor, but if you're running deep learning models, it is.
-
Even if end users are willing to pay for the compute, it is unlikely they will use it 100% of the time and this is not mutually exclusive with the terminal model. I also don’t think privacy is safer with a device that can be stolen or lost.
-
Also people will realise that a good deal of cloud spend is economically batshit mental.
-
Yep… and you are totally buggered if the internet goes down, which happens quite frequently in rural areas…
-
+ latency + reliability (long-tail latency) + power (an NPU draws less power than sending and receiving over the mobile phone radio in some cases)
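(A minimal back-of-the-envelope sketch of the power point above, in Python. The per-byte radio cost and per-inference NPU cost are invented placeholder constants, assumed purely for illustration; real figures depend on the radio technology, signal conditions, and model size.)

# Rough comparison of on-device NPU inference vs. offloading over the
# mobile radio. All constants below are illustrative placeholders,
# not measured values.

RADIO_ENERGY_PER_BYTE = 5e-6      # assumed joules to send/receive one byte over cellular
NPU_ENERGY_PER_INFERENCE = 5e-3   # assumed joules for one local inference on the NPU

def offload_energy(request_bytes: int, response_bytes: int) -> float:
    """Energy spent on the radio to ship the input up and the result back."""
    return (request_bytes + response_bytes) * RADIO_ENERGY_PER_BYTE

def local_energy(num_inferences: int = 1) -> float:
    """Energy spent running the model locally on the NPU."""
    return num_inferences * NPU_ENERGY_PER_INFERENCE

if __name__ == "__main__":
    # e.g. uploading a 200 KB image and receiving a small JSON result
    e_radio = offload_energy(request_bytes=200_000, response_bytes=2_000)
    e_npu = local_energy()
    print(f"radio: {e_radio:.4f} J, npu: {e_npu:.4f} J")
    # With these placeholder numbers the radio path costs more energy than
    # local inference, which is the "in some cases" the tweet refers to.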
-
This Tweet is unavailable.
-
There has always been this dream of the thin client / network computer. The reality is, it is not even a good design goal to build networks with the throughput/latency/QoS this would require. Networks are not "pipes" or "air"; they are energy-burning equipment that fails too.