To a first approximation, in 2020 image codecs would parse and entropy-decode in Rust, throw the resulting bits at the GPU, and do everything else there.
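A minimal sketch of that split, under the assumption of a generic backend: the serial, branchy work (container parsing, entropy decoding) stays on the CPU in safe Rust, while the embarrassingly parallel work (dequantize, inverse transform, color convert) goes to the GPU. `GpuBackend`, `MockGpu`, and `parse_and_entropy_decode` are illustrative names, not any real codec's API; the stub backend runs on the CPU so the sketch compiles without a GPU.

```rust
struct Coefficients {
    width: usize,
    height: usize,
    data: Vec<i16>, // quantized transform coefficients, one per pixel
}

/// CPU side: parse the bitstream and entropy-decode the coefficients.
/// A real decoder would run a Huffman/ANS/arithmetic decoder here; this
/// stub just fabricates a coefficient plane of a plausible shape.
fn parse_and_entropy_decode(bitstream: &[u8]) -> Coefficients {
    let side = (bitstream.len() as f64).sqrt() as usize + 1;
    Coefficients {
        width: side,
        height: side,
        data: vec![0i16; side * side],
    }
}

/// GPU side, kept behind a trait so the sketch stays backend-agnostic
/// (wgpu, WebGL, Vulkan, ... would each get their own impl).
trait GpuBackend {
    /// Upload coefficients, then dequantize + inverse-transform +
    /// color-convert in compute shaders, returning RGBA8 pixels.
    fn reconstruct(&self, c: &Coefficients) -> Vec<u8>;
}

/// Stand-in backend that does the reconstruction on the CPU.
struct MockGpu;

impl GpuBackend for MockGpu {
    fn reconstruct(&self, c: &Coefficients) -> Vec<u8> {
        c.data
            .iter()
            .flat_map(|&v| {
                let g = (v as i32).clamp(0, 255) as u8;
                [g, g, g, 255] // grayscale RGBA, purely illustrative
            })
            .collect()
    }
}

fn decode_image(bitstream: &[u8], gpu: &impl GpuBackend) -> Vec<u8> {
    let coeffs = parse_and_entropy_decode(bitstream);
    gpu.reconstruct(&coeffs)
}

fn main() {
    let pixels = decode_image(&[0u8; 64], &MockGpu);
    println!("decoded {} RGBA bytes", pixels.len());
}
```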
But it doesn't, because the hardware doesn't; we just look the other way on that. Browsers shouldn't be touching the GPU at all. Only apps dealing exclusively with data created by their user, or by other highly trusted parties, should have GPU access.
Strong disagree, as someone who has dealt with the complaints about power usage from not using the GPU enough.
Thanks! Hmm, I think exploring GPU programming has gone back up a few places in my personal projects list... time to read up on WebGL, perhaps.