To a first approximation, in 2020 image codecs would parse and entropy decode in Rust, throw the resulting bits onto the GPU, and do everything else there.
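A minimal sketch of that split, in Rust: the CPU side parses and entropy-decodes the bitstream, then hands raw coefficients off for GPU work. The toy run-length scheme here stands in for a real arithmetic or Huffman coder, and `upload_to_gpu` is a hypothetical stub, not a real API.

```rust
/// Toy "entropy decode" on the CPU: expand (count, value) pairs into
/// coefficients. A real codec would run an arithmetic or Huffman decoder
/// here, since that stage is inherently serial and branchy.
fn entropy_decode(bitstream: &[u8]) -> Vec<u8> {
    let mut coeffs = Vec::new();
    for pair in bitstream.chunks_exact(2) {
        let (count, value) = (pair[0], pair[1]);
        coeffs.extend(std::iter::repeat(value).take(count as usize));
    }
    coeffs
}

/// Hypothetical stand-in for a GPU upload. Real code would copy into a
/// mapped GPU buffer and dispatch compute shaders for dequantization,
/// the inverse transform, and color conversion.
fn upload_to_gpu(coeffs: &[u8]) -> usize {
    coeffs.len()
}

fn main() {
    let bitstream = [3u8, 7, 2, 0]; // three 7s, then two 0s
    let coeffs = entropy_decode(&bitstream);
    assert_eq!(coeffs, vec![7, 7, 7, 0, 0]);
    let uploaded = upload_to_gpu(&coeffs);
    println!("uploaded {} coefficient bytes", uploaded);
}
```

The design point is that entropy decoding resists parallelization, while everything downstream of it is embarrassingly parallel per-block work that maps well onto GPU compute.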
-
-
Yes, but the GPU memory model is so much simpler that memory safety is easier to attain. (For example, WebGL is memory safe already; it has to be, because it's exposed to web content.)
-
Thanks! Hmm, I think exploring GPU programming has gone back up a few places in my personal projects list... time to read up on WebGL, perhaps.
End of conversation