If you were to launch a DDoS attack on programmers' ability to get things done, this is what it might look like.
-
It was already crazy before: you could be running on the same CPU and GPU, yet you were forced to use a different shading language per OS. Now you need *two* shading languages for one OS, because you might be running in the browser.
-
also doesn't glsl *already* work in the browser via webgl?
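(It does: a minimal sketch of handing a plain GLSL ES string to WebGL in the browser; the throwaway canvas below is just there to get a context.)

```typescript
// GLSL already runs in the browser via WebGL: the shader is an ordinary
// GLSL ES string compiled through the WebGL API.
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl")!; // WebGL 1, based on OpenGL ES 2.0

const fragSrc = `
  precision mediump float;
  void main() {
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // plain GLSL ES 1.00
  }
`;

const shader = gl.createShader(gl.FRAGMENT_SHADER)!;
gl.shaderSource(shader, fragSrc);
gl.compileShader(shader);
if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
  console.error(gl.getShaderInfoLog(shader));
}
```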
-
iirc the idea of webgpu is being able to do compute-shader-like things. Still... it's kind of a mess. You also have to keep in mind that there are already two WebGLs based on different GLES versions :\
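(A rough sketch of both points, assuming a browser that actually exposes navigator.gpu: WebGPU's shaders are WGSL rather than GLSL and compute pipelines are first-class, while the two existing WebGL context types track different GLES versions.)

```typescript
// WebGPU sketch: shaders are WGSL, not GLSL, and compute is first-class.
// Assumes navigator.gpu is available (and @webgpu/types for TypeScript).
async function computeSketch() {
  const adapter = await navigator.gpu.requestAdapter();
  const device = await adapter!.requestDevice();

  const module = device.createShaderModule({
    code: `
      @compute @workgroup_size(64)
      fn main(@builtin(global_invocation_id) id: vec3<u32>) {
        // compute-shader-like things, written in WGSL
      }
    `,
  });

  device.createComputePipeline({
    layout: "auto",
    compute: { module, entryPoint: "main" },
  });
}

// And yes, there are already two WebGLs, tracking different GLES versions:
const c = document.createElement("canvas");
const gl1 = c.getContext("webgl");  // based on OpenGL ES 2.0 (GLSL ES 1.00)
const gl2 = c.getContext("webgl2"); // based on OpenGL ES 3.0 (GLSL ES 3.00)
```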
-
To write once, run everywhere these days, you need to create or use a graphics abstraction layer and a high-level shading language. Metal, WebGPU, and DX12 all seem to believe they are alone in the universe.
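(Very roughly, that layer ends up looking like the sketch below: one high-level shader source in, one translated dialect out per backend. All names here are hypothetical, not any real library.)

```typescript
// Hypothetical sketch of the abstraction everyone ends up writing.
// None of these names refer to a real library.
type Backend = "d3d12" | "metal" | "vulkan" | "webgpu";

interface CompiledShader {
  backend: Backend;
  code: string; // HLSL, MSL, WGSL, or SPIR-V assembly, depending on the target
}

interface ShaderCompiler {
  compile(source: string, target: Backend): CompiledShader;
}

// Because Metal, WebGPU and DX12 each assume their own shading language is
// the only one, every engine grows a loop like this somewhere.
function compileForAllBackends(
  compiler: ShaderCompiler,
  source: string
): CompiledShader[] {
  const targets: Backend[] = ["d3d12", "metal", "vulkan", "webgpu"];
  return targets.map((target) => compiler.compile(source, target));
}
```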
-
What we need is another layer on top to fix everything
-
Shading Languages are a crime against humanity!
-
You could just write to one and write a transpiler to generate it for the other platforms. Actually, I just use the C preprocessor to convert between HLSL and GLSL.
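(Roughly how that preprocessor trick tends to look; a sketch, not a complete mapping: keep the shared code in HLSL spellings and prepend #defines so GLSL's own preprocessor rewrites the names.)

```typescript
// Sketch of the preprocessor approach: shared shader code uses HLSL
// spellings, and a prepended block of #defines lets GLSL's preprocessor map
// them onto GLSL. Macro list is illustrative; entry points, semantics and
// resource bindings still need per-language handling.
const glslCompatPrelude = `
#define float2 vec2
#define float3 vec3
#define float4 vec4
#define lerp(a, b, t) mix(a, b, t)
#define saturate(x) clamp(x, 0.0, 1.0)
`;

// Compiles as-is under HLSL; compiles as GLSL once the prelude is prepended.
const sharedLib = `
float3 tint(float3 base, float amount) {
  return lerp(base, float3(1.0, 1.0, 1.0), saturate(amount));
}
`;

// GLSL path (e.g. WebGL): precision + prelude + shared code + GLSL entry point.
const glslFragment =
  "precision mediump float;\n" +
  glslCompatPrelude +
  sharedLib +
  "void main() { gl_FragColor = vec4(tint(vec3(0.2, 0.4, 0.8), 0.5), 1.0); }";
```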
-
It's exactly that attitude that makes everything terrible. Tech bru.
-
Why do shader languages exist at all?