From now on, any time I am reviewing programmer resumes for a job, if I ever see "WebGPU Working Group" on a resume, that resume goes instantly into the trash. Not bluffing. There has to be a penalty for pollution or people will keep doing it. I encourage others to treat resumes the same way.
Replying to @Jonathan_Blow
What's the solution you're thinking of? A good portion of the group fought for SPIR-V for years; Apple fought back with some alright reasons; they settled on a language that's really straightforward to translate to and from SPIR-V.
Replying to @trishume
The problem is: why can't I just program in whatever language I want, like we knew how to do in the 1970s before we gave up that capability for some reason? Re SPIR-V, come on, the corporate politics there are clear as day. You know they wouldn't "fight back" against MSL, even if…
Replying to @Jonathan_Blow
So given that the WebGPU developers don't have any influence over OS GPU abstractions, do you think they should have given up and not made anything, or written compilers that accept every major shading language on any OS, or are you saying SPIR-V?
Replying to @trishume
I think they should work to fix the actual problem, which is that browsers are lame.
Replying to @Jonathan_Blow
Okay, so if nobody should work on abstracting GPU APIs in browsers, what about native? Taking current APIs as fixed, do you endorse efforts to write a hardened abstraction over them, or should everyone give up and write for each platform manually?
Replying to @trishume
I think we should do what we did in the 1970s, which worked much better than what we are doing today.
Replying to @Jonathan_Blow
But that's not an option if I want to program GPUs today. I want a CPU-style toolchain straight to machine code for the major GPUs, cross-platform, too. That doesn't exist unless I form a decade-long conspiracy of friends to become execs at Apple, MSFT, AMD, Intel, Nvidia...
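(For context: the closest thing to this that exists today is single-vendor. Nvidia's offline CUDA toolchain can compile straight to one GPU generation's machine code, much like a CPU compiler with -march. A minimal sketch under that assumption; the file name is illustrative:)

    // saxpy.cu -- compiled ahead of time to GPU machine code, no runtime JIT:
    //   nvcc -arch=sm_80 -cubin saxpy.cu -o saxpy.cubin   (Ampere-generation SASS)
    // The -arch flag pins the target generation, like -march for a CPU compiler.
    // extern "C" keeps the symbol unmangled so it can be looked up by name later.
    extern "C" __global__ void saxpy(int n, float a, const float* x, float* y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // one element per thread
        if (i < n) y[i] = a * x[i] + y[i];
    }

(The catch, as the thread says, is that this stops at one vendor's border; there is no cross-vendor equivalent.)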
Replying to @trishume
Somebody has to be the adult in the room or things will continue to get worse.
Replying to @Jonathan_Blow
Like which concrete people do you want to do what concrete action? Everyone should stop working on GPU stuff until execs pay attention to the strike? Just everyone complain on Twitter? Anyone who wants to use a GPU should try to become a MSFT GPU exec instead?
AMD, Nvidia, and Intel each define their own long-term binary ISA and we compile to them. This is what we've been doing for literally decades on every other platform known to mankind. It's time for GPU vendors to stop getting a free pass.
Instead, for some reason we have these "APIs" that are supposed to make programming easier, but all they do is multiply the problem: now you have m APIs times n drivers, so it's an O(m*n) compat problem when it could have just been O(n). It's insane and absurd.
Just stop with this crap, stop with the drivers, stop with the JIT, just stop. I know why they do it, but it's time to stop. Make a god damn long-term stable ISA for each GPU lineage and support it, and all of these problems go away.
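(A sketch of what "no drivers, no JIT" looks like with the one toolchain that roughly supports it: loading precompiled machine code through the CUDA driver API. The cubin file name assumes the sketch above; error handling trimmed:)

    // load_saxpy.cpp -- load precompiled GPU machine code: no shader compiler
    // in the loop, no JIT at load time. Links against the driver API (-lcuda).
    #include <cuda.h>
    #include <stdio.h>

    int main(void) {
        cuInit(0);
        CUdevice dev;  cuDeviceGet(&dev, 0);
        CUcontext ctx; cuCtxCreate(&ctx, 0, dev);

        // saxpy.cubin was built offline (nvcc -arch=sm_80 -cubin saxpy.cu).
        // Like a CPU object file, it only loads on a matching GPU generation,
        // which is exactly the per-lineage model being proposed here.
        CUmodule mod;
        if (cuModuleLoad(&mod, "saxpy.cubin") != CUDA_SUCCESS) {
            fprintf(stderr, "cubin does not match this GPU generation\n");
            return 1;
        }
        CUfunction fn;
        cuModuleGetFunction(&fn, mod, "saxpy");  // unmangled name lookup
        printf("loaded precompiled kernel, no JIT involved\n");
        cuModuleUnload(mod);
        cuCtxDestroy(ctx);
        return 0;
    }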