@YuriyODonnell and I are trying to push a better programming model with first-class GPU kernel enqueue. It would eventually remove the need for barriers, async compute, and even command queues. And hopefully we could say goodbye to the graphics API in the end.
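The closest shipping analogue to the device-side enqueue described above is CUDA dynamic parallelism, where a running kernel launches child kernels without a host round-trip. A minimal sketch (kernel names `parent` and `scale_child` are mine, not from the thread; the tweet's proposal presumably goes further than this):

```cuda
// Sketch: device-side kernel enqueue via CUDA dynamic parallelism.
// Compile with relocatable device code enabled: nvcc -rdc=true sketch.cu

__global__ void scale_child(int *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2;
}

__global__ void parent(int *data, int n) {
    // A single thread enqueues a child grid from the GPU itself:
    // no host round-trip, no command-queue submission.
    if (threadIdx.x == 0 && blockIdx.x == 0) {
        scale_child<<<(n + 255) / 256, 256>>>(data, n);
        // The parent grid is not considered complete until its child
        // grids finish, so no explicit barrier is needed here.
    }
}
```

With first-class enqueue generalized across the pipeline, the dependency tracking that barriers and queues express manually would instead fall out of the launch graph itself.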
That'd be great. Hopefully next is resources vs latency hiding vs scheduling granularity.
- 2 more replies
New conversation
As long as you count SampleTexture as an assembly instruction, I can dig it.
General Processing Unit
GPU as a service
New conversation
At least I understand this
thank god, can I restart my career then please
I read that in school and took it to heart in my career; still remember it. I half-hoped the N64/Vérité generation would do it. When I worked on PS2 I thought we were almost there. When we got compute I thought, surely now? The long view lost every time :(
On the other hand, dedicated hardware can be considerably more power- and area-efficient <shrug>.
- 1 more reply
New conversation