@paniq @cmuratori touché
@bernielomax @paniq That's not useful for what you actually want, though. It forces the form factor to be much higher than it would be.
New conversation
@cmuratori GPUs are going into CPUs now.
@Xeekei It's been in there for a while, but that doesn't help you while we're still using discrete parts.
New conversation
@cmuratori We'll probably just get everything soldered to the PCB, if we haven't moved straight to everything-in-a-chip-stack first, that is.
@cmuratori that's what NVLink is supposed to bring to the table in 2016, alongside faster GPU peer bridging.
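(For scale, a back-of-envelope sketch of the bandwidth gap NVLink was pitched to close. The PCIe numbers follow from the spec's signaling rate and encoding; the NVLink per-link rate and four-link count are the announced first-generation Pascal figures, so treat the result as approximate.)

```python
# PCIe 3.0 x16 vs first-generation NVLink, per direction.
# PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding.
pcie3_gbit_per_lane = 8.0 * (128.0 / 130.0)     # ~7.88 Gbit/s usable per lane
pcie3_x16_gbyte = pcie3_gbit_per_lane * 16 / 8  # ~15.75 GB/s per direction

# NVLink gen 1: ~20 GB/s per direction per link (announced figure),
# with 4 links assumed (the Pascal P100 configuration).
nvlink_gbyte = 20.0 * 4                         # 80 GB/s per direction

print(f"PCIe 3.0 x16: {pcie3_x16_gbyte:.2f} GB/s per direction")
print(f"NVLink, 4 links: {nvlink_gbyte:.0f} GB/s per direction "
      f"(~{nvlink_gbyte / pcie3_x16_gbyte:.1f}x)")
```

That roughly 5x ratio matches the low end of the multiplier NVIDIA quoted when NVLink was announced.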
@cmuratori That's what PCIe 3.0 x16 is. There's no other use for that slot. GPUs need custom power, memory, and thermals, so no "socket."
@jwatte @cmuratori I guess in theory HBM-based GPUs could be socketed; they are pretty self-contained, which only leaves the power problem.
End of conversation
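(On that power problem, a quick sketch of why CPU-style socketed power delivery is hard for a big GPU. The TDP, core voltage, and per-pin current below are illustrative assumptions, not specs for any real part; the premise is a CPU-like arrangement where motherboard VRMs feed the core rail through the socket.)

```python
# Why "only the power problem" is still a big problem for a socketed GPU.
# All numbers below are illustrative assumptions, not real-part specs.
tdp_watts = 250.0     # assumed board power for a high-end GPU
core_voltage = 1.0    # assumed core rail voltage
amps_per_pin = 1.0    # assumed safe current per socket power pin

current_amps = tdp_watts / core_voltage   # I = P / V  ->  250 A
power_pins = current_amps / amps_per_pin  # pins for current delivery alone

print(f"{current_amps:.0f} A at {core_voltage:.1f} V core rail "
      f"-> ~{power_pins:.0f} power pins, plus as many ground returns")
```

Hundreds of contacts just for current delivery, before any signal pins, which is roughly why the replies above treat power as the sticking point.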
New conversation
@cmuratori They are already integrated in the CPU, and performance keeps going up. And the heat... omg ;)