I keep seeing comments about cell-based GPU glyph lookup in the pixel shader being a "new" way to render terminals. It is not new. It is ancient. A "GPU" that looked up one glyph per character cell is literally what the very first IBM PC had in it: https://en.wikipedia.org/wiki/Code_page_437
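For concreteness, here is a minimal sketch in plain C of the per-pixel logic such a shader runs; on the GPU this body would execute once per fragment, with the loop replaced by the rasterizer's dispatch. Every name and size here (glyph_grid, atlas, 8x16 cells, a 16x16 CP437-style atlas) is an illustrative assumption, not any particular terminal's code.

```c
/* Sketch of cell-based glyph lookup, transcribed into plain C.
 * Assumes a CP437-style setup: 256 glyphs in a 16x16 atlas grid,
 * 8x16-pixel character cells, one byte per pixel. */
#include <stdint.h>

enum { CELL_W = 8, CELL_H = 16, ATLAS_COLS = 16 };

static void render(const uint8_t *glyph_grid, /* one glyph index per cell */
                   const uint8_t *atlas,      /* packed glyph bitmaps */
                   uint8_t *framebuffer,
                   int fb_w, int fb_h, int term_cols)
{
    int atlas_w = ATLAS_COLS * CELL_W;
    for (int y = 0; y < fb_h; y++) {
        for (int x = 0; x < fb_w; x++) {
            /* 1. Which character cell does this pixel fall in? */
            int cell  = (y / CELL_H) * term_cols + (x / CELL_W);
            uint8_t g = glyph_grid[cell];

            /* 2. Offset of this pixel inside the glyph bitmap. */
            int gx = x % CELL_W, gy = y % CELL_H;

            /* 3. Sample the atlas: glyph g lives at column (g % 16),
             *    row (g / 16) of the 16x16 atlas grid. */
            int ax = (g % ATLAS_COLS) * CELL_W + gx;
            int ay = (g / ATLAS_COLS) * CELL_H + gy;
            framebuffer[y * fb_w + x] = atlas[ay * atlas_w + ax];
        }
    }
}
```

The per-pixel work is a couple of integer divides and mods plus one texture fetch, which is essentially what the original CGA/MDA hardware did with counters and a character ROM.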
-
Hopefully, with things such as DirectStorage, there'll be more interest in leveraging GPU-accelerated tools in everyday programming. (Heck, with GPU support finally landing in the Windows Subsystem for Linux, maybe that'll help too!)
-
This stuff has been the standard in games for over a decade, where it is usually considered "sloppy but good enough". I'm surprised at how slow the current state of things is. I've always assumed that those "core subsystems" were optimized at _some point_ in the last ten years...
-
Did you see this on terminal input latency? https://danluu.com/term-latency/
-
Microcode, schmicrocode. Use a metal stencil inside the CRT to make a shaped electron beam! Charactron: https://en.m.wikipedia.org/wiki/Charactron#/media/File%3ACharactron_layout_1.jpg pic.twitter.com/0ov4NYM9OA
-
As somebody with very little experience in GPU programming: when I was toying around trying to render text in OpenGL, I wrote the glyph lookup on the CPU side. Is there a massive benefit to making it happen in the shader for relatively 'simple' tasks, like rendering a text editor?
-
Looking up texture data is something a GPU is incredible at. I'm kind of curious what you actually did on the CPU.
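For contrast with the shader-side sketch above, here is a guess at the CPU-side version the question likely describes: emitting one textured quad per character, with the glyph's atlas coordinates baked into the vertex data on the CPU. The vertex layout and all names are hypothetical, not any real renderer's API.

```c
/* Hypothetical CPU-side alternative: instead of letting the pixel
 * shader look up glyphs, the CPU builds 6 vertices (two triangles)
 * per cell, each carrying precomputed atlas coordinates. The caller
 * uploads the buffer and draws it with a trivial textured shader.
 * Assumes a 16x16 glyph atlas, as in the earlier sketch. */
#include <stddef.h>

enum { CELL_W = 8, CELL_H = 16, ATLAS_COLS = 16, ATLAS_ROWS = 16 };

typedef struct { float x, y, u, v; } Vertex;

static size_t build_quads(const unsigned char *glyph_grid,
                          int cols, int rows, Vertex *out)
{
    size_t n = 0;
    for (int r = 0; r < rows; r++) {
        for (int c = 0; c < cols; c++) {
            unsigned char g = glyph_grid[r * cols + c];
            /* Screen-space corners of this cell, in pixels. */
            float x0 = (float)(c * CELL_W), y0 = (float)(r * CELL_H);
            float x1 = x0 + CELL_W,         y1 = y0 + CELL_H;
            /* Atlas coordinates of glyph g, normalized to [0,1]. */
            float u0 = (g % ATLAS_COLS) / (float)ATLAS_COLS;
            float v0 = (g / ATLAS_COLS) / (float)ATLAS_ROWS;
            float u1 = u0 + 1.0f / ATLAS_COLS;
            float v1 = v0 + 1.0f / ATLAS_ROWS;
            Vertex quad[6] = {
                {x0, y0, u0, v0}, {x1, y0, u1, v0}, {x1, y1, u1, v1},
                {x0, y0, u0, v0}, {x1, y1, u1, v1}, {x0, y1, u0, v1},
            };
            for (int i = 0; i < 6; i++) out[n++] = quad[i];
        }
    }
    return n;
}
```

This regenerates and re-uploads vertex data whenever the screen changes, which is usually fast enough for a text editor; the shader-side grid lookup mostly wins on simplicity, since the CPU only touches a tiny buffer of glyph indices and issues one fullscreen draw.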