I don’t actually need integer support in the shader. Do you need PF3 to work on ES2?
Replying to @pcwalton
Yes, since Apple has all but abandoned WebGL 2, it's fair to assume it either runs on WebGL 1 (with available extensions like half-float) or it won't run on the 'web' for a long time.
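(For context on the half-float extension mentioned above: storing data in half-float textures costs precision, since an IEEE 754 half has only a 10-bit mantissa. A minimal illustration, round-tripping values through half precision with Python's `struct` module, is below; this is not Pathfinder or Makepad code, just the numeric format.)

```python
import struct

def to_half(x: float) -> float:
    """Round-trip a float through IEEE 754 half precision, i.e. what a
    half-float texture texel actually stores. Illustrative only."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# Half floats have a 10-bit mantissa: integers above 2048 lose precision.
print(to_half(2048.0))  # 2048.0, exactly representable
print(to_half(2049.0))  # rounds to 2048.0 (ties-to-even)
```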
Replying to @rikarends @pcwalton
I was contemplating Pathfinder 3 as a font-atlas builder for Makepad, since it's either that or doing pixel fonts for low DPI, which apparently most of the world still has :)
Replying to @rikarends
That’s a great use case. BTW I was thinking about making an SDF generator mode for PF3. Not quite sure how it will work yet.
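(An aside on what an SDF generator mode might produce: for each texel, the signed distance to the nearest glyph edge, negative inside the shape. A brute-force sketch from a binary coverage mask is below; the mask representation and O(n⁴) nearest-neighbor search are assumptions for clarity only. A real generator would use exact edge distances or a fast sweep, likely on the GPU.)

```python
import math

def sdf_from_mask(mask):
    """Brute-force signed distance field from a binary coverage mask
    (list of rows of 0/1). Positive outside the shape, negative inside."""
    h, w = len(mask), len(mask[0])

    def nearest(y, x, target):
        # Distance from texel (x, y) to the closest texel with value `target`.
        best = math.inf
        for j in range(h):
            for i in range(w):
                if mask[j][i] == target:
                    best = min(best, math.hypot(j - y, i - x))
        return best

    out = []
    for y in range(h):
        row = []
        for x in range(w):
            if mask[y][x]:
                row.append(-nearest(y, x, 0))  # inside: distance to exterior
            else:
                row.append(nearest(y, x, 1))   # outside: distance to shape
        out.append(row)
    return out

mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
field = sdf_from_mask(mask)
print(field[0][0])  # corner: sqrt(2) from the nearest filled texel
print(field[1][1])  # inside: -1.0, one texel from the exterior
```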
Replying to @pcwalton
I really like this idea of generating SDFs with Pathfinder, BTW; it would give me at least some dynamic scalability range, and it would take the pressure off of your core for AR/VR as well.
Replying to @rikarends @pcwalton
Those GPU times are pretty great, but still 'huge' for a 120 Hz mobile device. Do you need to do all the CPU tiles again when the camera changes? Or can you leave them be?
Replying to @rikarends
You have to redo all the CPU tiles, but you can reproject them at some loss in quality (on VR I just do them once and reproject to both eyes). BTW I want to eventually move CPU tile generation to GPU, but I’m not sure it will be a win.
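(The reprojection idea above can be sketched as a cache lookup: tiles rendered under the old camera origin are reused for the new one by translating screen coordinates back into the old camera's space. The tile size, cache keying, and translation-only camera below are assumptions for illustration, not Pathfinder's actual data structures; real reprojection also has to handle scale/rotation and accept the quality loss mentioned above.)

```python
TILE_SIZE = 16  # assumed tile size in pixels

def reproject(tile_cache, old_origin, new_origin, x, y):
    """Reuse tiles rendered under old_origin for a camera at new_origin
    (pure 2D translation). Returns the cached tile key plus the offset
    within it, or None if the needed tile was never rendered."""
    # Where does screen point (x, y) land in the old camera's space?
    ox = x + (new_origin[0] - old_origin[0])
    oy = y + (new_origin[1] - old_origin[1])
    key = (ox // TILE_SIZE, oy // TILE_SIZE)
    if key not in tile_cache:
        return None  # tile missing: must redo CPU tiling for this region
    return key, (ox % TILE_SIZE, oy % TILE_SIZE)

cache = {(0, 0): 'tile-data'}
print(reproject(cache, (0, 0), (5, 3), 2, 2))   # hit: tile (0, 0), offset (7, 5)
print(reproject(cache, (0, 0), (20, 0), 0, 0))  # miss: tile (1, 0) never rendered
```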
Replying to @pcwalton
So I'd say keep a healthy load on the CPU; the GPU is busy doing all them AR/VR things at high framerates :)
Replying to @rikarends
That’s the idea, yeah. I want to have an optional path to do everything on GPU, but only if it’s proven to be a win (i.e. if the CPU would otherwise be idle it’s not a win). There’s more overhead when doing everything on GPU in compute shader.
Replying to @pcwalton
I'd say having a high-performance GPU path for creating texture atlases for fonts is already a pretty great improvement for a browser stack. BTW, on my 1080 Ti the 'GPU time' didn't scale as much as I expected vs. my laptop; is it bound in some way?
The GPU time is inaccurate and counts a lot of CPU time ever since I added pipelining. Need to fix.
Replying to @pcwalton
I'm trying to get to a point where you can use GPU instancing on vector elements, maybe with a bit of vertex/pixel-shader instance styling. That way you can draw a gazillion visual graph elements without all that Bézier processing per item.
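(The instancing idea can be sketched as a data-layout question: tessellate one vector shape once, then pack per-instance styling into a flat buffer for a single instanced draw call. The attribute layout below, position plus scale plus RGBA, is purely hypothetical; it is not Makepad's or Pathfinder's actual format.)

```python
import struct

def pack_instances(instances):
    """Pack per-instance styling into a flat buffer for an instanced
    draw: one tessellated vector shape, N instances, each contributing
    7 little-endian floats: x, y, scale, r, g, b, a."""
    buf = bytearray()
    for inst in instances:
        buf += struct.pack('<7f', inst['x'], inst['y'], inst['scale'],
                           *inst['rgba'])
    return bytes(buf)

nodes = [
    {'x': 10.0, 'y': 20.0, 'scale': 1.0, 'rgba': (1.0, 0.0, 0.0, 1.0)},
    {'x': 30.0, 'y': 40.0, 'scale': 0.5, 'rgba': (0.0, 1.0, 0.0, 1.0)},
]
buf = pack_instances(nodes)
print(len(buf))  # 2 instances * 7 floats * 4 bytes = 56
```

The point of the layout is that the per-item cost on the GPU is one small attribute fetch per instance, rather than re-tessellating the Bézier outline for every graph element.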