noticing an interesting feedback loop in the web: all the major engines were built on software renderers, which leads to certain optimizations and perf characteristics, which leads to web pages that rely on those, which pushes web engines to need/want those opts, and so on
-
-
The percentages don't really matter; if important/major pages run fine in vanilla gecko but not webrender, but a bunch of oddball pages run great in webrender, I don't think that's a win (and I don't think we could politically sell shipping that either)
-
I actually pretty much entirely disagree with your take: the biggest problem is that we don’t control the OS compositor, so we need invalidation and so forth in order to get good energy efficiency. We’ve already proven that you can get good FPS in the repaint-everything case.
-
I'm pretty certain we haven't? Tons of cases where a page just slaps 5+ text-shadows on something and we fall over completely. *Even* if we cache the blurs, just compositing them is too expensive. glenn is heads-down working on picture caching because we have so many of these bugs!
-
I knew you were going to bring up that case :) That is easy to fix: just cache the blurs together. Much easier than picture caching. The reason why we need picture caching, in my view, is energy efficiency, not to get 60 FPS.
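(Editor's aside: the disagreement above comes down to per-frame compositing cost. A toy cost model, not any engine's actual code — the function and its unit costs are made up for illustration — shows why "cache the blurs together" helps: with N separately cached shadow layers you still pay N composites every frame, whereas caching them into one surface costs one blit per frame.)

```python
# Hypothetical cost model (not WebRender code): compare per-frame compositing
# cost of N separate blurred text-shadow layers vs. caching them together
# into a single surface that is blitted once per frame.
# Unit: one "op" = one blur pass or one layer composite.

def total_cost(num_shadows, num_frames, cached_together):
    cost = num_shadows  # blur each shadow once; the blurs themselves are cached
    if cached_together:
        # all shadows pre-composited into one cached surface:
        # one blit of that surface per frame
        cost += num_frames * 1
    else:
        # each cached blur is composited separately, every frame
        cost += num_frames * num_shadows
    return cost

# 5 shadows over 60 frames: 65 ops cached-together vs 305 ops separate
print(total_cost(5, 60, True), total_cost(5, 60, False))
```

Under this (simplified) model the win grows linearly with frame count, which is why the per-frame compositing cost, not the blur cost, dominates.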
-
caching the blurs together *is* picture caching?
-
Picture caching is a lot more than just text shadows. That’s why it’s been so much work to implement.
-
ok sure, but it's just "the right" solution to the text-shadow issue, in the same way that MIR was "the right" solution to borrowck, even though it wasn't a technical requirement :)
-
also energy efficiency is technically a requirement to hit 60fps -- my macbook pro can't even play fullscreen videos at 60fps when it starts thermal throttling :(