Noticing an interesting feedback loop in the web: all the major engines were built on software renderers, which leads to certain optimizations and perf characteristics, which leads to web pages that rely on those characteristics, which pushes web engines to need/want those optimizations, and so on.
I actually pretty much entirely disagree with your take—the biggest problem is that we don’t control the OS compositor, so we need invalidation and so forth in order to get good energy efficiency. We’ve already proven that you can get good FPS in the repaint-everything case.
-
I'm fairly certain we haven't. There are tons of cases where a page just slaps 5+ text-shadows on something and we fall over completely. Even if we cache the blurs, just compositing them is too expensive. Glenn is heads-down working on picture caching because we have so many of these bugs!
-
I knew you were going to bring up that case :) That is easy to fix: just cache the blurs together. Much easier than picture caching. The reason why we need picture caching, in my view, is energy efficiency, not to get 60 FPS.
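The cost difference the two replies are arguing about can be sketched with a toy model (everything here is illustrative, not code from either engine): caching each shadow blur separately amortizes the expensive blur itself, but each cached layer is still composited every frame, while "caching the blurs together" flattens the whole shadow stack into one surface, leaving a single composite per frame.

```python
# Toy cost model for the "5+ text-shadows" case in the thread.
# Purely illustrative; real engines (WebRender etc.) are far more complex.

def composites_per_frame(num_shadows: int, cache_blurs_together: bool) -> int:
    """Count composite operations an engine would do per frame.

    - Caching blurs individually: the blur is computed once, but each
      cached blur layer is still composited on every frame.
    - Caching the blurs together: all shadow layers are flattened into
      one cached surface, so only one composite remains per frame.
    """
    if cache_blurs_together:
        return 1  # one pre-flattened surface
    return num_shadows  # one composite per cached blur layer

# The thread's example: an element with 5 text-shadows.
per_layer = composites_per_frame(5, cache_blurs_together=False)  # 5 composites/frame
flattened = composites_per_frame(5, cache_blurs_together=True)   # 1 composite/frame
```

The model also hints at why the replies diverge: flattening fixes this one hot case cheaply, whereas picture caching generalizes the idea (cache and reuse whole subtrees) so that unchanged content is not repainted at all, which is what matters for energy efficiency rather than peak FPS.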