noticing an interesting feedback loop in the web: all the major engines were built on software renderers, which leads to certain optimizations and perf characteristics, which leads to web pages that rely on those characteristics, which pushes web engines to need/want those optimizations, and so on
-
e.g. drawing things is so expensive that it's worth it to do lots of book-keeping to avoid drawing anything twice. webdevs then see that incredibly complex but static things are 'free'. now web engines *need* aggressive caching because tons of pages have super complex static elements
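(To make the bookkeeping concrete, here's a minimal sketch of the "don't draw it twice" scheme a software-rendered engine leans on: each layer keeps a cached raster and only repaints when invalidated. This is not Gecko's actual code; `Layer`, `Bitmap`, and the method names are hypothetical stand-ins.)

```rust
// Sketch only: a retained layer cache with a dirty flag.
struct Bitmap {
    pixels: Vec<u32>, // ARGB pixels, width * height
    width: usize,
    height: usize,
}

struct Layer {
    width: usize,
    height: usize,
    dirty: bool,           // set when the page mutates content in this layer
    cache: Option<Bitmap>, // retained raster from the last paint
}

impl Layer {
    fn new(width: usize, height: usize) -> Self {
        Layer { width, height, dirty: true, cache: None }
    }

    // Something in the layer changed (DOM mutation, style change, ...).
    fn invalidate(&mut self) {
        self.dirty = true;
    }

    // Expensive on the CPU, so the engine goes to great lengths to avoid it.
    fn rasterize(&self) -> Bitmap {
        Bitmap {
            pixels: vec![0xFF_FF_FF_FF; self.width * self.height],
            width: self.width,
            height: self.height,
        }
    }

    // The payoff: a "super complex but static" layer costs one rasterize
    // up front and is then effectively free every frame afterwards.
    fn bitmap_for_frame(&mut self) -> &Bitmap {
        if self.dirty || self.cache.is_none() {
            self.cache = Some(self.rasterize());
            self.dirty = false;
        }
        self.cache.as_ref().unwrap()
    }
}

fn main() {
    let mut header = Layer::new(1920, 200);
    let _ = header.bitmap_for_frame(); // frame 1: pays the full paint cost
    let _ = header.bitmap_for_frame(); // frames 2..n: cached bitmap, no drawing
    header.invalidate();               // a mutation forces one repaint of this layer only
    let _ = header.bitmap_for_frame();
}
```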
-
this feedback loop has been the biggest blow against the original "dream" of webrender, which was that *maybe* the win from gpu rendering was enough that you could just draw a page from scratch at 60fps without async scrolling, cached layers, invalidation, etc. ya can't.
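(For contrast with the cached-layer sketch above, here's roughly what that "dream" looks like as a render loop: rebuild the whole display list and draw everything every frame, with no dirty tracking at all. The types and functions below are hypothetical illustrations, not the actual WebRender API.)

```rust
// Sketch only: immediate-mode, redraw-from-scratch rendering.
enum DisplayItem {
    Rect { x: f32, y: f32, w: f32, h: f32, color: u32 },
    Text { x: f32, y: f32, glyphs: String },
}

// Stand-in for translating the page into draw commands each frame.
fn build_display_list(scroll_y: f32) -> Vec<DisplayItem> {
    vec![
        DisplayItem::Rect { x: 0.0, y: -scroll_y, w: 1920.0, h: 200.0, color: 0xFF_22_22_22 },
        DisplayItem::Text { x: 20.0, y: 40.0 - scroll_y, glyphs: "hello".into() },
    ]
}

// Stand-in for the GPU backend consuming the list from scratch.
fn render(frame: u32, list: &[DisplayItem]) {
    println!("frame {frame}: drawing {} items from scratch", list.len());
}

fn main() {
    let mut scroll_y = 0.0;
    for frame in 0..3 {
        // Every frame: rebuild everything, draw everything. The thread's point
        // is that pages built around "static stuff is free" blow this budget.
        let list = build_display_list(scroll_y);
        render(frame, &list);
        scroll_y += 16.0;
    }
}
```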
-
Replying to @Gankra_
The vast majority of pages do just fine with WebRender when repainting every frame. There are a bunch that don’t, but there are also a bunch that perform badly with the traditional stack.
-
Replying to @pcwalton
The percentages don't really matter; if important/major pages run fine in vanilla gecko but not in webrender, while a bunch of oddball pages run great in webrender, I don't think that's a win (and I don't think we could politically sell shipping that either)
-
There are plenty of pages that run a lot better in WR that aren’t “oddball”. You’re just being pessimistic :)