Which practices in frameworks combined with which tools reduce (or eliminate) the cost of unused code? Which tools could we be building? Which asynchronous programming models help to delay the cost of code to the point where it's being used?
Deferred loading means waiting until the user interacts to fetch and evaluate code, showing a spinner at that point. Deferred eval means fetching code up front as inert content and evaluating it on demand. Lazy parsing means evaluating up front, but relying on the engine's lazy-parse heuristics to skip full parsing of function bodies until first call.
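A minimal sketch of the three strategies, assuming a hypothetical `fetchSource` helper standing in for the network fetch (names here are illustrative, not from the conversation):

```javascript
// Stand-in for a network fetch that resolves with module source text.
async function fetchSource() {
  return "globalThis.loaded = true;";
}

// 1. Deferred loading: fetch AND evaluate only on user interaction;
//    a spinner would be shown while the await below is pending.
async function onInteractDeferredLoad() {
  const src = await fetchSource();
  (0, eval)(src); // indirect eval: runs in the global scope
}

// 2. Deferred eval: fetch eagerly as inert text, evaluate on interaction,
//    so only the CPU cost (parse + eval) remains at interaction time.
const inertSource = fetchSource();
async function onInteractDeferredEval() {
  (0, eval)(await inertSource);
}

// 3. Lazy parsing: evaluate the script up front, but keep heavy code
//    inside functions so the engine's lazy-parse heuristics can defer
//    full parsing of the body until first call.
const runHeavy = () => { /* large body, parsed lazily by the engine */ };
```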
-
TTI matters a lot, but so does how long users have to wait for spinners, and how much these "deferral" techniques trigger jank from lazily deferred work.
-
Also, how much does "background fetching" affect these heuristics, where background fetch means optimize for TTI but download and eval the payload in the background (again, with the matrix of deferral options)
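A hedged sketch of the background-fetch idea: keep the initial bundle small for TTI, then pull the deferred payload during idle time. The `evalInBackground` flag stands in for the "matrix of deferral options" (fetch only, vs. fetch and eval in the background); `requestIdleCallback` is a real browser API, and the `setTimeout` fallback is for non-browser environments.

```javascript
// Use idle time in the browser; fall back to a macrotask elsewhere.
const idle = globalThis.requestIdleCallback ?? ((cb) => setTimeout(cb, 0));

function backgroundFetch(fetchSource, { evalInBackground = false } = {}) {
  return new Promise((resolve) => {
    idle(async () => {
      const src = await fetchSource();       // network cost in background
      if (evalInBackground) (0, eval)(src);  // optionally pay CPU now too
      resolve(src);
    });
  });
}
```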
-
These techniques have different tradeoffs depending on how expensive network is vs. CPU. And sometimes expensive means literal money.
-
These are all the questions I could use data on more than "how many bytes is a React hello world" as we work on the next iteration of Ember tooling.
-
Yep, definitely some good things to study here. Some are things we're starting to track and expose more seriously in the browser - like overall input latency metrics. Also we're studying the effect of lazy loading iframes and images, so that's a related piece...
-
But it sounds like a lot of what you're talking about is experimenting with frameworks and measuring at that level. E.g. I don't think we can automatically detect spinners in a generic way. Maybe a need for some co-ordination APIs though?
-
I'm hoping that in addition to First Input Delay, we'll get Worst Input Delay into the CrUX report. That should help capture some of what you're talking about. But custom telemetry from framework code seems strictly superior to me...
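A minimal sketch of the custom-telemetry idea: framework code tracking the worst input delay seen in a session. The aggregation here is an illustrative assumption; in a browser it would be fed from Event Timing API entries via `PerformanceObserver`, as shown in the trailing comment.

```javascript
// Running maximum of observed input delays, in milliseconds.
let worstInputDelay = 0;

function recordInputDelay(delayMs) {
  worstInputDelay = Math.max(worstInputDelay, delayMs);
}

// Browser wiring (not runnable outside a browser):
// new PerformanceObserver((list) => {
//   for (const entry of list.getEntries())
//     recordInputDelay(entry.processingStart - entry.startTime);
// }).observe({ type: "event", buffered: true });
```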
-
I'd be very interested in collaborating on this and adding support to Ember for anything we come up with.
End of conversation