For most sites, a good HTTP/2 CDN with server push would make them far faster than a from-scratch rewrite in hand-optimized JS.
Replying to @asolove
: what I'm seeing in traces is that these things are CPU bound. It's about how much script gets loaded and when.
3 replies 0 retweets 4 likes -
Replying to @slightlylate
: H/2 only helps when you're sending lots of smaller modules. I'm seeing huge roll-ups as the default. Totally screws perf.
2 replies 0 retweets 2 likes -
Replying to @slightlylate @asolove
It's why chrome is writing a new javascript interpreter, because people are shoving SO MUCH CODE in, on the average
1 reply 0 retweets 0 likes -
Replying to @Aranjedeath
: sort of? Interpreter helps w/ memory use. Most devices are constrained that way.
1 reply 0 retweets 1 like -
Replying to @slightlylate @asolove
I know the target is memory use reductions, but simplifying the js->bytecode pipeline will help with code throughput
1 reply 0 retweets 0 likes -
Replying to @Aranjedeath
: sort of. It's complicated. Interpreter will run w/ --no-lazy. Means we'll process more code, but only once (on avg).
1 reply 0 retweets 1 like -
Replying to @slightlylate @Aranjedeath
: previously, early-parse meant only JIT'd code we'd actually hit. Now processing all code. Similar real-world startup
1 reply 0 retweets 1 like -
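[A rough illustration of the lazy-vs-eager compile trade-off discussed above (an assumed example, not V8 internals): under lazy parsing the engine only pre-parses a function body at load time and does the full parse on first call, while `--no-lazy` forces the full parse of everything up front.]

```javascript
// With lazy parsing (the default), the engine pre-parses `rarelyUsed` at load
// time just enough to find where it ends, and fully parses/compiles it only
// if and when it is actually called.
function rarelyUsed() {
  return Array.from({ length: 5 }, (_, i) => i * i).reduce((a, b) => a + b, 0);
}

// A leading '(' has historically been a hint to some engines that the function
// runs immediately, so it gets compiled eagerly instead of being parsed twice.
const usedRightAway = (function () {
  return 2 + 2;
})();

console.log(usedRightAway); // 4
console.log(rarelyUsed());  // 30: 0 + 1 + 4 + 9 + 16
```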
Replying to @slightlylate @Aranjedeath
: biggest win is memory. This is not a small thing; has cross-cutting (positive) impacts.
1 reply 0 retweets 1 like -
Replying to @slightlylate @Aranjedeath
: also, unlike JIT'd code, can perhaps store bytecode in on-disk code cache for future use.
2 replies 0 retweets 1 like
Replying to @slightlylate @Aranjedeath
: we already store some things like this, big win on many large SW-using apps in CR53.