The fact that there's no reason for you (or any given individual dev) to switch to WCs today doesn't mean there's no value in the ecosystem as a whole eventually adopting a platform-native component model as the standards evolve.
Replying to @graynorton @dfabu and
In practice, the web ecosystem will never standardize on any particular userland solution, and shouldn't. Layered APIs don't really change this equation; they're just a new way for the platform to ship features, with a "pay-for-what-you-use" consumption model.
Replying to @graynorton @slightlylate and
The problem is that we won't standardize on custom elements with shadow DOM, either. Thanks to slot SSR rehydration issues, userland SSR solutions are faster to paint than the "standard" and just as fast to TTI.
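The rehydration issue mentioned above can be sketched. The `user-card` element and its markup here are hypothetical, but the shape of the problem is general: server-rendered HTML can serialize only the light DOM, so the shadow root, its styles, and its `<slot>` projection don't exist until the component's script runs on the client.

```html
<!-- Server output: only the light DOM can be serialized as HTML.
     The shadow root below is created by script, so first paint
     shows the raw, unprojected <span> until the JS arrives. -->
<user-card>
  <span slot="name">Ada Lovelace</span>
</user-card>

<script>
  // Hypothetical element definition; runs client-side only.
  customElements.define('user-card', class extends HTMLElement {
    constructor() {
      super();
      this.attachShadow({ mode: 'open' }).innerHTML = `
        <style>b { color: rebeccapurple; }</style>
        <b><slot name="name"></slot></b>`;
    }
  });
</script>
```

A userland SSR framework, by contrast, can serialize its fully rendered markup directly, which is why it can paint before any script loads.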
Replying to @dfabu @slightlylate and
It's not clear that the rehydration challenges with Shadow DOM are unsolvable. In the worst case (though I doubt it will come to that), it's back to the drawing board. Userland solutions are a good measuring stick for platform features, which inevitably have a longer arc.
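One direction being explored for making shadow DOM serializable is a declarative shadow root, where the server ships the shadow tree inline as a template and no script is needed for first paint. A sketch, assuming the `shadowrootmode` template attribute and the same hypothetical `user-card` element:

```html
<user-card>
  <!-- Parsed directly into a shadow root; no JS required to paint. -->
  <template shadowrootmode="open">
    <style>b { color: rebeccapurple; }</style>
    <b><slot name="name"></slot></b>
  </template>
  <!-- Light DOM content, projected into the slot at parse time. -->
  <span slot="name">Ada Lovelace</span>
</user-card>
```

Because the content stays in the light DOM and the shadow tree arrives as plain markup, first paint no longer waits on the component's script.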
Replying to @graynorton @dfabu and
I'm also skeptical of the long-term need for SSR, particularly as practiced today with ultra-long late pauses when massive JS bundles eventually arrive. The "do less work total" plan is a good one.
Replying to @slightlylate @graynorton and
I think SSR is important, but really all content should be in the light DOM for SEO anyway. Rehydrating shadow DOM should be a lower priority.
Replying to @jthoms1 @graynorton and
Again, if the crawlers run script and understand Shadow DOM, what's the pressing need to contort ourselves around this circa-2013 story of how content should be structured?
Replying to @slightlylate @graynorton and
Maybe my understanding is outdated then. Are you saying it shouldn't matter because all crawlers are just going to run the script anyway?
Replying to @jthoms1 @graynorton and
I don't think we're far away from that. Understanding (and reducing) that gap is worth doing.
Replying to @slightlylate @jthoms1 and
We are very far away from that. I work on a site with millions of pages; Google's crawler team has been explicit for years that they refuse to run our script on millions of pages. Thousands of pages, maybe, but not millions, not now, not ever. They tell big site owners to SSR.
I obviously don't speak for Search or the crawler team, but I'm curious how big your script payload is.