Yes, every page. On many sites you need to execute JavaScript to find links, and a lot of important content for that matter, so the answer is "yes".
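A minimal sketch of the problem being described, in TypeScript: the link below never appears in the server's HTML response, so a crawler that only fetches the source and does not render the page sees no outgoing link at all. (The container id and URL here are made up for illustration.)

```ts
// Inject a link at runtime: invisible to any crawler that does not
// execute JavaScript, because it is absent from the raw HTML source.
document.addEventListener("DOMContentLoaded", () => {
  const nav = document.getElementById("nav"); // hypothetical container
  if (nav) {
    const a = document.createElement("a");
    a.href = "/only-visible-after-rendering"; // hypothetical path
    a.textContent = "Found only after JS execution";
    nav.appendChild(a);
  }
});
```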
Here is a good example of how fast that is in practice. The site http://bugs.chromium.org recently migrated to a new UI that stores all of its text content inside shadow roots (shadowRoot). Only the evergreen Googlebot can index it, and currently only ~20-30 pages are being indexed per day...
Why is it so hard to talk about this issue? Let's face it: some people are using web components heavily in production, and this *performance* issue cannot be denied. SSR won't really help here, since HTML doesn't offer any way to serialize shadow-root content. /cc @JohnMu
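A short sketch of the serialization gap being claimed here (the element name x-card is made up for illustration): content rendered into a shadow root is not included when the host element is serialized, so server-side rendering has nothing to emit for it.

```ts
// A custom element that puts all of its content in a shadow root.
class XCard extends HTMLElement {
  constructor() {
    super();
    const shadow = this.attachShadow({ mode: "open" });
    shadow.innerHTML = "<p>This text lives only in the shadow root.</p>";
  }
}
customElements.define("x-card", XCard);

const card = document.createElement("x-card");
document.body.appendChild(card);

// Serialization covers only the light DOM: the paragraph above is
// missing, so a crawler that doesn't run JS and walk shadow roots
// never sees it.
console.log(card.outerHTML); // "<x-card></x-card>"
```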
Sorry, missed this. How are you measuring 20-30 pages per day?