I think the end result of the increasing use of client-side filtering is going to be a strong move towards tracking / advertising being heavily integrated into site functionality, without an easy way to remove it. Server-side middleware makes this easy for site operators.
But the party who wants the tracking can't trust that the site isn't falsifying data, and there's a huge commercial incentive to falsify it. That's why killing third party content is such a good approach.
It exploits the adversarial nature between the parties who are screwing the user over.
I don't think it makes much difference whether the code is pulled in by middleware on the server or by the browser. Either way, the site can interfere with it, and in both cases they can try to defend against that with obfuscation and constant churn.
The older approach of having sites include a script tag pulling in third-party analytics is no longer working well for them. They're already getting inaccurate, biased data due to the huge assortment of ad-hoc, best-effort anti-tracking mechanisms deployed.
For example, ubuntu.com has Google Analytics on it, but I wouldn't be surprised if >50% of visitors to the site are blocking it in their browsers. The people blocking it and those not blocking it also aren't the same populations, so these kinds of analytics aren't accurate or representative.
On the other hand, if they do it via their own servers and first-party code, which they may already do, they're going to get far more accurate and representative data. Third parties can still provide the tracking code / middleware instead of having the browser load it from them.
It's in their interest to do it through the same domain / connections, integrated into how the site works. For example, Reddit's redesign, like the approach of many overhauled news sites, made things very dynamic in the client, with lots of API calls to the server.
They get a ton of analytics data simply from the basic operation of the site, inseparable from its basic functionality. They can and do also integrate third-party analytics into what they serve from their own servers. Trying to identify and block it separately is a losing battle.
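To make the idea concrete, here's a minimal sketch of what first-party, server-side analytics middleware could look like. This is a hypothetical WSGI example of my own (the names `analytics_middleware`, `record`, and `events` are invented for illustration), not any particular vendor's implementation:

```python
import time


def analytics_middleware(app, record):
    """Hypothetical first-party analytics middleware.

    Every request is logged server-side before the real app handles it,
    so there is no separate third-party request for the browser to block.
    """
    def wrapped(environ, start_response):
        record({
            "ts": time.time(),
            "path": environ.get("PATH_INFO", ""),
            "ua": environ.get("HTTP_USER_AGENT", ""),
            "referer": environ.get("HTTP_REFERER", ""),
        })
        return app(environ, start_response)
    return wrapped


# Minimal demo app and in-memory event sink (a real deployment would
# forward events to an analytics backend instead of a Python list).
events = []

def app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html>page</html>"]

site = analytics_middleware(app, events.append)
```

From the browser's point of view, responses from `site` are indistinguishable from an untracked site: the logging happens entirely on the server, on the same domain and connection as the content itself.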
I also fully expect that sites are going to move towards serving obfuscated content that requires complex JavaScript to display anything, probably even wrapping basic content in DRM so they can invoke the terrible laws against bypassing DRM to forbid blocking the ads, etc.
Basically, I don't see a good long-term alternative to an approach like the Tor Browser's, though hopefully on a better base than Firefox and with a much more exhaustive approach to preventing fingerprinting and network-based attacks. I see this blacklisting as only a stopgap.

