
Why are third party scripts allowed at all if a site is honoring 'do not track'?

For instance, the EFF page about DNT - https://www.eff.org/pages/understanding-effs-do-not-track-po... has an analytics link embedded in it. This fetches a pixel gif from https://anon-stats.eff.org/piwik.php?idsite=1&rec=1&url=http...

Why is this still allowed? What analytics are even useful when DNT is being honored? Could there be a Google Analytics-style service that honors DNT, then?




anon-stats.eff.org is a subdomain of eff.org. It's not a third party.


This is Piwik, open source self-hosted analytics, and it does honor DNT by default.


I trust that it is honoring DNT, but I can't see what use it can be if that is so. What more can it be doing over & above the weblogs that my original request would have created?


Nothing; it discards your request. It only records the hit when DNT is disabled. Since Piwik already drops DNT requests by itself, I guess nobody bothered to suppress the snippet server-side when DNT is detected.
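
Suppressing it server-side would be simple enough; rough sketch (Node-style, the page-rendering bits here are made-up placeholders):

    // Rough sketch: only emit the Piwik pixel when the visitor hasn't set DNT.
    // "renderPage" and the snippet below are made-up placeholders.
    import { createServer } from "http";

    const piwikSnippet =
      '<img src="https://anon-stats.eff.org/piwik.php?idsite=1&rec=1" alt="" />';

    function renderPage(): string {
      return "<html><body>page content</body></html>";
    }

    createServer((req, res) => {
      const dntOn = req.headers["dnt"] === "1";
      res.writeHead(200, { "Content-Type": "text/html" });
      res.end(renderPage() + (dntOn ? "" : piwikSnippet));
    }).listen(8080);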


That's a shame. Although I guess it is difficult to avoid the request if you have lots of static pages with the script link in them.

Perhaps we could have something like a <link> or <img> track flag, so my browser could decide whether or not to fetch the link based upon DNT settings?
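Until something like that exists, you can approximate it with a couple of lines of script on the page itself; rough sketch (navigator.doNotTrack isn't exposed by every browser, so treat it as best effort):

    // Sketch: only create the analytics pixel when the user hasn't enabled DNT.
    function maybeTrack(pixelUrl: string): void {
      const dnt = (navigator as { doNotTrack?: string | null }).doNotTrack;
      if (dnt === "1") {
        return; // honor DNT: never issue the request at all
      }
      const img = new Image();
      img.src = pixelUrl;
    }

    maybeTrack("https://anon-stats.eff.org/piwik.php?idsite=1&rec=1");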


The problem is that DNT isn't verifiable. DNT should be enforced in the browser, and by the original website, by not loading the tracking scripts at all.

At the moment, how it works is that the DNT header is attached to outgoing requests and you "hope" that the third party discards them.
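
i.e. the best case is that the tracking endpoint does something like this on its end (sketch; a well-behaved tracker just drops the hit, and you have no way of verifying that it actually does):

    // Sketch of what "hoping the 3rd party discards them" amounts to:
    // the endpoint sees the DNT header and simply doesn't record the hit.
    import { createServer } from "http";

    // Made-up placeholder for whatever storage the analytics backend uses.
    function recordHit(url: string, ip: string): void {
      console.log("hit", url, ip);
    }

    createServer((req, res) => {
      if (req.headers["dnt"] !== "1") {
        recordHit(req.url ?? "/", req.socket.remoteAddress ?? "");
      }
      // Always answer with the pixel, tracked or not.
      res.writeHead(200, { "Content-Type": "image/gif" });
      res.end();
    }).listen(8081);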


You do know that you don't need a dedicated tracking script to track someone server-side?


Eh? They know that someone accessed resource X from IP address Y; that's not tracking.


They know that someone accessed resource X from IP address Y from browser Z with language preferences A, encoding preferences B, SSL cipher suites C (when https is used), DNT preferences D, sometimes also protocol support E (when upgrading to SPDY or HTTP/2.0), having a TCP/IP fingerprint of OS F. I guess I still haven't covered it all.
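
Put together, that's already a fairly stable passive fingerprint, no client-side script required; rough sketch of the idea (just hashing what every request carries anyway):

    // Sketch: derive a passive fingerprint from what the server already sees
    // on every request; no tracking script involved.
    import { createHash } from "crypto";
    import type { IncomingMessage } from "http";

    function passiveFingerprint(req: IncomingMessage): string {
      const parts = [
        req.socket.remoteAddress,         // IP address Y
        req.headers["user-agent"],        // browser Z (and OS hints)
        req.headers["accept-language"],   // language preferences A
        req.headers["accept-encoding"],   // encoding preferences B
        req.headers["dnt"],               // DNT preference D, ironically
      ];
      // TLS cipher suites and TCP/IP quirks add more bits below this layer.
      return createHash("sha256").update(parts.join("|")).digest("hex");
    }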


Exactly, that's the sole purpose and that's how DNT works. It's nothing more than a voice "I don't wish to be tracked" and an expectation to be listened to.


Hmm... the idea of blocking all third-party trackers by default and only allowing second-party trackers (also with a few restrictions) could be interesting.

I'm guessing this is not what the EFF is doing here, though.


Most sites, almost all, would break if browsers suddenly disallowed third party scripts.

If you're asking why it is "allowed" by their policy, I'd argue it really boils down to whether a script is tracking or not. For example, a CDN is a third party, but CDNs often aren't designed for tracking.


> Most sites, almost all, would break if browsers suddenly disallowed third party scripts

You would be surprised how many sites still render fine with 3rd-party scripts blocked. For instance, the OP article on "threatpost.com" rendered just fine with all 3rd-party scripts blocked. Actually, the page rendered even better this way, because the links on the page worked just fine, which is not the case when allowing 3rd-party scripts from "netdna-ssl.com".

Also, "break" can be defined many ways here, depending on whether you want a page to just render fine, or whether you want all features on a page to work properly[1].

And even without blocking all 3rd-party scripts, it is very beneficial to at least block 3rd-party scripts from ubiquitous sources[2], if only for page load speed.

[1] "all features" as in "all features which enhance a page arguably to the benefit of the user".

[2] Example of ubiquitous 3rd parties: https://github.com/gorhill/uBlock/wiki/Blocking-mode#raw-dat...


> Most sites, almost all, would break if browsers suddenly disallowed third party scripts.

I don't find that to be so. I use Firefox with AdBlock, NoScript, RefControl and Disable WebRTC. In NoScript, I've flagged third-party ad servers and trackers as untrusted. In RefControl, I block third-party referer by default. I use private browsing mode, and accept third-party cookies from visited sites, but all cookies are deleted when Firefox closes.

I do allow useful content from third-party scripts. But that doesn't include crap from Facebook etc. And it's very rare that I need to enable other third-party scripts to get what I want from sites. If there's something truly evil that I want to read, I fire up a LiveCD VM.


> I do allow useful content from third-party scripts.

And that's why they work. You allowed things like Google's and jQuery's CDNs. If browsers blocked third-party scripts outright, tons of sites would break.

You cannot use a bunch of addons that whitelist/blacklist content and assume that is remotely in the same ballpark as outright blocking third-party scripts. Those lists are heavily curated to make sure that stuff doesn't break.


I use RequestPolicy, a plugin that lets you control, per domain, which third-party scripts are allowed to be loaded. It is very lightly curated; it ships with a small set of default rules that allow sites to load "third party" domains they control (e.g. their own CDN).

I have found that most sites (including the one hosting the article being discussed) work well enough to read, or even render completely true to the author's intent (depending on whether CSS is self-hosted or on a CDN I haven't whitelisted). I even find that it is often more pleasant to have a basic experience when CSS doesn't load.

Sites that require me to allow a lot of third-party scripts and content to work force me to weigh the expected usefulness of the content vs the scumminess of the site. The creepier and more plentiful the third parties are, the more likely I am to press the X...

I very rarely have to allow Google and jQuery in my day-to-day browsing.

