Hacker News

I doubt it. As explained by GitHub user tbrandirali, the stated goals seem to be inherently contradictory. Quoting in part:

"This internal contradiction is further demonstrated by the fact that the proposed solution to prevent misuse by websites - holdbacks - is to simply sabotage the functionality of the system itself, by making attestation probabilistic. This is not a workable solution to the problem: if the holdback rate of requests is low enough, the denial of service to legitimate users will simply be a cost of business that websites will accept; if instead it is high enough, websites will not use this system as it does not provide meaningful enough information, even for analytics purposes, due to the high uncertainty. There is no goldilocks zone where this system is useful but not open to abuse by implementer websites. You're either implementing a feature that can - and most likely will - be used by websites to exclude unattested clients, or you're implementing a useless feature."

https://github.com/RupertBenWiser/Web-Environment-Integrity/...



Why wouldn't a 10% holdback work? Would a company consider it "simply a cost of business" to block 10% of people at random? That's going to cause a huge amount of support load and probably a lot of negative press. 90% of data will still be good for analytics.


If the holdback rate is low enough that the data is "good enough for analytics", then you may as well not have a holdback rate at all.
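The tradeoff both comments are circling can be made concrete with a back-of-the-envelope simulation. This is a hypothetical sketch, not the actual WEI mechanism: it assumes a browser that randomly withholds the attestation token at a fixed rate, and a site that hard-blocks any request arriving without one.

```python
import random

def attest(is_real_client: bool, holdback_rate: float = 0.10) -> bool:
    """Hypothetical WEI-style check with a probabilistic holdback.

    A legitimate client passes attestation except when the browser
    randomly withholds the token; an unattested client never has one.
    """
    if not is_real_client:
        return False
    return random.random() >= holdback_rate

random.seed(0)
trials = 100_000

# Share of legitimate users a hard-blocking site would reject:
blocked_legit = sum(not attest(True) for _ in range(trials)) / trials
print(f"legitimate users blocked: {blocked_legit:.1%}")

# Unattested clients are rejected every time regardless of holdback:
print(all(not attest(False) for _ in range(1_000)))
```

With a 10% holdback the site still rejects every unattested client while losing roughly one in ten legitimate visitors, which is exactly the "cost of business" scenario the quoted comment describes; and for analytics, 90% of signals remain usable, which is the point the second reply pushes back on.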



