
Caching across sites is a privacy risk in itself: scripts can measure how long a resource takes to load and thereby detect whether a visitor has previously visited another site that uses the same resource. That's why modern browsers no longer cache across sites.

https://news.ycombinator.com/item?id=24894135
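As a rough illustration of the side channel (the URL, threshold, and function name here are made up, and partitioned caches in current browsers block this across sites), a script can time a fetch and treat a very fast completion as a cache hit:

    // Hypothetical probe: guess whether a resource is already in the cache by timing it.
    async function probablyCached(url: string): Promise<boolean> {
      const start = performance.now();
      // no-cors lets us time an opaque cross-origin response without reading its body
      await fetch(url, { mode: "no-cors", cache: "force-cache" });
      const elapsed = performance.now() - start;
      return elapsed < 10; // a cache hit returns in a few ms; a network fetch takes far longer
    }

    // probablyCached("https://cdn.example/shared-widget.js").then(hit => console.log(hit));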




Why not add a random 1000-3000 ms delay to making the cached resource available? Timing attacks are not a new phenomenon.


Because the point of a cache is to save time, not waste it. Like most naive delays added in response to timing attacks, it also doesn't solve the tracking problem: if there is any detectable difference under any circumstances (consider a cross-site tracking server that serves the content with a controllable delay, or varying network and disk load), the mitigation is defeated.

Sites don’t share that many resources byte-for-byte anyway. The current solution is fine.


Caches also save bandwidth: for the user, for the server, and for the potentially overloaded network in between.


Random delays don’t stop timing attacks. You just need to gather more samples before your estimate of the underlying, unrandomized timing is good enough to draw conclusions from.
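A toy sketch of why (all numbers and names here are hypothetical): a uniform 1000-3000 ms jitter only shifts the observed time by a constant mean, so averaging enough probes still separates a fast cache hit from a slower network fetch:

    // Toy numbers: a "hit" takes ~5 ms, a "miss" ~150 ms, plus a uniform 1000-3000 ms delay.
    function observe(trueMs: number): number {
      return trueMs + 1000 + Math.random() * 2000; // added jitter has a fixed mean of 2000 ms
    }

    function meanOf(trueMs: number, samples: number): number {
      let total = 0;
      for (let i = 0; i < samples; i++) total += observe(trueMs);
      return total / samples;
    }

    const hitMean = meanOf(5, 1000);    // converges toward ~2005 ms
    const missMean = meanOf(150, 1000); // converges toward ~2150 ms
    console.log(hitMean < missMean);    // the 145 ms gap survives the jitter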



