Caching across sites is a privacy risk in itself: a script can measure how long a resource takes to load and thereby detect whether the visitor has previously visited another site that uses the same resource. That's why modern browsers no longer cache across sites.
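A minimal sketch of the kind of probe being described, assuming a browser that still shared its cache across sites; the URL and the 5 ms threshold are illustrative assumptions, not real values:

    // Hypothetical probe, run from an attacker's page. If the resource
    // loads in well under a network round trip, the visitor has almost
    // certainly fetched it before on some site that embeds it.
    async function probablyCached(url: string): Promise<boolean> {
      const start = performance.now();
      // no-cors lets us time a cross-origin load without reading the body
      await fetch(url, { mode: "no-cors", cache: "default" });
      const elapsed = performance.now() - start;
      return elapsed < 5; // 5 ms cutoff is an illustrative assumption
    }

    probablyCached("https://third-party.example/widget.js")
      .then(hit => console.log(hit ? "likely visited before" : "likely new"));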
Because the point of a cache is to save time, not waste it. And like most naïve delay-based responses to timing attacks, it doesn't actually solve the tracking problem: if there is any detectable difference under any circumstances – consider a cross-site tracking server that serves the content with a controllable delay, or varying network and disk load – the mitigation is defeated.
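To make the "controllable delay" point concrete, here is a hypothetical tracking server (a Node sketch; the port, delay, and payload are illustrative). It serves its shared resource with a deliberate 3-second delay, so a cache hit returns instantly while a network fetch takes at least 3 seconds, a gap that survives any plausible client-side jitter:

    // Hypothetical tracking server; port and delay are illustrative.
    import { createServer } from "node:http";

    createServer((req, res) => {
      setTimeout(() => {
        res.setHeader("Content-Type", "application/javascript");
        // long max-age so the resource lands in the (shared) cache
        res.setHeader("Cache-Control", "public, max-age=31536000");
        res.end("/* tracker payload */");
      }, 3000); // attacker-controlled delay: cache hits skip it entirely
    }).listen(8080);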
Sites don't share that many resources byte-for-byte anyway, so per-site cache partitioning gives up little in practice. The current solution is fine.
Random delays don't stop timing attacks; they just mean you need to gather more samples before your estimate of the "unrandomized timing" is good enough to draw conclusions.
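An illustrative simulation (all timings are assumptions): even with a uniform random delay of up to 50 ms added to every load, both distributions shift by the same ~25 ms mean, so averaging enough measurements keeps cached and uncached loads separable:

    // Illustrative simulation; all timings are assumptions.
    function measure(baseMs: number): number {
      const jitter = Math.random() * 2;       // ordinary measurement noise
      const mitigation = Math.random() * 50;  // hypothetical random delay
      return baseMs + jitter + mitigation;
    }

    function meanOf(samples: number, baseMs: number): number {
      let total = 0;
      for (let i = 0; i < samples; i++) total += measure(baseMs);
      return total / samples;
    }

    // Cache hit ~1 ms vs. network fetch ~80 ms: the random delay shifts
    // both means by the same ~25 ms, so the gap survives averaging.
    console.log("cached:  ", meanOf(10_000, 1).toFixed(1), "ms");
    console.log("uncached:", meanOf(10_000, 80).toFixed(1), "ms");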
https://news.ycombinator.com/item?id=24894135