
It's BS and has been since forever

https://games.greggman.com/game/panopticlick-hyperbole/

TL;DR they don't get enough traffic for the numbers to represent anything useful.



It may be BS, but that analysis seems too shallow to be at all convincing either. My big question is how much resolution you can get out of things like canvas fingerprinting and WebGL (or even sheer JS speed). Those are places where the analog reality underlying everything we develop on may peek through unless explicit (and inherently performance-reducing) measures are taken. Even if two CPUs/GPUs/SoCs are fabricated on the same process and make it into the same bin, that doesn't make them identical in performance at the level fingerprinting cares about. In terms of user experience, essentially nobody is going to care about +/- single digit megahertz, or exactly how quickly upclocks/downclocks happen, or sub-1 FPS changes in edge case rendering or the like. But those could certainly leak bits that are unique to a specific chip, even between two devices that are "the same". Silicon fabrication is a probabilistic process, and bins are normally a matter of economics and performance, not privacy.

Normally devs want everything running as fast as possible, and just leave it at that. But the more closely the software tracks the underlying hardware, the easier it would be to fingerprint too. And this is one area where iOS devices might well be at a disadvantage for a straightforward "best experience" implementation, precisely because they've put a lot of effort into minimizing things that can interfere with whatever is in the foreground.

I don't disagree that the relatively small (and undoubtedly skewed from the general population) size of the EFF's overall dataset is a limit for them, but "I have no idea but it seems fishy because my iPhone should be identical because I say so" isn't an analysis.


>My big question is how much resolution you can get out of things like canvas fingerprinting and WebGL (or even sheer JS speed). Those are places where the analog reality underlying everything we develop on may peek through unless explicit (and inherently performance-reducing) measures are taken.

I'm not sure how the "analog reality" applies here. The CPUs and GPUs generate discrete results, and behave identically to other chips of the same model. You talked about variations in performance, but is there evidence that Apple does this with iPhones? They could very well be running them at lower clocks than what they're capable of, i.e. the chips come out of the fab being able to run at 1.6-1.8 GHz, but Apple runs all of them at 1.6 GHz. Finally, even if the performance variation is there, the difference will have to be big enough that it doesn't get drowned out by noise or other environmental variations. A phone that has been in a pocket would perform worse than one that's been sitting on a desk, because it's probably 10 degrees warmer, which means 10 fewer degrees of thermal headroom.
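Whether a per-chip delta survives that environmental jitter is a signal-to-noise question, and repeated sampling shifts the balance. A toy simulation, with entirely made-up numbers (the 20 ms base, 0.2 ms chip delta, and 2 ms jitter are assumptions for illustration, not measurements):

```javascript
// Simulate timing a workload on one chip: a nominal base time, a fixed
// per-chip offset, and uniform environmental jitter (thermals, scheduler).
// Returns the median of n samples, which damps the jitter.
function medianTiming(chipDeltaMs, noiseMs, n) {
  const base = 20; // assumed nominal workload time in ms
  const samples = [];
  for (let i = 0; i < n; i++) {
    samples.push(base + chipDeltaMs + (Math.random() - 0.5) * 2 * noiseMs);
  }
  samples.sort((a, b) => a - b);
  return samples[Math.floor(n / 2)];
}

// A single sample of a 0.2 ms delta is buried under 2 ms of jitter,
// but the median of many samples can recover it.
const chipA = medianTiming(0.0, 2.0, 10001);
const chipB = medianTiming(0.2, 2.0, 10001);
console.log((chipB - chipA).toFixed(2));
```

So "drowned out in the noise" holds for any single measurement, but a script that is willing to sample for a while can still pull a small, stable offset back out.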

>I don't disagree that the relatively small (and undoubtedly skewed from the general population) size of the EFF's overall dataset is a limit for them, but "I have no idea but it seems fishy because my iPhone should be identical because I say so" isn't an analysis.

But the EFF site tells you exactly what they're fingerprinting, and they're not fingerprinting performance. What you described might be possible, but is irrelevant to the discussion.



