Can someone paste their results (or at least bits of fingerprinting entropy) from https://panopticlick.eff.org with the latest Firefox?
With the fancy new anti-fingerprinting Safari on macOS Mojave I get just over 14.5 bits of entropy with the most entropic source being my canvas fingerprint (1 in 600).
With Safari on iOS I get 11.71 bits of entropy, with the most entropic value being my screen size and color depth.
I think it's funny that panopticlick gives me a little red X for not allowing trackers from companies that have "promised" not to track me. I have no incentive to do so, as I do not get any sort of compensation if they are found to be in violation of those terms.
Just sticking this comment here since I think most people would like to see how the site works. People are interpreting the numbers incorrectly, as I also did at first even though it says at the top exactly how they're measuring these numbers.
Your entropy is determined exclusively by the people that have used the site in the past 45 days. Right now that number is about 204k. So for instance, if you see that something has an entropy of 9.08 (as my user agent does), you'd also see that it says 1 in 542.15 browsers have this value: 2^9.08 ~= 542.15. The ~ is there only because the thousandths and later digits are not shown; my exact entropy would be about 9.08254825596. All that means is that of the ~204k people that have used the site in the past 45 days, about 377 had the same user agent.
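The arithmetic the parent describes can be checked directly: a value shared by `count` of `total` visitors carries -log2(count/total) bits of surprisal. A minimal sketch using the rounded counts quoted above (the site's own 542.15 comes from its exact visitor count, so the rounded inputs land slightly off):

```javascript
// Surprisal (self-information) of a fingerprint value: a value shared
// by `count` of `total` visitors carries -log2(count / total) bits.
function surprisalBits(count, total) {
  return -Math.log2(count / total);
}

// Rounded numbers quoted above: ~204k visitors, ~377 sharing one UA.
const bits = surprisalBits(377, 204000);
console.log(bits.toFixed(2));              // "9.08"
console.log(Math.pow(2, bits).toFixed(2)); // "541.11" — i.e. 1 in ~542 browsers
```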
The problem with this is that the people using this site are a heavily biased sample. So by tuning your browser to reduce your entropy, you are not actually reducing your trackability but instead making yourself look more like the subset of people who actively use this site. This becomes an even bigger problem because the site is presumably shared mostly on technically oriented sites, such as this one, and the settings of technically oriented users often differ significantly from the settings of the other 99% of users.
The point is that by working to reduce your entropy on this site you may, ironically, end up making yourself more trackable. So the numbers should be taken not as a measurement of trackability, but as an interesting insight into how your browser and settings compare with those of the site's other users.
---
Also, 100% agreed on the silliness of them marking you down for not allowing trackers from companies that have promised to honor Do Not Track. Until such promises are enforced, in code and ubiquitously, they're meaningless, unenforceable commitments that rely on tracking and advertising corporations never lying.
17.62 bits on firefox, 11.0 on Tor, 17.63 on chrome.
On firefox, the big contributors are HTTP headers (my native language is announced), hash of WebGl fingerprint and time zone.
On Tor big contributors are hash of webGL fingerprint, screen size.
On chrome, they are system fonts, hash of canvas fingerprint, user agent, and time zone.
I am not too concerned about the fingerprinting in firefox since I have strict blocking on, ublock origin, and separate containers for facebook and google. Based on the small amount of data facebook has on me, all the blocking is working pretty well.
Similar results for me. Does anyone know if it's possible to turn off WebGL, and if so, how? AFAIK I never use it for anything and I'd rather have increased anonymity. (Assuming disabling it prevents it from being used for fingerprinting.)
Edit: Answering my own question. In `about:config`, change the `webgl.disabled` preference from `false` to `true`. This reduced the "bits of identifying information" from WebGL from 11.26 to 2.56.
CanvasBlocker actually increases your trackability, because the consistent factor is now that you have a changing canvas fingerprint (which almost no one has).
This is why Safari tries to give a universal canvas fingerprint so you can "blend in" with other users.
I agree that a universal canvas fingerprint is better in principle, but practically who is going to write a script to search for all visitors who only differ by their canvas fingerprint and then identify them as one browser because the fingerprints are non-standard?
Practically, it requires little more work than creating the canvas-fingerprinting framework itself. If someone puts in the effort to write a framework that tracks you via canvas fingerprints, it's little extra work to add a step that performs a simple diff to find people trying to evade it.
Panopticlick's numbers are extremely confusing and borderline useless.
On my initial run, I got an overall entropy of 17.63. My two biggest identifiers were screen resolution (1000x595x24 which was approx 1/22000 browsers) and webgl hash (approx 1/3800 browsers). I fixed screen resolution to 1000x600x24 (approx 1/85 browsers) and disabled webgl hashing (approx 1/6 browsers) and the overall entropy did not change one iota, despite also closing browser, flushing cache and cookies, etc. I gave it another run with a deliberately weird resolution (1420x701 which was something like 1/105000 browsers) and once again, the overall entropy was exactly 17.63. So based on my experiment, it seems that screen resolution and webgl hash have no effect whatsoever on [Panopticlick's] overall entropy score.
An update on last night's experiment, if anyone cares. The next largest identifier was system fonts (approx 1/1300 browsers). I set `browser.display.use_document_fonts=0` which hid the system fonts (now the same as approx 1/10 browsers) and my overall entropy dropped to just below 11 bits. At this point, none of the metrics were less common than 1/10 browsers, so I figured I wouldn't be able to do better than that.
As a side note, I ended up re-enabling system fonts because disabling them broke a large percentage of web sites' CSS.
The numbers don't make much sense to me. On FF I get 14.05 with NoScript active. Curiously the headers increase from 1.68 bits to 3.47 when NoScript is running.
I'm curious about the difference between things like NoScript and native Brave script blocking.
In particular I was going to make a snarky comment that the site seems to, appropriately, not work when script blocking is enabled on Brave. I do get the site to do the refresh business a couple of times, but no results are ever displayed.
> On Tor big contributors are hash of webGL fingerprint, screen size.
Doesn't tor randomise the window size on startup? Though I guess it chooses some sensible size for your screen which is then leaking info about your screen size (in a pretty indirect way).
Not quite correct. It automatically picks the browser window size based on the monitor it's being displayed on, in some multiple of 200x100. There is no randomization on every run.
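The multiples-of-200x100 bucketing described above can be sketched as follows. The 1000x1000 cap matches the default Tor Browser window size mentioned elsewhere in this thread; the exact rounding and caps are assumptions that vary by Tor Browser version:

```javascript
// Sketch of Tor-style window-size bucketing (assumed behavior: round
// the available screen area down to multiples of 200x100, capped at
// 1000x1000; exact values vary by Tor Browser version).
function letterboxedSize(availWidth, availHeight) {
  const w = Math.min(Math.floor(availWidth / 200) * 200, 1000);
  const h = Math.min(Math.floor(availHeight / 100) * 100, 1000);
  return [w, h];
}

console.log(letterboxedSize(1366, 768));  // [1000, 700]
console.log(letterboxedSize(2560, 1440)); // [1000, 1000]
```

Note that the bucketed size still leaks a few bits about the underlying screen, which is the indirect leak the parent comment is describing.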
I wonder if these fingerprint checks look for the more stealthy and sinister approaches, like localhost port scanning [1] and specific CSS selector behavior...?
True; in the current situation we can only "limit" fingerprinting. This is a result of the characteristics of the sandbox we use for the Web. Remove JavaScript, and most of these problems go away.
This is why I stay attached to making simple HTTP apps that don't require JS, but this is clearly not the direction the web is going at this time.
Please note: the fingerprinting protection in this blog post is different from the resistFingerprinting about:config pref which would affect your entropy bits on panopticlick.
Interestingly enough, uBlock Origin actually stops that site from working; it seems to break the fingerprinting step. If I disable uBlock, I get 16.63 bits of identifying information. Likewise the canvas fingerprint is the biggest contributor, in my case 1 in 101154.
You can draw with different fonts and background colors, then grab the raw pixel values and hash them. The hash will be different depending on the versions of fonts installed, the OS, the GPU, the browser's text rendering algorithms, and the subpixel order/orientation of the display.
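The readback-and-hash step described above can be sketched. In a browser the bytes would come from `ctx.getImageData(0, 0, w, h).data` on a canvas the script has drawn text onto; here a stand-in byte array is hashed with FNV-1a (chosen for brevity — real fingerprinting scripts typically use stronger hashes):

```javascript
// FNV-1a hash over raw pixel bytes. In a browser the bytes would come
// from ctx.getImageData(0, 0, w, h).data. Tiny rendering differences
// (fonts, GPU, anti-aliasing, subpixel order) change the resulting hash.
function fnv1a(bytes) {
  let h = 0x811c9dc5;
  for (const b of bytes) {
    h ^= b;
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h.toString(16).padStart(8, '0');
}

// Stand-in for pixel data: two buffers differing by a single subpixel
// value produce different fingerprints.
const pixelsA = Uint8Array.from([255, 0, 0, 255, 254, 1, 0, 255]);
const pixelsB = Uint8Array.from([255, 0, 0, 255, 253, 1, 0, 255]);
console.log(fnv1a(pixelsA) !== fnv1a(pixelsB)); // true
```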
They should have standardized the font rendering for canvas. For those rare graphs using canvas i can live without Cleartype and only use normal anti aliasing, and with modern high DPI displays you hardly even need that in the first place anyway.
I mean, one of the major reasons for using canvas over DOM is to get pixel perfect placement of things, like text connecting to an arrow. If your fonts suddenly change size that won't work anymore. SVG has the same problem, on some computers with slightly larger letter spacing a line might become too long so it wraps and become two lines, totally spoiling the desired diagram.
There are many legitimate uses. Vendors have experimented with making canvas readback opt-in (with a popup) but I don't know if it'll ever ship because it simply breaks too many websites. Sometimes it's used at page load to generate variants of a single image to reduce file sizes, or used by games to prepare image assets before they start up.
Longer: this is actually a viable way to do many types of fingerprinting, not just canvas. I'll give an example. In a graphics class I took our professor gave us output images to compare to. Two people with the same model computer, same specs, would frequently have a pixel or two different from one another. Change the specs and you're easily a dozen off. Worse than that, the pixels that are off from the original image can be different pixels. This comes down to the silicon lottery. So if you can think of anything that you can access where you can get the user's computer to do some sort of floating point calculation, you can probably get a fingerprint out of that.
So to fix this problem, you'd have to figure out not just how to make pixel-perfect images, but how to get two CPUs of different types (which even CPUs of the same type don't currently do) to always calculate the same answer to the same precision, every time. There are tricks that can be done, like rounding, but it gets hairy really fast and becomes impractical. But if you do know how to solve the problem, I'm sure people would really appreciate the answer.
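The same idea works without a canvas at all. JavaScript's transcendental Math functions are not required to be bit-exact across engines and libm builds, so a script can sample a few of them and fold the results into a tag. This is a hedged sketch of that technique, not any particular tracker's code:

```javascript
// "Math fingerprint" sketch: the low bits of transcendental results can
// differ per engine/platform. Sample a few, serialize at full double
// precision, and fold into a short hex tag.
function mathFingerprint() {
  const samples = [
    Math.sin(1e10), Math.cos(1e10), Math.tan(1e10),
    Math.exp(1), Math.log(1000), Math.pow(Math.PI, 50),
  ];
  const s = samples.map(v => v.toExponential(20)).join(',');
  let h = 0;
  for (let i = 0; i < s.length; i++) {
    h = (Math.imul(h, 31) + s.charCodeAt(i)) >>> 0;
  }
  return h.toString(16);
}

// Stable on one machine/engine; may differ on another.
console.log(mathFingerprint());
```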
Random, probably uninformed thought: I wonder if the solution could be LESS determinism rather than more. If you could make it so the same hardware rendered pixels in a slightly different (random) way each time, it would no longer be possible to determine if you were looking at the same machine.
That's an interesting idea, and it might be a good way to circumvent this problem, but there are drawbacks. There are probably a lot of cases where we only actually need FP16 accuracy (iterative methods can sometimes tolerate this, especially in ML) even though we call it FP32, but there are plenty of times where full FP32 precision matters. So it's highly dependent on the workload and where you can get away with less precision. And beyond that, how do you enforce it? I think it's interesting, though.
Firefox deploys two different forms of fingerprint protection:
1) blocking known fingerprinting scripts.
2) blocking underlying techniques.
The second is the one people here are talking about, and it can be enabled by going to your settings and turning on resistFingerprinting. Keep in mind it will do things like normalize your time zone and decrease timer precision.
Nope. The linked mozilla blog post clearly talks about the fingerprinting settings, and the comment I replied to did not specify anything related to "blocking underlying techniques".
This entire thread is filled with people comparing browser results on a site that breaks down underlying fingerprint techniques, while asking questions like, "is there a way for me to disable WebGL?", and "I wonder if they block localhost port scanning?".
I understand what the originally posted link is talking about, but the specific thread you're currently on is very clearly talking about more than whether or not Panopticlick's tracking script is blocked. They're talking about how well different browsers can resist the techniques it uses[0]. Why else would anyone be comparing their results to Tor?
If the thing in the OP doesn't do what a bunch of the comments here are discussing -- it is indeed important to point that out.
I had assumed it did, because why else would we be discussing it here, and neither the OP post nor the FF setting is very clear about what it does. So I would have been thinking it was protecting me.
Likewise. It makes no difference whether I enable or disable the Fingerprinters checkbox.
Maybe due to the "uBlock Origin stops it from working" mentioned elsewhere, or some other glitch. Disabling uBlock Origin on panopticlick.eff.org didn't make a difference.
>With the fancy new anti-fingerprinting Safari on macOS Mojave I get just over 14.5 bits of entropy with the most entropic source being my canvas fingerprint (1 in 600).
That's actually pretty good, considering Tor Browser (which has resistFingerprinting enabled) with the default window size (1000x1000) has 14.82 bits of entropy.
Firefox with uBlock Origin and JavaScript disabled: 9.81 bits, one in 900.21 browsers.
Google Chrome with uBlock Origin: at least 17.68 bits, unique among the 209,744.
iPhone 8, Safari, some adblocker: the same (unique). I should probably update the OS; I'm running a pretty old version.
Following this advice, I discovered an extension that automatically spoofs the user agent and does some other things. Haven't tested yet, but it seems to be actively maintained.
Since I'm on the topic, two other lesser-known extensions I have are CanvasBlocker (fakes canvas fingerprinting, or disables it) and Privacy Badger (heuristically detects and disables trackers; complements uBlock).
You have to stick with known popular user agents. To mitigate tracking by UA you need a randomized user agent that changes periodically. Panopticlick won't be able to account for that in its stats. It's not a good idea to switch UA on every request since it will be hard to diagnose breakage caused by a site that rejects particular UAs.
"Periodically changing user agent" sounds pretty unique if you ask me. Especially if the extension isn't super-clever and changes the user agent for accesses that happen on the same page.
And the fingerprinter could be super clever and look for features that your purported browser isn't supposed to support... And if your browser does support them, that's a strong identifier.
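That consistency check can be sketched as a pure function comparing the claimed UA family against observed feature flags. In a live fingerprinter the observations would come from probing browser-specific APIs; the boolean inputs and the function name here are illustrative:

```javascript
// Sketch of a UA-consistency check a fingerprinter might run. In a real
// script, `observed` would be built by probing for Chrome-only or
// Firefox-only APIs; here they are plain booleans for illustration.
function uaLooksSpoofed(userAgent, observed) {
  const claimsFirefox = /Firefox\//.test(userAgent);
  const claimsChrome = /Chrome\//.test(userAgent) && !claimsFirefox;
  // A "Firefox" exposing Chrome-only behavior (or vice versa) is a
  // strong signal the user agent string is fake.
  if (claimsFirefox && observed.hasChromeOnlyApis) return true;
  if (claimsChrome && observed.hasFirefoxOnlyApis) return true;
  return false;
}

console.log(uaLooksSpoofed(
  'Mozilla/5.0 ... Gecko/20100101 Firefox/64.0',
  { hasChromeOnlyApis: true, hasFirefoxOnlyApis: false }
)); // true — claims Firefox but behaves like Chrome
```

And note that "my UA string is inconsistent with my features" is itself a rare, highly identifying property.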
Unless you use some obscure browser, it is better to use your real user agent. If you keep your browser and operating system up to date, chances are it will be one of the most popular ones.
Your UA will correlate with other means of fingerprinting you, making you look more common. Being clever can make things worse.
For example, the most common UA is from an iPhone, but the most common screen width is 1920 pixel. If you decide to make your UA an iPhone with a 1920 pixel screen, then you will be easily identified.
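The iPhone-with-a-1920px-screen trap can be made concrete with the same surprisal arithmetic used elsewhere in the thread. The counts below are invented for illustration, not real Panopticlick data:

```javascript
// Identifying power of a *combination* of attributes: an individually
// common value can form a rare, highly identifying combination.
function comboBits(comboCount, total) {
  return -Math.log2(comboCount / total);
}

// Illustrative counts: suppose 20,000 of 200,000 visitors report an
// iPhone UA, but only 3 report an iPhone UA *and* a 1920px-wide screen
// (the combination is implausible, so spoofing made it rarer).
console.log(comboBits(20000, 200000).toFixed(2)); // "3.32" bits (UA alone)
console.log(comboBits(3, 200000).toFixed(2));     // "16.02" bits (combo)
```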
Panopticlick says I have "strong protection against Web tracking" but amiunique.org says I'm unique. Though amiunique also claims my Tor fingerprint is one in six.
I get very different results from each, and some don't quite make sense to me. For example, amiunique says my timezone is shared by 3.37% of visitors, while Panopticlick says 1 in 16 (about 6.25%), so half as identifying. But how is my timezone so identifying in the first place? I live on the west coast.