▲ Springtime 5 hours ago
> Disabling JavaScript actually greatly increases your fingerprint: not many users turn it off, so that instantly puts you in a much smaller bucket that you need to be unique in.

I've heard a handful of people say this, but are there examples of what I imagine would have to be server-side fingerprinting, and of its granularity? Most fingerprinting I'm aware of is client-side, running via JS. I'd expect server-side checks to be limited to things like which resources a particular user hasn't loaded, plus anything else normally available in server logs anyway, which could narrow the pool, but I wonder how effective that is for tracking uniqueness across sites.
▲ ranger_danger 2 hours ago
In addition to server-side bits like the IP address, request headers, and TLS/TCP fingerprints, there are client-side things you can do without JavaScript, such as media queries, either via CSS styles or elements that support them directly like <picture>. These can reveal things like installed fonts, screen size/type, and platform- or browser-specific identifiers. https://fingerprint.com/blog/disabling-javascript-wont-stop-...

There is also a method of fingerprinting using the favicon: https://github.com/jonasstrehle/supercookie
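A minimal sketch of both ideas, the passive server-side hash and the CSS media-query probes. The header set, the `/probe` endpoint, and the example IP are all illustrative, not from any real tracker:

```python
import hashlib

# 1) Passive server-side fingerprint: hash signals visible in every
#    request, no JS required. Real systems also fold in TLS-level
#    (JA3-style) and TCP-level traits; this header list is illustrative.
def passive_fingerprint(ip, headers):
    signals = [
        ip,
        headers.get("User-Agent", ""),
        headers.get("Accept-Language", ""),
        headers.get("Accept-Encoding", ""),
    ]
    return hashlib.sha256("|".join(signals).encode()).hexdigest()

# 2) CSS media-query probes: each rule only fires on a matching screen,
#    so the server learns the device width from which /probe URLs get
#    requested. "/probe" is a hypothetical logging endpoint.
def css_probes(widths):
    return "\n".join(
        f'@media (device-width: {w}px) '
        f'{{ body {{ background-image: url("/probe?w={w}"); }} }}'
        for w in widths
    )

print(passive_fingerprint("203.0.113.7", {"User-Agent": "Mozilla/5.0"})[:16])
print(css_probes([1280, 1920]))
```

The same probe trick generalizes to any media feature (resolution, color depth, pointer type), and to `@font-face` with local sources for font detection, each match leaking one more bit to the server log.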