lpapez 6 hours ago

Very cool research and wonderfully written.

I was expecting an ad for their product somewhere towards the end, but it wasn't there!

I do wonder though: why would this company report this vulnerability to Mozilla if their product is fingerprinting?

Isn't it better for the business (albeit unethical) to keep the vulnerability private, to differentiate from the competitors? For example, I don't see many threat actors burning their zero days through responsible disclosure!

valve1 6 hours ago | parent | next [-]

We don't use vulnerabilities in our products.

mtlynch 5 hours ago | parent | next [-]

I don't understand what you mean. What separates this from other fingerprinting techniques your company monetizes?

No software wants to be fingerprinted. If it did, it would offer an API with a stable identifier. All fingerprinting is exploiting unintended behavior of the target software or hardware.

giancarlostoro 5 hours ago | parent | next [-]

It makes sense to me, they're likely not trying to actually fingerprint Tor users. Those users will likely ignore ads, have JS disabled, etc. The real audience is people on the web using normal tooling.

Gigachad 2 hours ago | parent | next [-]

They can just flag all Tor users as high risk. They don't strictly need to fingerprint them when it's generally fine for websites to just block signups for Tor users or require further identification via phone number or something.

You want fingerprinting to identify low risk users to skip the inconvenient security checks.

baobabKoodaa 5 hours ago | parent | prev [-]

Uhh okay, so they do exploit vulnerabilities, they just try to target victims who can be served ads? What a weird distinction.

zamadatix 4 hours ago | parent | next [-]

Most users seem not to care about ad tech/tracking as much as technical users do. Even further, many seem to want to enable more tracking to [protect the children, or whatever the reason is] pretty regularly (at least in opinion polls about various legislation). Tor users are not at all like that, and could be harmed in a very different way... so I think it's fair to frame them differently, even if I'd personally say people should treat both as similar offenses, because neither should be seen as okay in my eyes.

godelski 22 minutes ago | parent | next [-]

  > Most users seem to not care about ad tech/tracking
I don't think this is true.

Most people don't understand that they're being tracked. The ones that do generally don't understand to what extent.

You tend to get one of two responses: surprise or apathy. When people say "what are you going to do?", they don't mean "I don't care"; they mean "I feel powerless to do anything about it, so I'll convince myself not to care or think about it". Honestly, the interpretation is fairly similar when people say "but my data isn't useful" or "so what, they sell me ads (I use an ad blocker)". Those responses are mental defenses to reduce cognitive overload.

If you don't buy my belief, then reframe the question to make things more apparent. Instead of asking people how they feel about Google or Meta tracking them, ask how they feel about the government or some random person doing it. "Would you be okay if I hired a PI to follow you around all day? They'll record who you talk to, when, how long, where you go, what you do, what you say, when you sleep, and everything down to what you ate for breakfast." The number of people who are okay with that plummets as soon as you change it from "Meta" to "some guy named Mark". You'll still get nervous jokes of "you're wasting money, I'm boring", but do you really think they wouldn't get upset if you actually hired a PI to do that?

The problem is people don't actually understand what's being recorded and what can be done with that information. If they did, they'd be outraged, because we're well beyond what 1984 proposed. In 1984 the government wasn't always watching; the premise was more a country-wide Panopticon, where the government could be watching at any time. We're well past that. Not only can the government and corporations do that, but they can look up historical records, and some data is always being recorded.

So the reason I don't buy the argument is that 1984 is so well known. If people didn't care, no one would know about that book. The problem is people still think we're headed towards 1984 and don't realize we're 20 years into that world.

pmontra 2 hours ago | parent | prev [-]

In my experience those users express a mix of surprise and irritation when they get ads about something they did minutes or hours before, but they accept that's the way things are.

I joke that I'm a no-app person, because I install very few apps and I use anti-tracking tech on my phone that's hard to explain or recommend to non-technical friends. I use Firefox with uMatrix and uBlock Origin, plus Blockada. uMatrix is effective but breaks so many sites unless one invests time in playing with the matrix. Blockada breaks many important apps (banking) unless one understands whitelisting.

exe34 5 hours ago | parent | prev | next [-]

Well presumably they want to make money.

adastra22 4 hours ago | parent | prev [-]

Painting fingerprinting as vulnerability exploit is your own very biased and very out-of-norm framing.

SiempreViernes 3 hours ago | parent | next [-]

Instead of trying to convince by assertion, maybe you could offer an actual objection to the argument raised up-thread?

On what basis do you claim that software developers, who did not establish a means for third parties to get a stable identifier, nevertheless intended that fingerprinting techniques should work?

strbean 3 hours ago | parent [-]

There's a pretty big difference between:

1) wanting functionality that isn't provided and working around that

and

2) restoring such functionality in the face of countermeasures

The absence of functionality isn't a clear signal of intent, while countermeasures against said functionality are.

And then there is the distinction between the intent of the software publisher and the intent of the user. There is a big ethical difference between "Mozilla doesn't want advertisers tracking their users" and "those users don't want to be tracked". If these guys want to draw the line at "if there is a signal from the user that they want privacy, we won't track them", I think that's reasonable.

maltelau an hour ago | parent [-]

The presence of the "Do Not Track" header was a pretty clear indicator of the intent of the user. Fingerprinting persisted exactly in the face of such countermeasures.
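For what it's worth, honoring that signal was always trivial on the tracker's side; a minimal sketch, assuming Node-style lowercased request headers (the `shouldTrack` gate is hypothetical):

```javascript
// Minimal sketch: gate any fingerprinting/analytics behind the DNT header.
// "DNT: 1" means the user asked not to be tracked; anything else falls through.
function shouldTrack(headers) {
  return headers['dnt'] !== '1';
}
```

The header is now deprecated (Firefox removed the setting, and Global Privacy Control is the proposed replacement), largely because trackers ignored it, which is exactly the point.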

foltik 4 hours ago | parent | prev [-]

How would you frame it?

sodality2 5 hours ago | parent | prev | next [-]

Side channels that enable intended behavior, versus a flat-out bug like the above, though the line can often be muddied by perspective.

An example that comes to mind that I've seen is an anonymous app that allows blocking users: you can programmatically block a user, query all posts, and diff the two sets to identify stable identities. The ability to block users is desired by the app developers; they just may not have intended this side effect, and there's no immediate fix for it. This is different from 'user_id' simply being returned by the API for no reason, which is a vulnerability. Then there's maybe the case of the user_id being returned for some reason that might be important but could be implemented more sensibly another way; that leans more towards vulnerability.
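The block-and-diff channel described above reduces to a set difference; a sketch with made-up post-ID lists (the data shapes and the idea of fetching the feed before/after a block are assumptions about such an app):

```javascript
// Block a target account, fetch the anonymous feed before and after,
// and diff the post IDs: posts that vanish belong to the blocked user,
// linking "anonymous" posts back to one stable identity.
function postsHiddenByBlock(feedBefore, feedAfter) {
  const after = new Set(feedAfter);
  return feedBefore.filter((id) => !after.has(id));
}
```

Repeating this per account enumerates authorship for the whole feed, which is why there's no quick fix short of redesigning how blocking works.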

Ultimately most fingerprinting technologies use features that are intended behavior: canvas/font rendering is useful for some web features (and targeting the web means you have to support a LOT of use cases), IP address/cookies/user agent are obviously useful, etc. (though there's some case to be made about Google, an advertising company, pushing for these features!).
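To illustrate why "intended behavior" signals are enough: a toy fingerprint just concatenates whatever signal strings you hand it and hashes them (32-bit FNV-1a here). The signal names are hypothetical; real products mix in canvas, font, and audio entropy on top of these.

```javascript
// Toy fingerprint: combine signal strings (user agent, language, canvas
// data URL, font list, ...) and hash with 32-bit FNV-1a. Stable inputs
// yield a stable identifier with no cookie involved.
function fingerprint(signals) {
  let h = 0x811c9dc5; // FNV offset basis
  for (const ch of signals.join('|')) {
    h ^= ch.codePointAt(0);
    h = Math.imul(h, 0x01000193) >>> 0; // FNV prime, kept unsigned
  }
  return h.toString(16);
}
```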

subscribed 4 hours ago | parent | prev | next [-]

Iffy vs grossly unethical.

rockskon 7 minutes ago | parent [-]

Someone discovering this and making it public doesn't mean others haven't independently discovered it.

OneDeuxTriSeiGo 4 hours ago | parent | prev [-]

A vulnerability is distinct from unintended behavior.

Unintended identification is less than ideal, but frankly it's just the nature of doing business, and any number of niceties are lost by aggressively avoiding fingerprinting.

In software intentionally optimized to avoid any fingerprinting, however, it is a vulnerability.

The distinction being that fingerprinting in general is a less-than-ideal side effect that costs you a minor loss of privacy, but in something like Tor Browser that same fingerprinting can be life or death for a whistleblower, etc. It's the distinction between an annoyance and an execution.

NoahZuniga 5 hours ago | parent | prev | next [-]

The real reason is that fingerprint.com's selling point is tracking over longer periods (months, their website claims), and this doesn't help them with that.

stackghost an hour ago | parent | prev | next [-]

All fingerprinting is a vulnerability, unless the client opts-in.

lyu07282 5 hours ago | parent | prev [-]

So it's the criminal that convinced themselves they are the good guys, I didn't expect that one. You are a malware company get a grip.

celsoazevedo 5 hours ago | parent | next [-]

Would you prefer that they kept this for themselves instead of disclosing it?

I get criticizing their business and what they do wrong, but it doesn't seem right to criticize them for doing the right thing.

trinsic2 4 hours ago | parent | next [-]

It means they are suspect. I think it's right to be wary of motives when someone is involved in the very thing they aim to bring awareness to. Questions arise in my mind as to why they would do something like this in the first place.

It's been my experience that the general public doesn't follow patterns, and instead focuses on which switch is toggled at any given moment for a company's ethical practices. This is the main reason we are constantly gamed by orgs that have a big-picture view of crowd psychology.

celsoazevedo 3 hours ago | parent [-]

I don't trust them more because of this and maybe they've disclosed it for the wrong reasons, like not allowing a competitor to use it when they don't, but at the end of the day they did disclose a serious issue, and that's good for users.

I understand where you're coming from, by the way, but sometimes the worst person you know does the right thing and it's not fair to criticize them for doing it (you could say nothing, don't have to change your opinion about them, etc). We also don't want someone to go "if I'm bad no matter what I do, then might as well make some money with this" and sell the exploit.

lyu07282 3 hours ago | parent | prev [-]

What are you even saying? It's like getting upset at somebody who criticizes a criminal because they once helped some grandma across the street. I'm not upset at the criminal because they helped a grandma across the street obviously that's not the fucking point.

celsoazevedo 2 hours ago | parent | next [-]

I'm not upset, I just don't think we should criticize someone for doing something good. Maybe they're a terrible org, maybe they deserve criticism most of the time, but not in this instance.

It's not like you can't point out that they did a good deed, but that they're still in the shitty business of fingerprinting users.

Also, if people only get the stick no matter what they do, then eventually some will embrace the dark side and at least make money out of it. And that's not good for you.

lyu07282 an hour ago | parent [-]

The inverse is also true, letting them whitewash their image by pretending they care about your privacy and seek to protect you will be good for their public relations, but only if we let them. I refuse to be this gullible and run to their defense for no apparent reason.

Vinnl 2 hours ago | parent | prev [-]

It's more like criticising a criminal when they are helping some grandma across the street, thereby treating them more harshly than the criminals that don't do that.

(Also known as the "Copenhagen Interpretation of Ethics": https://gwern.net/doc/philosophy/ethics/2015-06-24-jai-theco... )

somerset 3 hours ago | parent | prev [-]

Responsible disclosure and commercial fingerprinting aren't contradictory.

lyu07282 3 hours ago | parent [-]

Do you seriously not see the contradiction? I consider all methods that enable fingerprinting to be vulnerabilities that browsers should fix; if we did that, it would destroy their business. On top of that, a company like that shouldn't be allowed to exist as a legal entity in the first place, and it is very likely already operating in a legal grey area in a lot of places. It's the difference between a security company that provides IDS signatures as a service and does responsible disclosure vs. a malware company that offers 0-click exploits. Would you praise the NSO Group if they did responsible disclosure?

Fucking HN sheep

flufluflufluffy 2 hours ago | parent | next [-]

If you take their claim that they don’t use vulnerabilities in their products as true, then I don’t see a contradiction. If it isn’t true, then obviously there is a contradiction.

But considering all methods that enable fingerprinting to be vulnerabilities is your own opinion. There are definitely measurable signals that are based on a user's behavior, rather than data exposed by the browser itself.

kube-system an hour ago | parent | prev [-]

It's a little bit disingenuous to call intentional wont-fix features "vulnerabilities".

hrimfaxi 6 hours ago | parent | prev [-]

They probably aren't relying on it, and disclosure means others can't either.