| ▲ | matthewdgreen 10 hours ago |
| One of the major problems with on-device identifiers is that they must be tied tightly to devices, due to the risk of cloning. This is particularly true for privacy-preserving identifiers. That's why device attestation is so important: you can't ensure that identity (keys) are locked to a device unless you can verify that the hardware prevents users from extracting keys. The worst part of this is that motivated criminals will certainly figure out how to extract those keys and use them for fraud; it's open-source and open computing that will be destroyed by this. |
|
| ▲ | subscribed 10 hours ago | parent | next [-] |
| Yeah, but they aren't. Google certifies devices unpatched for the last 10 years, rooted, riddled with malware, because the keys have leaked. Google knows and still sells the lie. But you should know better. Google is not selling actual security; it's just protecting its business. |
| |
| ▲ | matthewdgreen 10 hours ago | parent [-] | | Google's business is advertising. Right now they don't care whether your phone is "authentic" or secure, because it doesn't cost them money. As AI-enabled bot fraud rises, they will care. Fighting this requires identifying human beings, and that requires trusted devices to be associated with human beings. We're in the foothills still, but look forward and up at where adtech is going. | | |
| ▲ | bronson 4 hours ago | parent [-] | | How is a trusted device associated with a human being? I'm pretty sure the walls of hundreds of bot phones are running trusted Android. | | |
| ▲ | matthewdgreen 3 hours ago | parent [-] | | By attaching your government ID to a (single) phone and verifying the human owns it by checking biometrics. You can try this today if you live in one of several US states and have a recent iOS/Android phone. This doesn't stop one real person from attaching their ID to one real phone and then abusing it for botting, but (if implemented well) it limits you to one-real-ID-one-bot-phone. |
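The one-real-ID-one-phone limit described above can be sketched roughly as follows. This is an illustrative assumption, not any real enrollment API: the registry never stores the raw ID, only a keyed hash of it, and rejects a second phone enrolling under the same ID. `REGISTRY_KEY`, `enroll`, and the data shapes are all hypothetical names invented for this sketch.

```python
import hashlib
import hmac

# Hypothetical registry-side sketch. The service keeps only a keyed hash
# ("tag") of each government ID, not the ID itself. The same ID always
# produces the same tag, so a second enrollment attempt is detected and
# refused, enforcing one-real-ID-one-phone.
REGISTRY_KEY = b"registry-secret-key"  # illustrative; held by the enrollment service

def enrollment_tag(id_number: str) -> str:
    """Deterministic tag: the same ID always maps to the same tag."""
    return hmac.new(REGISTRY_KEY, id_number.encode(), hashlib.sha256).hexdigest()

enrolled: dict[str, str] = {}  # tag -> enrolled device public key (illustrative)

def enroll(id_number: str, device_pubkey: str) -> bool:
    """Enroll a phone under an ID; fail if that ID already has one."""
    tag = enrollment_tag(id_number)
    if tag in enrolled:
        return False  # this real ID already has a phone attached
    enrolled[tag] = device_pubkey
    return True
```

As the comment notes, this limits but does not prevent abuse: the one legitimately enrolled phone can still be used for botting.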
|
|
|
|
| ▲ | EmbarrassedHelp 10 hours ago | parent | prev | next [-] |
| Don't hardware identifiers also mean that Google can blacklist your device from vast portions of the internet whenever they feel like it? |
| |
| ▲ | frm88 2 hours ago | parent [-] | | Do we know whether this is possible? I'm clueless when it comes to phones, so this is a genuine question. |
|
|
| ▲ | lxgr 10 hours ago | parent | prev | next [-] |
| Only if you need to have the entire application behavior (or at least some trusted confirmation) attested, right? Otherwise, an external USB dongle, tapping a contactless smartcard on a phone etc. could do just fine. |
| |
| ▲ | matthewdgreen 10 hours ago | parent [-] | | Sure, but then you need to receive an attestation from that external dongle, and/or pre-provision it with an identity (like a national ID smartcard.) It might work in places that distribute this hardware, but it's a crummy UX. I expect that the goal of these systems is to make ID verification a requirement for most routine device usage, sadly, and external dongles will crap that up from a UX perspective. There is also the problem that most external hardware is less secure than things like Apple's SEP. (But on the other hand, probably more secure than the long tail of cheap Android phones, which use virtualization rather than real hardware.) | | |
| ▲ | lxgr 10 hours ago | parent [-] | | > then you need to receive an attestation from that external dongle, and/or pre-provision it with an identity (like a national ID card.) That's how it works in Germany: You tap your national ID card (as a citizen) or eID card (as a non-citizen) on any NFC-capable iPhone or Android device. I personally much prefer that solution over one that requires a specifically trusted device. The big gap is trusted user confirmation, though: Users need to see what they sign by tapping their card, and then you're usually back to some form of attestation. Practically, they also completely botched the rollout; literally everyone I know managed to somehow lock themselves out of their card at the first attempted use (assuming they've even bothered to set it up). | | |
| ▲ | matthewdgreen 10 hours ago | parent [-] | | The adtechs want this so they can verify the "human" quality of each user. To do this, they don't want people tapping their government ID on their phones every single time they sign up for Reddit or receive an advertisement. Hence (some derivative of) the ID has to be stored on-device to make the browsing/usage experience seamless. | | |
| ▲ | lxgr 9 hours ago | parent [-] | | Fair enough, I can see why not. To me, it seems like just the right amount of friction, and user expectations can work in favor of privacy here: People will hopefully refuse to tap their ID on their phone for a service where they want to remain completely anonymous, even if the protocol technically might support anonymous assertions. |
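The "derivative of the ID stored on-device" idea from the exchange above can be sketched minimally like this. It is an assumption-laden illustration, not any deployed adtech protocol: a single ID-card tap provisions one device-bound key, and the phone then derives a stable per-site token from it, so no further card taps are needed and two sites cannot correlate their tokens without the device key.

```python
import hashlib
import hmac

# Hypothetical on-device sketch: this key would be provisioned once
# (e.g. after a single ID-card tap) and held in device-bound storage.
device_credential_key = b"key-provisioned-once-via-ID-tap"  # illustrative

def site_pseudonym(site: str) -> str:
    """Stable per-site token: same site sees the same token every visit,
    but tokens for different sites are unlinkable without the key."""
    return hmac.new(device_credential_key, site.encode(), hashlib.sha256).hexdigest()
```

This captures the UX trade-off in the thread: the derived token makes signup seamless, which is exactly the friction the reply above argues users might be better off keeping.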
|
|
|
|
|