Neywiny, 3 hours ago:
That's what gets me too. It feels so easy: just don't store the source data. They usually claim they don't, that it's deleted quickly or never written to disk at all, and it's always a lie. Just don't save it and there's no attack vector. Same thing with plaintext passwords: the number of plaintext-password leaks I've seen in recent years is zero (though I'm sure they still happen), while leaks of stored IDs and PII collected for verification are uncountable.
techjamie, an hour ago:
I have two older coworkers in my workplace alone who had criminals make fraudulent unemployment claims in their names. Fortunately they noticed and stopped it quickly. The information used undoubtedly came from data leaks. Now all these sites are implementing face scans, which would be perfect for such criminals to harvest and feed to AI to create fake videos of you saying whatever suits them. At what point does legislation step in here? Because so far it has only moved in the direction that worsens the issue.
rawgreaze, 2 hours ago:
The trick is that they delete the actual image of the face, but the embeddings (from which the face can be reconstructed anyway) are stored permanently on their servers. You don't own your embeddings, and they never delete them.
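Why deleting the image doesn't help: a face embedding is a fixed-length vector, and matching any future scan of you against the stored vector is a similarity lookup. A toy sketch with synthetic 128-dimensional vectors (a real system would get these from a face-recognition model; the noise scale here is an arbitrary stand-in for photo-to-photo variation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 128-d face embeddings; real ones come from a trained model.
stored_embedding = rng.normal(size=128)   # kept "forever" server-side
# A new scan of the same person: the stored vector plus small variation.
new_scan = stored_embedding + rng.normal(scale=0.05, size=128)
other_person = rng.normal(size=128)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity: 1.0 means identical direction, ~0 means unrelated.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The retained vector still identifies you on any future scan,
# while an unrelated person's vector scores near zero.
assert cosine(stored_embedding, new_scan) > 0.9
assert cosine(stored_embedding, other_person) < 0.5
```

So even with every photo deleted, the stored vector is a durable biometric identifier, and published reconstruction attacks can recover a recognizable face from it.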