avaer | 5 hours ago
This data is going to get leaked in a breach. It will be used against you in a court of law. It will be used for training and (regardless of what anyone says) will be used to fire you once the AI can do your job. And when all of the above happens, Meta will be absolved of any responsibility. I don't understand how it's legal either. I guess we need laws against it yesterday.
|
2ndorderthought | 5 hours ago
It doesn't have to get leaked. They can sell it and use it as another means to identify Internet users. Meta is pretty infamous for identifying, tracking, and understanding user behavior. We are kind of past the point where these companies care at all. If you think the push to add age verification to operating systems is unrelated, I envy you. Something something Cambridge Analytica.
kube-system | 5 hours ago
I think it's their employees who have cause to be concerned here, not Internet users. Meta already has literally billions of people's personal profiles and browsing histories. I don't think screenshots of their SWEs' IDEs are going to be useful for identifying Internet users.
2ndorderthought | 4 hours ago
They could perfect it in-house and then roll it out as a product. The way people type and use a mouse is pretty identifying, especially when coupled with other signals. I do agree the screenshots themselves are less useful for that.
kube-system | 4 hours ago
That doesn't make any sense. 1. Why use their employees' data to fingerprint input? They could do that to a billion-plus of their users instead. 2. Input fingerprinting is decades-old science; there are already production products that do this.
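For anyone curious what "input fingerprinting" means concretely, here's a minimal, hypothetical sketch of keystroke-dynamics matching. All names, keys, and timings are invented for illustration; real production systems use far richer features (dwell times, digraph latencies, mouse curvature) and proper classifiers, not this toy distance:

```python
# Toy keystroke-dynamics sketch: fingerprint a typist from inter-key
# timing statistics alone. Purely illustrative, not any vendor's method.

def flight_times(events):
    """Inter-key intervals (ms) from a list of (key, timestamp_ms) events."""
    return [t2 - t1 for (_, t1), (_, t2) in zip(events, events[1:])]

def profile(events):
    """A toy timing profile: (mean, variance) of the flight times."""
    ft = flight_times(events)
    mean = sum(ft) / len(ft)
    var = sum((x - mean) ** 2 for x in ft) / len(ft)
    return (mean, var)

def distance(p, q):
    """Euclidean distance between two timing profiles."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

# Two sessions from one steady typist vs. one slower, jerkier typist.
alice_a = [("h", 0), ("e", 110), ("l", 215), ("l", 330), ("o", 440)]
alice_b = [("w", 0), ("o", 105), ("r", 220), ("l", 325), ("d", 435)]
bob     = [("h", 0), ("e", 250), ("l", 420), ("l", 700), ("o", 860)]

same_user = distance(profile(alice_a), profile(alice_b))
diff_user = distance(profile(alice_a), profile(bob))
print(same_user < diff_user)  # the same typist's sessions sit closer together
```

Even this crude two-number profile separates the two typists; with thousands of events and dozens of features, the signal gets strong enough to re-identify users across sessions, which is the concern raised upthread.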
|
|
|