▲ lukeschlather 4 hours ago
I really don't understand how this is legal. I guess Facebook may not actually have any compliance requirements in the USA, but time-series screenshots of any SRE's screen are going to contain data that should not be stored by some data vacuum. I know Meta has a reputation for shoddy data-handling practices and US regulations are light compared to Europe's, but how are they planning to secure passwords, encryption keys, PII, etc.? Can employees turn this off at their discretion? What happens if someone forgets to turn it off before they cat the companywide SSH root private key? Even setting legality aside, anyone with access to this training data would have what sounds like an unacceptably broad level of access to company systems, unless Facebook wants to get hacked.
▲ kube-system 3 hours ago
This is legal for most businesses under US law, especially on company devices, and unfortunately it's not unheard of. Compliance with this data is typically handled the same way you'd handle any data-access situation: by restricting access to the screencaps to a specific group of people. Not that I support it, but companies typically don't do this in spite of security concerns; they do it to address security concerns. Of course, what Meta is doing sounds like a different situation. It sounds like they want to build a model that replaces part of their workforce.
▲ avaer 3 hours ago
This data is going to get leaked in a breach. It will be used against you in a court of law. It will be used for training and (regardless of what anyone says) will be used to fire you once the AI can do your job. And when all of the above happens, Meta will be absolved of any responsibility. I don't understand how it's legal either. I guess we needed laws against it yesterday.
▲ numpad0 3 hours ago
All psychological experiments loosely related to the Web became legal by default when A/B tests were normalized after Google started doing them. This is not something that can be covered by blanket waivers. It requires voluntary participation and independent review boards and such, for every single one of those little tests. The cat is out of the bag, but that doesn't make it a non-issue.