▲ SilasX 2 days ago

Yeah, I'm not sure if I'm missing something, and I don't like to defend FB, but... as I understand it, they have a system for using data they receive to target ads. They tell people not to put sensitive data into it. Someone does anyway, and it gets automatically picked up for ad targeting. What are they supposed to do on their end? Even if they apply heuristics for "probably sensitive data we shouldn't use" [1], some of it is still going to get through. The fault should still lie with the entity that passed on the sensitive data.

An analogy: you host an event and want to share photos of it, so you ask people to send in their pics while enforcing the norm "make sure to ask before taking someone's photo." Someone insists that what they sent in complied with that rule when it didn't, and then you share their photos.

[1] Edit: per your other comment, they did indeed have such heuristics: https://news.ycombinator.com/item?id=44901198
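To make the heuristics point concrete, here's the kind of cheap keyword screen I have in mind (Python, purely illustrative -- Meta's actual rules aren't public), and why any such filter will always have misses:

    # Purely illustrative: a naive keyword screen for "probably sensitive"
    # ad-targeting data. Meta's actual heuristics aren't public; this just
    # shows why any such filter inevitably has false negatives.
    SENSITIVE_TERMS = {"hiv", "diagnosis", "pregnancy", "ssn", "prescription"}

    def probably_sensitive(field_value: str) -> bool:
        """Flag values containing a known sensitive keyword."""
        words = set(field_value.lower().split())
        return bool(words & SENSITIVE_TERMS)

    print(probably_sensitive("hiv test result"))        # True: caught
    print(probably_sensitive("screening came back +"))  # False: slips through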
▲ jlarocco 2 days ago

It doesn't work like that, though. Companies don't get to do whatever they want just because they didn't put any safeguards in place to prevent illegal use of the data they collected. The correct answer is to look at the data and verify it's legal to use. I might be sympathetic to a tiny startup facing the increased costs, but it's a cost of doing business like anything else. And Facebook has more than enough resources to put safeguards in place, and they definitely should have known better by now, so they should be punished for not complying.
▲ SilasX 2 days ago

> The correct answer is to look at the data and verify it's legal to use.

So repeal Section 230 and require every site to manually evaluate all uploaded content for legality before doing anything with it? If it's not reasonable to ask sites to do that, it's not reasonable to ask FB to do the same for data you send them. Your position seems to vary with how big or sympathetic the company in question is, which is not very even-handed, and it implicitly concedes the burden of this kind of ask.
▲ const_cast 16 hours ago

Not before doing anything with it, just before processing it for specific business use cases like targeting. Running a forum is fine, and I don't care if someone inputs a fake SSN in a forum post. I DO care if someone inputs a fake SSN on a financial form I provided, and it is actually my responsibility to prevent that. That's what KYC is, and more.
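For a sense of what "prevent that" could look like at the form layer, a minimal sketch (Python, illustrative only -- a bare format check, nowhere near what real KYC verification involves):

    import re

    # Illustrative only: a naive screen for values that *look* like US SSNs,
    # the kind of cheap check a form handler could run before accepting a
    # field for downstream processing. Real KYC verifies identity against
    # authoritative records; this only catches obvious format matches.
    SSN_PATTERN = re.compile(r"^(?!000|666|9\d\d)\d{3}-?(?!00)\d{2}-?(?!0000)\d{4}$")

    def looks_like_ssn(value: str) -> bool:
        """Return True if the value matches the common US SSN format."""
        return bool(SSN_PATTERN.match(value.strip()))

    for submitted in ["123-45-6789", "hello world", "000-12-3456"]:
        print(submitted, "->", looks_like_ssn(submitted))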
▲ mlyle 2 days ago

The problem is, the opposite approach is... "We're scot-free, because we told *wink* people not to sell us sensitive data. We get the benefit from it, and we make it really easy for people to sign up and get paid to give us this data that we 'don't want.'"

Please don't sell me cocaine *snifffffffff*

> The fault should still lie with the entity that passed on the sensitive data.

Some benefits to making the fault lie with both:

* Centralized enforcement with more knowledgeable entities
* Enforcement at a level where the misdeeds can actually be identified and have scale, rather than death by a million cuts
* Preventing the central entity from using deniable proxies and cut-outs to do bad things

This whole notion that we want so much scale, and that scale is an excuse for not paying attention to what you're doing or exercising due diligence, is repugnant. It pushes some costs down but also causes a lot of social harm. If anything, we should expect more ownership and responsibility from those with concentrated power, because they have more ability to cause widescale harm.
▲ gruez 2 days ago

> "We're scot-free, because we told *wink* people not to sell us sensitive data. We get the benefit from it, and we make it really easy for people to sign up and get paid to give us this data that we 'don't want.'"

> Please don't sell me cocaine *snifffffffff*

Maybe there's something in discovery that substantiates this, but as far as I can tell there's no "wink" happening, officially or unofficially. A better analogy would be charging Amazon with drug distribution because some enterprising drug dealer decided to use FBA to ship drugs while Amazon was unaware.
▲ mlyle a day ago

Facebook gets a fiscal benefit when the counterparty to the contract breaks the rule, and so has no incentive to enforce it (rather, the opposite). Unless, of course, Facebook is held accountable for not enforcing it.
▲ bee_rider 2 days ago

I don't like the analogy because "hosting an event" is a fuzzy thing. If you are hosting an event with friends, you might be able to rely on their shared values and the informal nature of the thing to enforce this sort of norm. If you are a business that hosts events and your business model involves photos of them, you should have a professional approach to knowing whether people consented to have their photos shared, depending on the nature of the venue. At this point it is barely an analogy anymore, though.
▲ SilasX 2 days ago

> I don't like the analogy because "hosting an event" is a fuzzy thing. If you are hosting an event with friends, you might be able to rely on their shared values and the informal nature of the thing to enforce this sort of norm.

You can't, though -- not perfectly, anyway. Whatever the informal norms, there are going to be people who violate them, and the fault shouldn't pass to you when you don't know someone is doing that. If anything, the analogy understates how unreasonable this is to FB, since they had an explicit contractual agreement that the other party wouldn't send them sensitive data. And as it stands now, websites aren't expected to pre-filter on some heuristic for "non-consensual user-uploaded photographs" (which would require an authentication chain), just to take them down when informed they're illegal... which FB did (the analog of) here.

> If you are a business that hosts events and your business model involves photos of them, you should have a professional approach to knowing whether people consented to have their photos shared, depending on the nature of the venue.

I'm not sure that's the standard you want to base this argument on, because in most cases the "professional approach" amounts to "if you come here at all, you're consenting to be photographed for publication, take it or leave it lol". FB had a stronger standard than this.
▲ bee_rider 2 days ago

> I'm not sure that's the standard you want to base this argument on, because in most cases the "professional approach" amounts to "if you come here at all, you're consenting to be photographed for publication, take it or leave it lol". FB had a stronger standard than this.

It depends on the event and the nature of the venue. But yes, it is a bad analogy. For one thing, Facebook is not an event with clearly delineated borders; it should naturally be given much higher scrutiny than anything like that.