| ▲ | United857 3 days ago |
| That's rather surprising about the accessing user data bit. When I was at Meta, the quickest way to get fired as an engineer was to access user data/accounts without permission or business reason. Everything was logged/audited down to the database level. Can't imagine that changing and the rules are taught very early on in the onboarding/bootcamp process. |
|
| ▲ | lysace 3 days ago | parent | next [-] |
| That part of the complaint is specifically about 1500 "WhatsApp engineers". Different culture from the blue app, or whatever they call it? |
|
| ▲ | MrDresden 3 days ago | parent | prev | next [-] |
| But the crucial bit to know here would be whether that data was readable at all when it was accessed. Personally, it doesn't matter that there are auditing systems in place if the data is readable in any way, shape or form. |
| |
| ▲ | dijit 3 days ago | parent [-] | | Is that really true? I haven’t touched the cybersecurity side of the industry, especially its policies, for a while… but I do recall that auditing was a stronger motivator than prevention. There were policies around checking the audit logs, not being able to alter the audit logs, and ensuring that nobody really knew exactly what was audited (except for a handful of individuals, of course). I could be wrong, but “observe and report” felt like the strongest security guarantee available inside the policies we followed (PCI DSS Level 1), and prevention was a nice-to-have on top. | | |
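[Editor's note: the "not being able to alter audit logs" property described above is commonly implemented as a hash-chained, append-only log. A minimal illustrative sketch, not a description of any actual PCI DSS deployment:]

```python
import hashlib
import json
import time


class AuditLog:
    """Append-only log; each entry's hash covers the previous entry's
    hash, so any retroactive edit breaks the chain and is detectable."""

    def __init__(self):
        self.entries = []
        self._prev_hash = "0" * 64  # genesis value

    def record(self, actor, action, resource):
        entry = {
            "ts": time.time(),
            "actor": actor,
            "action": action,
            "resource": resource,
            "prev": self._prev_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self._prev_hash = entry["hash"]
        self.entries.append(entry)

    def verify(self):
        """Recompute the whole chain; False means some entry was altered."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if body["prev"] != prev:
                return False
            payload = json.dumps(body, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

Tampering with any recorded field (or deleting an entry) invalidates every later hash, which is why such logs deter quiet after-the-fact cover-ups even when they don't prevent the access itself.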
| ▲ | dns_snek 3 days ago | parent | next [-] | | As a customer I'm angry that businesses get to use "hope and pray" as their primary data protection measure without being forced to disclose it. "Motivators" only work on people who value their job more than the data they can access and I don't believe there's any organization on this planet where this is true for 100% of the employees, 100% of the time. That strategy doesn't help a victim who's being stalked by an employee, who can use your system to find their new home address. They often don't care if they get fired (or worse), so the motivator doesn't work because they aren't behaving rationally to begin with. | | |
| ▲ | blululu 3 days ago | parent [-] | | This really isn’t fair. It is not simply hope and pray: it is a clearly stated/enforced deterrent that anyone who violates the policy will be terminated. You lose your income and seriously harm your future career prospects. This is more or less the same policy that governments hold to bad actors (crime happens but perpetrators will be punished).
I get that it is best to avoid the possibility of such incidents but it is not always practical and a strong punishment mechanism is a reasonable policy in these cases. | | |
| ▲ | dns_snek 3 days ago | parent [-] | | You don't think it's fair to expect a trillion-dollar business to implement effective technical measures to stop rogue (or hacked!) employees from accessing personal information about their users? I'm not talking about small businesses here, but large corporations that have more than enough resources to do better than just auditing. > crime happens but perpetrators will be punished Societies can't prevent crime without draconian measures that stifle all of our freedoms to an extreme degree. Corporations can easily put barriers in place that make it much more difficult (or impossible) to gain unauthorized access to customer information. The entire system is under their control. |
|
| |
| ▲ | MrDresden 3 days ago | parent | prev [-] | | Facebook/Meta has shown time and time again that it can't be trusted with data privacy, full stop. No amount of internal auditing, or externally verified, ISO-stamped compliance theater, will change the fact that as a company it has firebombed each and every bridge that was ever available to it, in my book. If the data has the potential to be misused, that is enough for me to consider it not secure for use. |
|
|
|
| ▲ | imiric 3 days ago | parent | prev | next [-] |
| Whatever Meta says publicly about this topic, and whatever its internal policies may be, is directly contradicted by its behavior. So any attempt to excuse this is nothing but virtue signalling and marketing. The privacy violations and complete disregard for user data are too numerous to mention; there's a Wikipedia article that summarizes the ones we publicly know about. Based on incentives alone, when the company's primary business model is exploiting user data, it's easy to see these events as simple side effects. When the CEO considers users of his products to be "dumb fucks", that culture can only permeate the companies he runs. |
| |
| ▲ | testdelacc1 3 days ago | parent [-] | | There’s a meaningful difference in a company wanting to exploit user data to enrich itself and allowing employees to engage in voyeurism. The latter doesn’t make the company money, and therefore can be penalised at no cost. Your comment talks about incentives, but you haven’t actually made a rational argument tying actual incentives to behaviour. | | |
| ▲ | imiric 3 days ago | parent | next [-] | | My point is that it would be naive to believe that a company whose revenue depends on exploiting user data has internal measures in place to ensure the safe handling of that data. In fact, their actions over the years effectively prove that to not be the case. So whatever they claim publicly, and probably to their low-level employees, is just marketing to cover their asses and minimize the impact to their bottom line. | | |
| ▲ | testdelacc1 2 days ago | parent [-] | | What would be the cost of setting up safeguards and firing employees who cross the line? It feels like an access control system would be fairly easy to build, and firing employees is not a huge deal nowadays. You claim it's all talk, but it's not much more effort to walk the walk. It doesn't hurt profits to do it. |
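[Editor's note: the access-control system imagined above can be sketched in a few lines. Everything here is hypothetical illustration: reads of user data require a pre-approved, time-limited ticket, and every attempt is logged before any data is returned.]

```python
from datetime import datetime, timedelta


class AccessDenied(Exception):
    pass


# Hypothetical stores: approved tickets and the access log.
TICKETS = {}      # ticket_id -> {"engineer", "user_id", "expires"}
ACCESS_LOG = []   # (decision, engineer, user_id, ticket_id)


def grant_ticket(ticket_id, engineer, user_id, ttl_hours=4):
    """An approver pre-authorizes one engineer to read one user, briefly."""
    TICKETS[ticket_id] = {
        "engineer": engineer,
        "user_id": user_id,
        "expires": datetime.utcnow() + timedelta(hours=ttl_hours),
    }


def read_user_record(engineer, user_id, ticket_id, db):
    """Gatekeeper: deny and log unless a matching, unexpired ticket exists."""
    ticket = TICKETS.get(ticket_id)
    if (ticket is None
            or ticket["engineer"] != engineer
            or ticket["user_id"] != user_id
            or ticket["expires"] < datetime.utcnow()):
        ACCESS_LOG.append(("DENIED", engineer, user_id, ticket_id))
        raise AccessDenied(f"{engineer} has no valid ticket for user {user_id}")
    ACCESS_LOG.append(("ALLOWED", engineer, user_id, ticket_id))
    return db[user_id]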
| |
| ▲ | const_cast 3 days ago | parent | prev [-] | | There is actually no technical difference, only a difference in intent. The problem is similar to that of government efforts to ban encryption: if you have a backdoor, everyone has a backdoor. If Meta is collecting huge amounts of user info like candy (they are) and using it for business purposes (they are), then necessarily the employees implementing those business purposes can do that, too. You can make them pinky promise not to. That doesn't do anything. Amazon has a similar problem with stalking via Ring cameras. You allow and store live feeds from every Ring camera? News flash: your employees can see them, too! They're gonna use that to violate your customers! |
|
|
|
| ▲ | mgh2 3 days ago | parent | prev | next [-] |
| Do you have proof? |
| |
| ▲ | YouWhy 3 days ago | parent | next [-] | | To the extent a random person's evidence on the Internet amounts to proof: From people at Facebook circa 2018, I know that end user privacy was addressed at multiple checkpoints -- onboarding, the UI of all systems that could theoretically access PII, war stories about senior people being fired due to them marginally misunderstanding the policy, etc. Note that these friends did not belong to WhatsApp, which was at that time a rather separate suborg. | |
| ▲ | Jenk 3 days ago | parent | prev [-] | | Does Attaullah Baig? | | |
|
|
| ▲ | thunderfork 3 days ago | parent | prev | next [-] |
| [dead] |
|
| ▲ | aprilthird2021 3 days ago | parent | prev [-] |
| Everything is logged, but no one really cares, and the "business reasons" are many and extremely generic. That being said, maybe I'm dumb, but I guess I don't see the huge risk here. I could certainly believe that 1500 employees had basically complete access with little oversight (logging and not caring isn't oversight imo). But how is that a safety risk to users? User information is often very important in the day-to-day work of certain engineering orgs (esp. the large number of eng who are fixing things based on user reports). So that access exists; what's the security risk? That employees will abuse that access? That's always going to be possible I think? |
| |
| ▲ | simmerup 3 days ago | parent [-] | | You really don't see the safety risk? If you have a sister, imagine her being stalked by an employee. If you have crypto, imagine an employee selling your information to a third party. | | |
| ▲ | aprilthird2021 2 days ago | parent [-] | | Yes, but an employee will always be able to do those things: some employees, even a large number of them, need access to user accounts and data for legitimate reasons, and since the only workable approach is to track and punish after the fact (the company can't operate if every user-data access needs human approval in the moment), it's always a risk |
|
|