| ▲ | pavel_lishin 5 days ago |
| Good lord, why would they store those drivers' license images for an instant longer than it took to verify their users? |
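(A minimal sketch of the verify-then-discard flow being suggested here, in Python; the `id_checker` client and the in-memory log are hypothetical stand-ins, not anything Tea actually ran:)

```python
import time
from dataclasses import dataclass

# Everything below is hypothetical, for illustration only.

@dataclass
class CheckResult:
    passed: bool

verification_log: dict[str, dict] = {}  # stand-in for a real datastore

def verify_user(user_id: str, id_image: bytes, id_checker) -> bool:
    """Check a government ID, persist only the outcome, never the image."""
    result: CheckResult = id_checker.verify(id_image)  # third-party ID check
    verification_log[user_id] = {
        "verified": result.passed,
        "checked_at": int(time.time()),
    }
    # id_image goes out of scope here; it is never written to disk or to
    # object storage, so there is nothing left to breach later.
    return result.passed
```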
|
| ▲ | jsrozner 5 days ago | parent | next [-] |
This. Appropriate regulation should make this an offense punishable by a large fine. There is almost no consequence to companies for bad practices. Ideally you'd see fines in the tens of percent of revenue. In egregious cases of gross negligence like this, you should be able to pierce the LLC and recoup from equity holders' personal assets. Alas, if only we had consumer protections.
| |
| ▲ | dannyphantom 5 days ago | parent | next [-] | | Absent broader regulation, we all know that apps like Tea depend HEAVILY on user trust. However, I'm a bit concerned that users either won't fully grasp the severity of this breach or won't care enough, and will end up sticking with the app regardless. A somewhat embarrassing but relevant example: my friends and I used Grindr for years (many still do), and we stayed loyal despite the company's terrible track record on user data, privacy, and security, since there simply wasn't (and still isn't) a viable alternative offering the same service at the expected level. Tea saw a large pop in discussion across social channels over the last few days, so I'm hopeful the breach gets discussed widely enough that users understand just how poorly it reflects on the company and can decide whether to stick around or jump ship. | | | |
| ▲ | ytpete 5 days ago | parent | prev | next [-] | | Or maybe require them to prominently disclose the breach to all current and future users on the app's main screen for some period afterward (a year or two?), sort of like the health-code inspection ratings posted in restaurant windows. That cuts to the issue other comments have pointed out: user trust is really their most important capital, and with short attention spans and short news cycles, it may rebound surprisingly fast. | |
| ▲ | hdgvhicv 5 days ago | parent | prev | next [-] | | Companies, especially American ones, see data as an asset, rather than a liability. The GDPR in Europe attempts to reset this but it’s still an uphill battle | |
| ▲ | dabockster 5 days ago | parent | prev [-] | | > Appropriate regulation should make this an offense punishable by a large fine. And some kind of legal penalty for the engineers as well. Just fining the company does nothing to change the behavior of the people who built it in the first place. | | |
| ▲ | ryandrake 5 days ago | parent | next [-] | | I would at least love to see a public postmortem. What was the developer's rationale for storing extremely personal user data unencrypted, in a publicly facing database? How many layers of management approved storing extremely personal user data unencrypted, in a publicly facing database? What amount of testing was done that failed to figure out that extremely personal user data was stored unencrypted, in a publicly facing database? | | |
| ▲ | ohdeargodno 5 days ago | parent | next [-] | | >What was the developer's rationale for storing extremely personal user data unencrypted, in a publicly facing database? From https://www.teaforwomen.com/about :
>With a proven background leading product development teams at top Bay Area tech companies like Salesforce and Shutterfly, Sean [Cook, creator of Tea] leveraged his expertise building innovative technology to create a game-changing platform that prioritizes women's safety
If you're lucky, a clown vibe coded this trash. If you're unlucky, he paid someone to do so and, despite that proven background leading top Bay Area companies, didn't think to check even once. The CEO is directly responsible for this. | | |
| ▲ | ryandrake 5 days ago | parent [-] | | Wow, so the entire company is a Founder and a Social Media Director?? > With a proven background leading product development teams at top Bay Area tech companies like Salesforce and Shutterfly, Sean [Cook, creator of Tea] leveraged his expertise building innovative technology Blah blah blah blah blah... Just goes to show that you can write all sorts of powerful sounding words about yourself on your About page, but it doesn't say anything about your actual competence. I mean, I don't have a "proven background leading product development teams" but I sure as shit wouldn't make obvious amateur-level mistakes like this if I ever did a startup. |
| |
| ▲ | ytpete 5 days ago | parent | prev [-] | | Requiring a 3rd-party auditor to perform a postmortem whose results are posted publicly might be an interesting regulatory approach to this. Companies get shamed for their mistakes, and the rest of the industry learns which practices are safe and which are dangerous. A bit like NTSB investigation reports, for example. |
| |
| ▲ | chemeng 5 days ago | parent | prev [-] | | In the US, professional certifications (PE, Bar, USMLE, CPA) exist to partially solve this problem when the certification is required to perform work legally. These are typically required in industries where the lives and livelihoods of individuals and the public are at risk from the decisions of the professional. Joining some other comments on this thread: if the stamp of a certified person were required to submit/sign apps with more than 10K or 100K users, and came with personal risk and potential loss of licensure, I imagine things would change quickly. I'm personally not for introducing more gatekeeping and control over software distribution (Apple/Google already have too much power). I'm also not sure how you'd make it work in an international context, but it would be simple to implement for US-based companies if Apple/Google wanted to tackle the problem. I think the broader issue is that we as a society don't see data exposure or bad development practices as real harm. Yet exposing the addresses and personal info of people talking about potentially violent, aggressive, or unsafe people seems very dangerous. |
|
|
|
| ▲ | duxup 5 days ago | parent | prev | next [-] |
| They shouldn't, but it appears to be a gossip app where by design they're also storing photos taken of other people (permission or not) and gossip about them... They don't seem to value privacy. |
|
| ▲ | Proofread0592 5 days ago | parent | prev | next [-] |
I am just making a wild guess with no evidence to back it up, but I have a question and a potential answer: how was this app going to monetize? I'm guessing by selling user data, namely driver's license info mapped to phone numbers.
|
| ▲ | Mountain_Skies 5 days ago | parent | prev | next [-] |
| According to another media report, the approval queue for new account verification was seventeen hours long. It's possible what the 4channers got was that approval queue. |
| |
|
| ▲ | hbn 5 days ago | parent | prev | next [-] |
| This is what vibe coding gets us! |
| |
| ▲ | ytpete 5 days ago | parent | next [-] | | Not a fan of the "vibe coding" hype, but is there any evidence that this app was built that way? | |
| ▲ | GoatInGrey 5 days ago | parent | prev [-] | | The cynical part of me feels like certain employees had uncontrolled access to the user data. There would be a morbid irony in the idea of a tool marketed as increasing safety for women actually being a honeypot operation to accumulate very sensitive personal information on those very women. | | |
| ▲ | throwawayq3423 5 days ago | parent [-] | | Honestly, it doesn't matter that they didn't have that additional nefarious intent; their incompetence and carelessness led to the same result. |
|
|
|
| ▲ | DanHulton 5 days ago | parent | prev | next [-] |
| We really need to get used to treating PII as poison, storing as little of it as possible, and getting rid of it as soon as we're done with it. |
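(One way to make "getting rid of it" automatic rather than aspirational: a scheduled purge job with a hard retention window. A minimal sketch; the `store` interface and the 24-hour window are assumptions:)

```python
import time

RETENTION_SECONDS = 24 * 60 * 60  # assumed 24-hour review window

def purge_expired_pii(store) -> int:
    """Delete every PII record older than the retention window.

    `store` is a hypothetical datastore exposing list_pii_records()
    and delete(); run this from a scheduled job so deletion never
    depends on someone remembering to do it.
    """
    cutoff = time.time() - RETENTION_SECONDS
    deleted = 0
    for record in store.list_pii_records():
        if record.created_at < cutoff:
            store.delete(record.key)
            deleted += 1
    return deleted
```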
| |
| ▲ | sigwinch 4 days ago | parent [-] | | I think we can treat it as currency. All nine digits of my SSN, for example: I'll allow you to store those if in return I get to store the name of the CEO's boat. |
|
|
| ▲ | lm28469 4 days ago | parent | prev [-] |
People keep bitching about the EU's GDPR, but most of it is about sensible things like not storing sensitive data forever just because you can. Nothing happens to these companies if there are no laws to hold them accountable; most companies I've worked with store everything forever because storage is cheap and you never know what might become useful, or what you might be able to sell to third parties later.
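(For what it's worth, "don't store it forever" can also be enforced by the storage layer itself. A sketch using S3 lifecycle rules via boto3; the bucket and prefix names are made up:)

```python
import boto3

s3 = boto3.client("s3")

# Expire ID uploads automatically one day after they land, so retention
# is a property of the bucket rather than a cleanup task someone forgets.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-id-uploads",          # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-id-images",
                "Filter": {"Prefix": "id-verification/"},
                "Status": "Enabled",
                "Expiration": {"Days": 1},
            }
        ]
    },
)
```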