| ▲ | 999900000999 a day ago |
| Users should always avoid sharing sensitive data. A lot of AI products straight up have plain text logs available for everyone at the company to view. |
|
| ▲ | ameliaquining a day ago | parent | next [-] |
| Which ones? Do you just mean tiny startups and side projects and the like or is this a problem that major model providers have? |
|
| ▲ | pyman a day ago | parent | prev [-] |
| It's not just about sensitive data like passwords, contracts, or IP. It's also about the personal conversations people have with ChatGPT. Some are depressed, some are dealing with bullying, others are trying to figure out how to come out to their parents. For them, this isn't just sensitive, it's life-changing if it gets leaked. It's like Meta leaking their WhatsApp messages. I really hope they fix this bug and start taking security more seriously. Trust is everything. |

| ▲ | milkshakes a day ago | parent [-] |
| maybe you should stop trusting random people on the internet making extraordinary claims without proof then? |

| ▲ | baby_souffle a day ago | parent | next [-] |
| Isn't "assume vulnerable" the only prudent thing to do here? |

| ▲ | milkshakes a day ago | parent | next [-] |
| everything is vulnerable. the question is, has this researcher demonstrated that they have discovered and successfully exploited such a vulnerability? what exactly in this post makes you believe that this is the case? |

| ▲ | refulgentis a day ago | parent | prev [-] |
| No? Yes? Mu? After some hemming and hawing, my most cromulent thought is: having good security posture isn't synonymous with accepting every claim you get from the firehose. |

| ▲ | 999900000999 a day ago | parent | prev [-] |
| https://arstechnica.com/tech-policy/2025/07/nyt-to-start-sea... |
| ▲ | ameliaquining a day ago | parent [-] |
| This is going to be subject to the legal discovery process with the usual safeguards to prevent leaks; in particular, the judge will directly supervise the decision of who needs access to these logs, and if someone discloses information derived from them for an improper purpose, there's a very good chance they'll go to jail for contempt of court, which is much more stringent than you can usually expect for data privacy. You can still quite reasonably be against it, but you cannot reasonably call it "plain text logs available for everyone at the company to view". |
|