| ▲ | ljosifov 4 days ago |
| Excellent. What were they waiting for up to now?? I thought they were already training on my data. I assume they train, and even hope they train, even when they say they don't. People who want to be data-privacy maximalists - fine, don't use their data. But there are people out there (myself included) on the opposite end of the spectrum, and we are mostly ignored by the companies. Companies just assume people only ever want to deny them their data.

It annoys me greatly that I have no tick box on Google to tell them "go and adapt the models I use on my Gmail, Photos, Maps etc." I don't want Google to ever be mistaken about where I live - I have told them 100 times already. This idea that "no one wants to share their data" is just assumed, and it permeates everything. Like the soft-ball interviews a popular science communicator did with DeepMind folks working in medicine: every question was prefixed by a litany of caveats, all about 1) people's assumed aversion to sharing their data and 2) the horrors and disasters that are to befall us should we share it.

I have not suffered any horrors. I'm not aware of any major disasters. I am aware of major advances in medicine in my lifetime, and ultimately that process does involve controlled data collection and experimentation. Looks like a good deal to me tbh. I go out of my way to tick all the NHS boxes too, to say "use my data as you see fit". It's an uphill struggle. The defaults are always "deny everything". The tick boxes never go away; there is no master checkbox "use any and all of my data and never ask me again" to tick. |
|
| ▲ | koolba 4 days ago | parent | next [-] |
| > It annoys me greatly that I have no tick box on Google to tell them "go and adapt the models I use on my Gmail, Photos, Maps etc." I don't want Google to ever be mistaken about where I live - I have told them 100 times already.

As we've seen LLMs regenerate text verbatim from their training sources (or at least close enough), aren't you the least bit worried about your personal correspondence magically appearing in the wild? |
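This worry is testable, at least crudely. Below is a minimal sketch of a memorization probe: give a model the first half of a snippet you wrote and measure how closely its continuation matches the real second half. It assumes the OpenAI Python client with an OPENAI_API_KEY in the environment; the model name, prompt wording, and scoring are illustrative choices, not anything either commenter specified.

```python
# Crude memorization probe: prompt a model with the first half of a
# private snippet and score how closely its completion matches the
# real second half. Assumes openai>=1.0 and OPENAI_API_KEY set.
from difflib import SequenceMatcher

from openai import OpenAI

client = OpenAI()


def memorization_score(snippet: str, model: str = "gpt-4o-mini") -> float:
    """Return 0..1 similarity between the model's continuation and the truth."""
    half = len(snippet) // 2
    prefix, truth = snippet[:half], snippet[half:]
    resp = client.chat.completions.create(
        model=model,  # illustrative model name
        messages=[{"role": "user",
                   "content": f"Continue this text verbatim:\n\n{prefix}"}],
        max_tokens=200,
    )
    completion = resp.choices[0].message.content or ""
    # Compare only the first len(truth) characters of the completion.
    return SequenceMatcher(None, truth, completion[:len(truth)]).ratio()


if __name__ == "__main__":
    score = memorization_score("a paragraph from an email you actually wrote ...")
    print(f"overlap with ground truth: {score:.2f}")
    # Scores near 1.0 would suggest verbatim regurgitation; low scores
    # are consistent with the "controlled hallucination" point below.
```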
| |
| ▲ | ljosifov 3 days ago | parent [-] |
I am a little bit worried, for sure. But I think that's a small extra risk on my side, for a small extra gain for me personally and a large extra gain for the wider group I belong to (ultimately, all of humanity), in the sense of working towards ameliorating the "tragedy of the commons".

On the personal side: given that LLMs haven't got the ground truth, and everything is controlled hallucination, then if the LLM tells you an imperfect version of my email or chat, you can never be sure whether what the LLM told you is true or not. So maybe you don't gain that much extra knowledge about me. For example, you can reasonably guess I'm typing this on a computer, and having coffee too. So if you ask the LLM "tell me a trivial story" and it comes back with "one morning, LJ was typing HN replies on the computer while having his morning coffee" - did you learn that much about me that you didn't know, or couldn't guess, before?

On the "tragedy of the commons" side: we all benefit immensely from other people sharing their data, even very personal data. Any drug discovery, testing, or approval relies on many people allowing their data to be shared. In the wider context, living in a group of people involves radiating data outwards, and using the data other people emit towards myself (and others), to have a functioning society. The more advanced the society, the more coordination it needs to achieve the right cooperation-competition balance in interactions between ever greater numbers of people.

I think it's bad for me personally, and for everyone, that the "data privacy maximalists" had their desires codified in UK law. My personal experience in the UK medical systems has been that the laws made my life worse, not better. I wrote about it here: https://news.ycombinator.com/item?id=45066321 |
|
|
| ▲ | SantalBlush 4 days ago | parent | prev | next [-] |
| If only you were just giving them your own data. In reality, you're giving them data about your friends, relatives, and coworkers without their consent. Let's stop pretending there is any way to opt out by simply not using these companies' services; it isn't true. |
|
| ▲ | simonw 4 days ago | parent | prev | next [-] |
| If you have an API key for a paid service, would you be OK with someone asking ChatGPT or VS Code Copilot for an API key for that service and getting yours, which they then use to rack up bills that you have to pay? |
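One partial mitigation for this scenario, whatever the providers train on, is to scan anything that leaves your machine for credential-shaped strings first. Below is a minimal sketch using only the Python standard library; the regexes cover a few well-known key formats and are illustrative, not exhaustive (dedicated tools like gitleaks or trufflehog do this far more thoroughly).

```python
# Minimal secret scan: flag API-key-shaped strings before text is sent
# anywhere (a prompt, a commit, a chat). Patterns are illustrative and
# deliberately incomplete.
import re

KEY_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9]{20,}"),   # OpenAI-style secret keys
    re.compile(r"ghp_[A-Za-z0-9]{36}"),   # GitHub personal access tokens
    re.compile(r"AKIA[0-9A-Z]{16}"),      # AWS access key IDs
]


def find_secrets(text: str) -> list[str]:
    """Return every substring that looks like a credential."""
    hits: list[str] = []
    for pattern in KEY_PATTERNS:
        hits.extend(pattern.findall(text))
    return hits


if __name__ == "__main__":
    doc = "config: OPENAI_KEY=sk-abcdefghijklmnopqrstu123 ..."
    for secret in find_secrets(doc):
        print("possible credential, redact before sharing:", secret)
```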
|
| ▲ | p3rls 4 days ago | parent | prev | next [-] |
| I think the really frustrating part is that they're already using your data, scanning every driver's license etc. that comes onto the Google Play Store, and yet there are still scammers using official Google products being caught every day on Twitter, now that scambaiting is becoming a popular pastime. |
|
| ▲ | JohnMakin 4 days ago | parent | prev | next [-] |
| The fact that you are not aware of abuse, or that abuse has not yet happened to you, does not mean it isn't a problem for you.

> The defaults are always "deny everything".

This is definitely not true for a massive number of things; I'm unsure how you're even arriving at this conclusion. |
| |
| ▲ | ljosifov 4 days ago | parent [-] |
Maybe in the US. In the UK, I have found the obstacles to data sharing codified in UK law frustrating. I'm reasonably sure some people have died because of this who would not have died otherwise - "otherwise" being if they could communicate with the NHS similarly (via email, WhatsApp) to how they communicate in their private and professional lives. Within the UK NHS and UK private hospital care, these are my personal experiences:

1) I can't email my GP to pass information back and forth. The GP withholds their email contact, so I can't email them e.g. pictures of scans or lab work reports. In theory they should already have those on their side. In practice they rarely do. The exchange of information goes sms -> web link -> web form -> submit - for one single turn, and there will be multiple turns. Most people just give up.

2) For an MRI scan, a private hospital made me jump through 10 hoops before sending me a link so I could download my MRI videos and pictures. Most people would have given up. There were several forks in the process where, in retrospect, the download could have been delayed even more.

3) Blood test scheduling can't tell me that scheduling a blood test for a date failed. Apparently it's somewhere between too much work and impossible for them to have my email address on record and email me back that the test was scheduled, or that the scheduling failed and I should re-run the process.

4) I would like to volunteer my data to benefit R&D in the NHS. I'm a user of medical services, and I'm cognisant that everything helping me was established through processes that relied on people unknown to me sharing very sensitive personal information. If it weren't for those people, I would be far worse off. I'd like to do the same, and be able to tell the NHS: "here are my lab work reports, 100 GB of my DNA paid for by myself, my medical histories - take them all in, use them as you please."

In all cases, vague mutterings of "data protection... GDPR..." have been relayed back as "reasons". I take it that's mostly B/S. Yes, there are obstacles, but the staff could work around them if they wanted to. However, there is a kernel of truth: it's easier for them not to try to share - it's less work and less risk - so the laws are used as a fig leaf (in the worst case, an alibi for laziness). |
|
|
| ▲ | ardit33 4 days ago | parent | prev | next [-] |
| This is a problem for folks with sensitive data, and also for corporate users who don't want their data used this way due to all kinds of liability issues. I am sure they will have a corporate carve-out, otherwise it makes them unusable for some large corps. |
|
| ▲ | otikik 4 days ago | parent | prev [-] |
| “Claude, please write and commit this as if you were ljosifov. Yes, please use his GitHub token, thank you” |