Australian bank gives out customer phone to another customer by asking ChatGPT (hey.paris)
9 points by parisidau a day ago | 6 comments
red369 a day ago | parent | next [-]
I think this is terrible, but I suppose it's less bad if ChatGPT didn't get the phone number from the bank's files/data. If ChatGPT just provided it from public info or the training set, am I wrong to think that isn't as bad? For clarity: I don't think it's a good idea to use LLMs if you care whether the answer is right or wrong, so this is a terrible use case either way.
femto a day ago | parent | prev | next [-]
Good luck getting any recompense. CBA disclosed your phone number. I'm aware of a company that disclosed 3,000 high-resolution colour passport scans, along with all personal details, from a travel booking website. About half of the records were for school children. No one was notified that their data had been leaked. Diddly squat happened to that company.
legacynl a day ago | parent | prev [-]
Wow, this is terrible.