red369 2 days ago
I think this is terrible, but I suppose it's less bad if ChatGPT didn't get the phone number from the bank's files or data. If ChatGPT just produced it from public info or its training set, am I wrong to think that isn't as bad? For clarity: I don't think it's a good idea to use LLMs if you care whether the answer is right or wrong, so this is a terrible use case.
parisidau 2 days ago | parent
I would argue the source of the information doesn't matter. The bank disclosed a phone number of a customer to another customer.