IncreasePosts 3 days ago

You're interacting with hundreds of systems whose job it is to simply transit your information. Privacy there makes sense. However, you're also talking to someone on the other end of all those systems. Do you have a right to force the other person to keep your conversation private?

kouteiheika 3 days ago | parent | next [-]

An AI chatbot is not a person, and you're not talking to anyone; you're querying a (fancy) automated system. I fundamentally believe those queries should be guaranteed private.

Here's a thought experiment: you're a gay person living in a country where being gay is illegal and carries the death penalty. You use ChatGPT in a way that makes your sexuality apparent; should OpenAI be allowed to share this query with anyone? Should they be allowed to store it? What if it inadvertently leaks (which has happened before!), or their database gets hacked and dumped, and now the morality police of your country are combing through it looking for criminals like you?

Privacy is a fundamental right of every human being; I will gladly die on this hill.

nine_k 3 days ago | parent [-]

If you are talking to a remote entity not controlled by you, you should assume that your communication is somehow accessible to whoever has internal access at that other entity. That may well be not the entity's legitimate owners, but law-breakers or law enforcement. So, no, not private by default, but only by goodwill and coincidence.

There's a reason why e.g. banks want to have all critical systems on premises, under their physical control.

yndoendo 3 days ago | parent | next [-]

How would consuming static information from a book differ from consuming it through a dynamic system that is book-esque? You are using ML to help quickly categorize and assimilate information that spans multiple books, magazines, or other written media. [0] [1]

Why do people speak of ML/AI as an entity when it is a tool like a microwave oven? It is a tool designed to give answers, even wrong ones when the question is nonsensical.

[0] https://www.ala.org/advocacy/advleg/federallegislation/theus...

[1] https://www.ala.org/advocacy/intfreedom/statementspols/other...

nine_k 3 days ago | parent [-]

The difference is simple: whether there's another party present while you're doing it. If yes, assume that the other party has access to the information that passed through it. A librarian would know which books you asked for. A reading assistant would know what you wanted read or summarized. Your microwave might have an idea of what you're going to eat, if you run the "sensor heating" program.

The consumption is "static", in your terms, if you read a paper book alone, or if you access a publicly available web page without running any scripts or sending any cookies.

yndoendo 3 days ago | parent [-]

Sorry, but there is always a third party involved in a library. The librarians are the ones who select which books to have on hand for consumption; the same goes for a book store or any other provider of books.

If a person goes to a library and reads without a check-out record, one must assume any book in the collection could have been consumed. Only a solid record of a book being checked out creates a defined moment, and even that is still anchored in confidentiality between the parties.

Unless that microwave sensor requires external communication, it is a closed system that does not communicate any information about what item was heated. The third party would be the company the meal was purchased from.

A well-designed _smart microwave_ would perform batch updates, pulling in a collection of information to drive its automated operation. You never know when there could be an Internet outage, or the tool might be placed where external communication is not an option.

A poorly designed system would require back-and-forth communication. Yet even that would be no different from a chef knowing what you ordered, with limited information about you. Those systems have an inherent anonymity.

It is the processing record that can be exploited; a good organization would require a warrant before disclosing it, or purge the information when it is no longer needed. Cash payment also improves anonymity in that style of system, preventing personal information from leaking to anyone.

Why shouldn't the norms of a static book system like a library be applied to any ML model, since they are performing the same task: providing access to information in a collection? The system is poorly designed if confidentiality is not adhered to by all parties.

It sounds like ML corporations want to make you the product instead of being the product you use. This is why I only respect models that are open by design, from the bottom up, and run locally.

kouteiheika 3 days ago | parent | prev | next [-]

I do assume that my communications are not private, but that doesn't change the fact that these companies should be held to a higher standard, and those rights should be codified into law.

BriggyDwiggs42 3 days ago | parent | prev [-]

That’s a rational and cautious assumption, but there should also be regulations, placed upon companies large enough to shoulder the burden, that render it less necessary.

nine_k 3 days ago | parent [-]

The bodies that are in a position to effect such regulations are also the bodies that are interested in looking at your (yes, your) private communication. And no, formally being a liberal democracy helps little; see the PATRIOT Act, Chat Control, etc.

The only secure position for a company (provided that the company is not interested in reading your communication) is the position of a blind carrier that cannot decrypt what you say; Mullvad VPN, for example, has demonstrated that this works. I don't think an LLM hosting company can use such an approach, so...
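
A minimal sketch of the blind-carrier idea, assuming client-side symmetric encryption and a hypothetical carrier_relay function (not Mullvad's actual implementation): the carrier only ever handles ciphertext, so nothing it stores, logs, or leaks reveals the content.

    # Sketch only: a "blind carrier" relays bytes it cannot read.
    # Assumes the `cryptography` package; carrier_relay() is hypothetical.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # lives only on the endpoints, never on the carrier
    endpoint = Fernet(key)

    def carrier_relay(ciphertext: bytes) -> bytes:
        # The carrier can store, forward, or even log this blob,
        # but without the key it learns nothing about the content.
        return ciphertext

    message = b"what I typed into my client"
    received = carrier_relay(endpoint.encrypt(message))
    assert endpoint.decrypt(received) == message

An LLM host can't take this position: the model needs the plaintext to compute a response, so the operator necessarily sees what you send.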

BriggyDwiggs42 2 days ago | parent [-]

Yeah, I agree.

gspencley 3 days ago | parent | prev | next [-]

> Do you have a right to force the other person to keep your conversation private?

It depends. If you're speaking to a doctor or a lawyer, yes, by law they are bound to keep your conversation strictly confidential except in some very narrow circumstances.

But it goes beyond those two examples. If I have an NDA with the person I am speaking with on the other end of the line, yes I have the "right" to "force" the other person to keep our conversation private given that we have a contractual agreement to do so.

As far as OpenAI goes, I'm of the opinion that OpenAI - as well as most other businesses - has the right to set the terms by which it sells or offers services to the public. That means if they wanted a policy of "all chats are public", that would be within their rights to impose, as far as I'm concerned. It's their creation. Their business. I don't believe people are entitled to dictate terms to them, legal restrictions notwithstanding.

But insofar as they promise that chats are private, that promise becomes a contract at the time of transaction. If you give them money (consideration) with the impression that your chats with their LLM are private because they communicated that, then they are contractually bound to honour the terms of that transaction: the terms they subjected themselves to when advertising their services, or in the EULA and/or TOS presented at the time of transaction.

sophacles 3 days ago | parent | prev | next [-]

In many circumstances yes.

When I'm talking to my doctor, or lawyer, or bank. When there's a signed NDA. And so on. There are circumstances where the other person can be (and is) obliged to maintain privacy.

One of those is interacting with an AI system where the terms of service guarantee privacy.

IncreasePosts 3 days ago | parent [-]

Yes, but there are also times when other factors are more important than privacy. If you tell your doctor you're going to go home and kill your wife, they are ethically bound to report you to the police, despite your right to doctor-patient confidentiality. That is similar to what OpenAI says here about "imminent harm".

citizenpaul 3 days ago | parent | prev [-]

> Do you have a right to force the other person to keep your conversation private?

In most of the USA that already is the law.