alphazard 5 days ago

I expect we will continue to see the big AI companies pushing for privacy protections. Sam Altman made a comparison to attorney-client privilege in an interview. There is a significant holdout against using these things as fully trusted personal assistants or personal knowledge bases because of the lack of privacy.

The only real solution is locally running models, but that goes against the business model. So instead they will seek regulation to create privacy by fiat. Fiat privacy still has all the same problems as telling your therapist that you killed someone, or keeping your wallet keys printed out on paper in a safe: it's dependent on regulations and on definitions of the greater good that you can't control.

dataviz1000 5 days ago | parent | next [-]

> but that goes against the business model.

Not if you are selling hardware. If I were Apple, Dell, or Lenovo, I would be pushing for locally running models, supporting Hugging Face, while developing at full speed systems that can do inference locally.
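
For illustration, a minimal sketch of what local inference looks like today with Hugging Face's transformers library (the model name is just an example; any small, locally downloadable model works):

    # Downloads the weights once; all inference then runs on local hardware.
    from transformers import pipeline

    generate = pipeline("text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")
    out = generate("Local models matter because", max_new_tokens=50)
    print(out[0]["generated_text"])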

alphazard 5 days ago | parent | next [-]

Local models do make a lot of sense (especially for Apple), but it's tough to figure out a business model that would cause a company like OpenAI to distribute weights they worked so hard to train.

Getting customers to pay for the weights would be entirely dependent on copyright law, which OpenAI already has a complicated relationship with. Quite the needle to thread: it's okay for us to ingest and regurgitate data with total disregard for how it's licensed, but under no circumstances can anyone share these weights.

ronsor 5 days ago | parent | next [-]

> Getting customers to pay for the weights would be entirely dependent on copyright law

That's assuming weights are even covered by copyright law, and I have a feeling they are not in the US, since they aren't really a "work of authorship".

Juliate 5 days ago | parent | prev | next [-]

> it's tough to figure out a business model that would cause a company like OpenAI to distribute weights they worked so hard to train.

It sounds a lot like the browser wars, where the winning strategy was to aggressively push one's platform for free (which was rather uncommon then), with the aim of market dominance and later benefits.

dataviz1000 5 days ago | parent | prev | next [-]

> Getting customers to pay for the weights

Provide the weights as an add-on for customers who pay for hardware to run them. The customers will be paying for weights + hardware. I think it is the same model as buying the hardware and get the macOS for free. Apple spends $35B a year in R&D. Training GPT5 cost ~$500M. It is a nothing burger for Apple to create a model that runs locally on their hardware.
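
As a back-of-the-envelope check on those figures (both numbers are the estimates from this comment, not audited ones):

    apple_rd_per_year = 35e9    # ~$35B annual R&D
    gpt5_training_cost = 500e6  # ~$500M training estimate
    print(f"{gpt5_training_cost / apple_rd_per_year:.1%}")  # ~1.4%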

novok 5 days ago | parent [-]

That is functionally much harder to pull off than with software, because model weights are essentially more like raw media files than code, and media files are much easier to convert to another runtime.
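
A rough illustration of why: weights are just named tensors, so re-serializing them for another runtime is a few lines (file paths here are placeholders):

    import torch
    from safetensors.torch import load_file

    weights = load_file("model.safetensors")  # dict of name -> tensor
    torch.save(weights, "model.pt")           # same tensors, different container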

firesteelrain 5 days ago | parent [-]

Codeium had an airgap solution until they were in talks with OpenAI and pulled it back. It worked on-prem, and they even told you what hardware to buy.

novok 5 days ago | parent [-]

You can still extract the model weights from an on-prem machine. It has all the same problems as media DRM, and large enterprises do not accept unknown recording and surveillance that they cannot control.

firesteelrain 5 days ago | parent [-]

I am not sure what you mean. I work at a large enterprise; we did not unleash it on our baseline, and it couldn't phone home, but it was really good for writing unit tests. That sped things up for us.

esseph 5 days ago | parent | prev [-]

There is no moat.

Wowfunhappy 5 days ago | parent | prev | next [-]

Notably, Apple is pushing for local models, albeit not open ones and with very limited success.

utyop22 5 days ago | parent | prev | next [-]

Apple will eventually figure it out. Remember, the iPhone took five years to develop; they don't rush this stuff.

kjkjadksj 5 days ago | parent | prev [-]

Why do that when OpenAI might pay you billions to be the primary AI model on your system?

username332211 5 days ago | parent | prev | next [-]

> Fiat privacy still has all the same problems as telling your therapist that you killed someone, or keeping your wallet keys printed out on paper in a safe.

They could take a lesson from churches. If LLM providers and their employees were willing to commit to privacy, and to sacrifice their wealth and liberty for the sake of their clients, society would yield.

I remember seeing a video of a certain Richard Masten, a Crime Stoppers coordinator, destroying the information he had on a confidential source right in the courtroom, under threat of a contempt charge, and getting away with a slap on the wrist.

In decent societies, standing up for principles does work.

socalgal2 5 days ago | parent | prev | next [-]

> Sam Altman made a comparison to attorney-client privilege in an interview

Isn't his company, OpenAI, the one that said they monitor all communications and will report anyone they think is a threat to the government?

https://openai.com/index/helping-people-when-they-need-it-mo...

> If human reviewers determine that a case involves an imminent threat of serious physical harm to others, we may refer it to law enforcement.

I get that they are trying to do something positive overall. At the same time, I don't want corp-owned AI that's monitoring everything I ask it.

IIRC it is illegal for the phone company to monitor and censor communications. The government can ask a judge for permission for police to monitor a line, but otherwise it's illegal. Now, with AI transcription, it won't be long until a company can monitor every call, transcribe it, and feed it to an LLM to judge and decide which lists you should be on.
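
In hypothetical form, the pipeline being described is only a few lines (transcribe and judge are stand-ins, not any real provider's API):

    def transcribe(audio: bytes) -> str:
        return "..."  # speech-to-text output would go here

    def judge(transcript: str) -> list[str]:
        # e.g. prompt an LLM: "Which watchlists does this speaker belong on?"
        return []

    def monitor_call(audio: bytes) -> list[str]:
        return judge(transcribe(audio))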

felipeerias 5 days ago | parent [-]

But there isn’t a person on the other side whom you are reaching through their service. The only communication is between you and the OpenAI server that takes in your input message and produces an output.

I understand that people assume LLMs are private, but there isn't any guarantee that is the case, especially when law enforcement comes knocking.

floundy 5 days ago | parent | prev | next [-]

The tech companies have wised up: they'll continue to speak idyllically about what "should be," and maybe even deploy watered-down versions of it, but really they're just buying time until they can get even bigger and capture more power before the government even thinks of stepping in. The nice thing about being first to market is that you can abuse the market, abuse customers, and pay out a few trivial class-action lawsuits along the way; by the time regulations finally lag along, you've got hundreds of billions worth of market power behind you to bribe the politicians. The US government won't do anything about AI companies for at least five years, and when it does, OpenAI, Google, and Meta will all be sitting at the table holding the pen.

sensanaty 4 days ago | parent | prev | next [-]

Sam "Harvest Your Biometric Data For A Scamcoin" Altman? Real trustworthy bloke, I'm sure. We should all buy some worldcoin by giving him our eye scans, in the name of privacy of course.

Melting_Harps 4 days ago | parent [-]

> Real trustworthy bloke

You do realize he became a kingmaker by positioning himself in YC, which owns and operates Hacker News. What makes you think you are not being tracked here, and that your messages are not being used to train his LLM?

As for him being a conman: if you haven't realized that most of the SV elite this place worships are conmen (see the Trump dinner this week), with clear ties to the intelligence agencies (see the newly appointed generals who are C-suite at several Mag 7 corps), who will placate a fascist in order to push their agendas, then you simply aren't paying attention.

His scam coin is the most insipid item on his rap sheet at this point, and I say this as a person who has seen all kinds of grifting in that space.

j45 4 days ago | parent | prev | next [-]

I wonder if the push for privacy protections comes from already seeing such privacy intrusions on the horizon.

mathgradthrow 5 days ago | parent | prev | next [-]

Anyone can just kill you whenever they want. Security cannot be granted by cryptography, only by secrecy.

alfalfasprout 5 days ago | parent | prev [-]

You really think that Altman won’t turn around and start selling ads once enough people are on OpenAI’s “trusted” platform?