ramoz 3 hours ago

Unfortunately, verifiable privacy is not physically possible on today's MacBooks. Don't let a nice presentation fool you.

Apple Silicon has a Secure Enclave, but not a publicly attestable SGX/TDX/SEV-style enclave that can run arbitrary code, so these claims amount to OS hardening, not verifiable confidential execution.
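To make the distinction concrete, here is a toy sketch of what "verifiable" means in a TEE: the client checks a hardware-signed measurement of the exact code that will touch its data before sending anything. This is purely illustrative (real TEEs use certificate chains rooted in the vendor, not a shared HMAC key, and the function names are made up); the point is that the Secure Enclave offers no such check to third parties for arbitrary workloads.

```python
# Illustrative sketch of remote attestation (NOT a real TEE API).
# A "quote" is a signed measurement (hash) of the code image loaded
# into the enclave; the client only trusts the machine if the
# measurement matches code it has audited.
import hashlib
import hmac

def verify_attestation(quote: bytes, signature: bytes,
                       signing_key: bytes,
                       expected_measurement: bytes) -> bool:
    # 1. Check the quote was signed by the hardware root of trust.
    #    (Stand-in: HMAC with a shared key; real hardware uses
    #    asymmetric keys and a vendor certificate chain.)
    expected_sig = hmac.new(signing_key, quote, hashlib.sha256).digest()
    if not hmac.compare_digest(expected_sig, signature):
        return False
    # 2. Check the measured code is exactly the code we audited.
    return hmac.compare_digest(quote, expected_measurement)

# Toy usage: the measurement is a hash of the enclave's code image.
key = b"hardware-root-key"                                  # hypothetical
measurement = hashlib.sha256(b"inference-server-v1.0").digest()
sig = hmac.new(key, measurement, hashlib.sha256).digest()
print(verify_attestation(measurement, sig, key, measurement))   # True
print(verify_attestation(measurement, sig, key,
      hashlib.sha256(b"tampered-server").digest()))             # False
```

OS hardening gives you none of this: you are trusting the vendor's claims about the software stack, with no way to independently check what actually runs.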

It would be nice if it were possible. There's a lot of cool innovations possible beyond privacy.

znnajdla 2 hours ago | parent | next [-]

As if you get privacy with the inference providers available today? I have more trust in a randomly selected machine on a decentralized network not being compromised than in a centralized provider like OpenAI pinky promising not to read your chats.

ramoz 2 hours ago | parent [-]

Inference providers don't claim private inference. They are, however, bound by security and legal compliance obligations.

You have no guarantees over any random laptop connected somewhere across the world.

rz2k 13 minutes ago | parent [-]

Kagi Assistant has a five-point scale estimating the privacy levels of models hosted on different providers.

https://help.kagi.com/kagi/ai/llms-privacy.html

geon 3 hours ago | parent | prev [-]

Every hardware key will be broken if there is enough incentive to do so. Their claims read like pure hubris.

znnajdla 2 hours ago | parent [-]

Who cares about AI privacy? Most people don’t. If you do, run locally.