tonygiorgio 11 hours ago
> Although PCC is currently unique to Apple, we can hope that other privacy-focused services will soon crib the idea.

IMHO, Apple's PCC is a step in the right direction given the general AI privacy nightmare we're in today. It's not a perfect system, since it's not fully transparent and auditable, and I don't like their new opt-out photo scanning feature running on PCC, but there's a lot to be inspired by. My startup is going down this path ourselves, building on top of AWS Nitro and Nvidia Confidential Compute to provide end-to-end encryption from the AI user to the model running on the enclave side of an H100. It's not widely known that you can do this with H100s, but I really want to see more of it in the next few years.
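For anyone curious what "end-to-end encryption to a model in an enclave" means at the protocol level, here's a toy sketch of the flow: the enclave produces an attestation binding its ephemeral key to a measurement of the code it runs, the client verifies that before encrypting anything, and only then sends the prompt. This is purely illustrative; real deployments use AWS Nitro attestation documents and NVIDIA's GPU attestation tooling with AES-GCM, and here an HMAC with a shared vendor key and a hash-based stream cipher stand in so the example runs with only the Python stdlib.

```python
# Toy model of attestation-gated end-to-end encryption to an enclave.
# HMAC stands in for the hardware vendor's attestation signature;
# the hash-based keystream stands in for AES-GCM. Do not use as-is.
import hashlib
import hmac
import secrets

VENDOR_KEY = b"vendor-root-of-trust"  # stand-in for the vendor's signing key

def make_attestation(enclave_pubkey: bytes, measurement: bytes) -> bytes:
    # Enclave side: bind its ephemeral public key to its code measurement.
    return hmac.new(VENDOR_KEY, measurement + enclave_pubkey, hashlib.sha256).digest()

def verify_attestation(doc: bytes, enclave_pubkey: bytes, expected_measurement: bytes) -> bool:
    # Client side: accept the enclave's key only if the measurement matches
    # the model-server image the client expects to be talking to.
    expected = hmac.new(VENDOR_KEY, expected_measurement + enclave_pubkey, hashlib.sha256).digest()
    return hmac.compare_digest(doc, expected)

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Symmetric toy cipher: XOR with a SHA-256 counter-mode keystream.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# --- handshake ---
measurement = hashlib.sha256(b"model-server-image-v1").digest()
enclave_key = secrets.token_bytes(32)  # stand-in for an ECDH-derived shared key
attestation = make_attestation(enclave_key, measurement)

# Client refuses to send anything until attestation checks out.
assert verify_attestation(attestation, enclave_key, measurement)

prompt = b"summarize my medical records"
ciphertext = keystream_xor(enclave_key, prompt)  # only the attested enclave can decrypt
assert keystream_xor(enclave_key, ciphertext) == prompt
```

The key property being modeled: the client's plaintext is never visible to the cloud operator, because encryption is keyed to material that the attestation ties to a specific, measured enclave image rather than to the host.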
mnahkies 10 hours ago
I didn't actually realize that AWS supported this; I thought Azure was the only one offering it (https://azure.microsoft.com/en-us/blog/azure-confidential-co...). Are you speaking of this functionality? https://developer.nvidia.com/blog/confidential-computing-on-... (And am I just failing to find the relevant AWS docs?)
| ||||||||||||||
blueblimp 10 hours ago
And the most important thing about PCC, in my opinion, is not the technical aspect (though that's nice) but that Apple views user privacy as something good to be maximized. That differs from the view championed by OpenAI and Anthropic (and by now adopted by Google and virtually every other major LLM provider) that user interactions must be surveilled for "safety" purposes. The lack of privacy isn't due to a technical limitation; it's intentional, and they often brag about it.