henearkr 3 hours ago

Is there any evidence that the mechanism to do that is in place?

I think that would be widely decried especially on HN if that is one day implemented.

chrismorgan 3 hours ago | parent | next [-]

You need mechanisms to avoid the possibility. The mechanisms to do such things exist by default, by both the software provider (e.g. Proton) and the software distributor (e.g. Apple for App Store, Google for Play Store, Cloudflare or AWS for web stuff), and various countries have laws that allow them to secretly compel implementing specific backdoors.

In order to block the distributor from going rogue, you need to be able to guarantee that the user device can only install/run code signed by the provider, who must never give those keys to the distributor. My impression is that Android is the only major platform that ever had this, but that Google ruined it a few years ago in the name of lighter bundles by insisting that they hold the keys. (I once had VLC from Google Play Store, but replaced it with a build from F-Droid under the same app ID; Google Play Store shows it has an update for it, but that it can’t install it.)
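A minimal sketch of the pinning idea above, loosely modeled on how Android's signature check worked: the device refuses an update whose signing certificate fingerprint differs from the one the installed app carries. The certificate bytes and function names here are placeholders for illustration, not any real platform API.

```python
import hashlib

def cert_fingerprint(cert_der: bytes) -> str:
    """SHA-256 fingerprint of a (stand-in) DER-encoded signing certificate."""
    return hashlib.sha256(cert_der).hexdigest()

def update_allowed(installed_cert: bytes, update_cert: bytes) -> bool:
    # Pinning: only a build signed with the original provider's key is accepted.
    return cert_fingerprint(installed_cert) == cert_fingerprint(update_cert)

provider_cert = b"--provider certificate bytes--"     # placeholder data
distributor_cert = b"--distributor re-signing cert--" # e.g. a store holding its own keys

print(update_allowed(provider_cert, provider_cert))     # same signer: allowed
print(update_allowed(provider_cert, distributor_cert))  # re-signed build: rejected
```

This is exactly what breaks once the distributor holds the signing keys: the "re-signed" build then carries the expected fingerprint and the check passes.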

In order to block the provider or distributor sending specific users a different build, you need something more like Certificate Transparency logs: make it so that devices will only run packages that contains proof that they have been publicly shared. (This is necessary, but not sufficient.)
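The transparency-log idea can be sketched as a Merkle inclusion proof, simplified from how Certificate Transparency (RFC 6962) does it (no domain-separation prefixes, toy 4-leaf tree): the device only runs a package whose hash provably appears in a publicly published log.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def verify_inclusion(leaf: bytes, proof: list, root: bytes) -> bool:
    """Walk the proof from the leaf hash up to the published log root.
    Each step says whether the sibling hash sits to the left or right."""
    node = h(leaf)
    for side, sibling in proof:
        node = h(sibling + node) if side == "L" else h(node + sibling)
    return node == root

# Build a tiny 4-leaf log of package hashes to demonstrate.
leaves = [b"pkg-v1", b"pkg-v2", b"pkg-v3", b"pkg-v4"]
l0, l1, l2, l3 = (h(x) for x in leaves)
n01, n23 = h(l0 + l1), h(l2 + l3)
root = h(n01 + n23)  # this root is what gets published

# Proof that pkg-v3 is in the log: sibling l3 on the right, then n01 on the left.
proof = [("R", l3), ("L", n01)]
print(verify_inclusion(b"pkg-v3", proof, root))    # True: publicly shared build
print(verify_inclusion(b"pkg-evil", proof, root))  # False: targeted build, never logged
```

As the comment notes, this only proves the build was published, not that it is honest, hence necessary but not sufficient.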

And if you’re using web tech, the mechanisms required to preclude such abuse do not at this time exist. If you’re shipping an app by some other channel, it can do a resource integrity check and mandate subresource integrity. But no one does things that way—half the reason for using web tech is specifically to bypass slow update channels and distribute new stuff immediately!
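The integrity check described here works the same way browser Subresource Integrity does: pin a hash of each resource at release time and refuse to run anything that no longer matches. A hypothetical sketch (the resource contents are made up for illustration):

```python
import base64
import hashlib

def sri_hash(resource: bytes) -> str:
    """Compute an SRI-style digest string (same format as integrity="sha256-...")."""
    return "sha256-" + base64.b64encode(hashlib.sha256(resource).digest()).decode()

def load_if_intact(resource: bytes, expected: str) -> bool:
    # Refuse to run any resource whose hash differs from the shipped pin.
    return sri_hash(resource) == expected

pinned = sri_hash(b"console.log('hello')")  # pin baked into the app at release
print(load_if_intact(b"console.log('hello')", pinned))  # unchanged resource: runs
print(load_if_intact(b"exfiltrate(secrets)", pinned))   # swapped script: refused
```

The tension the comment points out is visible here: the pin freezes the resource, so the instant-update workflow that motivates web tech in the first place stops working.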

Cthulhu_ 3 hours ago | parent | prev [-]

Yes? A/B testing flags, auto-updates, server-side re-routing, etc. are just a few mechanisms off the top of my head that can do that.
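The A/B-flag mechanism is trivial to implement server-side, which is why it exists "by default" in any remotely configured app. A hypothetical sketch (user IDs and build names are made up):

```python
def build_for(user_id: str, flags: dict) -> str:
    # Everyone hits the same endpoint; only a flagged user receives the
    # modified payload, so audits of "the" public build see nothing unusual.
    return flags.get(user_id, "app-v1.2.3-release")

flags = {"target-user-42": "app-v1.2.3-extra-logging"}

print(build_for("ordinary-user", flags))   # the normal public build
print(build_for("target-user-42", flags))  # a different build, served silently
```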

The way to avoid it is to have locked-down, cryptographically verified software and connections.

izacus 3 hours ago | parent [-]

That's not evidence, that's conjecture again. Is there evidence that this kind of client push is actually used to extract data in these projects?

nextaccountic 3 hours ago | parent | next [-]

That's evidence for the mechanism, as asked.

The evidence that it's being actively used in the US is in the secret proceedings of a secret court. I kid you not, look up FISA warrant

Imustaskforhelp 3 hours ago | parent | prev [-]

Not sure if that counts as proper evidence, but I have seen some logs[0], albeit encrypted. From my understanding, they control the encryption keys, or at least certainly have the ability to change them (if they get hacked themselves, for example).

Would you like to see proper evidence of the logging policy? I can try to find that again if you or the HN community would be interested.

Edit: also worth pointing out that keeping timestamped logs is a form of metadata, which, depending on your threat model (journalism, etc.), can be very sensitive info.

[0]: another comment of mine here: https://news.ycombinator.com/item?id=47624960

izacus 2 hours ago | parent [-]

I'd like to see any kind of evidence that there's any substance to these accusations of services not actually being private - not just theorycrafting about mechanisms.

And how does that compare to other services we have available and that people actually use?