grafmax 3 days ago

Technology is insecure all the way down to the hardware. The structural cause of this is that companies aren’t held liable for insecure products, which are cheaper to build.

So companies’ profit motives contribute to this mess not just through the exploitation of open source labor (as you describe) but through externalizing security costs as well.

stingraycharles 3 days ago | parent [-]

Isn’t all this stuff with the Secure Enclave supposed to address these kinds of things?

It’s my take that over the past decade or so a lot of these companies have made things a lot better; Windows even requires Secure Boot these days.

acdha 3 days ago | parent | next [-]

They’re not the same problems. The Secure Enclave protects things like your biometrics, hardware-backed keys (e.g. on a Mac, WebAuthn and iCloud Keychain), and the integrity of the operating system, but not every bit of code running as your account. That means an npm install can’t compromise your OS to the point that you can’t recover control, but the attacker can still get everything you haven’t protected using sandbox features.
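
To make that concrete, here’s a rough Node/TypeScript sketch of what any postinstall script already running as your user can do without ever touching the Secure Enclave (the file paths are just common examples, not specific to any real attack):

    // Runs with the same privileges as you: no exploit needed, just fs access.
    import { readFileSync } from "node:fs";
    import { homedir } from "node:os";
    import { join } from "node:path";

    // Typical credential files a stealer would go after; adjust for your setup.
    const targets = [".npmrc", ".aws/credentials", ".ssh/id_ed25519"];

    for (const rel of targets) {
      try {
        const contents = readFileSync(join(homedir(), rel), "utf8");
        console.log(`readable: ~/${rel} (${contents.length} bytes)`);
      } catch {
        console.log(`not readable or missing: ~/${rel}`);
      }
    }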

That’s the path out of this mess: not just trying to catch it on npm, but moving sensitive data into OS-enforced sandboxes (e.g. Mac containers) so a process you start can’t just read a file and get keys; using sandboxing features in package managers themselves to restrict when new installs can run code and what they can do (e.g. changing the granularity from “can read any file accessible to the user” to “can read a configuration file at this location and data files selected by the user”); and tracking capability changes (“the leftpad update says it needs ~/.aws in this update?”).
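
Nothing like this ships in npm today, but as a sketch of what per-package capability declarations could look like (the manifest shape and every field name below are made up purely to illustrate the idea):

    // Hypothetical: a package declares up front what it needs, and the
    // package manager enforces it instead of granting ambient authority.
    interface CapabilityManifest {
      readPaths: string[];      // e.g. ["./config.json"] rather than "any file"
      network: string[];        // hosts the package may contact, if any
      runsInstallScripts: boolean;
    }

    const leftpadCaps: CapabilityManifest = {
      readPaths: [],            // pure string manipulation needs no fs access
      network: [],
      runsInstallScripts: false,
    };

    // A wrapper the runtime could expose instead of raw fs access.
    function assertAllowed(caps: CapabilityManifest, path: string): void {
      if (!caps.readPaths.includes(path)) {
        throw new Error(`capability violation: ${path} not declared`);
      }
    }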

We need to do that long-term, but it’s a ton of work since it breaks the general model of how programs work that we’ve used for most of the last century.

felixgallo 3 days ago | parent [-]

it's not clear that the solution to this problem is to create several additional layers of barn doors.

acdha 2 days ago | parent [-]

That doesn’t make sense: it’s like arguing that it wasn’t useful for ship design to adopt watertight compartments in addition to trying to avoid hitting things. You can spend a lot of effort trying to ensure bad code never arrives, but unless that’s perfect you also want to think about how to make it less catastrophic.

felixgallo 2 days ago | parent [-]

the proposed idea does not reduce the attack surface or make anything easier or less catastrophic.

acdha a day ago | parent [-]

You might want to reread more carefully. Using OS security features to restrict what the code you just installed can do prevents immediate attacks and gives you a chance to notice suspicious activity. If the only way to read a file is for the package to request permission and a scope, that gives you a chance to notice it (huh, why does tiny-color need ~/.github?) and also serves as a triage cue for scanning pipelines to flag updates, especially minor ones, that increase the scope of the requested permissions.
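
A scanning pipeline could key off exactly that signal. A rough sketch, again assuming a hypothetical declared-permissions field rather than anything npm actually ships:

    // Flag releases whose declared permissions widen, especially on
    // patch/minor bumps where new capabilities are least expected.
    type Permissions = { readPaths: string[]; network: string[] };

    function widenedScopes(prev: Permissions, next: Permissions): string[] {
      const added = (a: string[], b: string[]) => b.filter((x) => !a.includes(x));
      return [
        ...added(prev.readPaths, next.readPaths).map((p) => `read: ${p}`),
        ...added(prev.network, next.network).map((h) => `network: ${h}`),
      ];
    }

    // "huh, why does tiny-color need ~/.github?" becomes a machine-visible diff:
    const flags = widenedScopes(
      { readPaths: [], network: [] },
      { readPaths: ["~/.github"], network: ["exfil.example.com"] },
    );
    if (flags.length > 0) console.warn("review before updating:", flags);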

Using OS features to restrict access to sensitive data similarly gives you another chance to detect a compromise, because a denied attempt to, say, read your wallet by an app that has no business doing so is both highly visible and unambiguous.

felixgallo a day ago | parent [-]

I can read, thank you. The specific problems are that your “prevent immediate attacks” and “gives you a chance” are both doing significantly more work than you’d like to admit. A large project can use hundreds of npm packages, with the total dependency tree in the thousands. Your choices are either to give developers endless dialog fatigue on every single npm update, or to make security-weakening tradeoffs. And if you ever let any of those packages create a new window and draw to it, that’s game over. Even without malicious dialogs, users will continue to make bad choices: 99.9% of non-developer users and 99.8% of developer users will accept, or even broaden, insecure defaults when prompted.

The problem is coming from inside the house.
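
For a sense of scale on the dependency-count point, here’s a quick Node/TypeScript sketch that counts what actually lands in node_modules (run from a project root; it counts package.json files as a rough proxy, so treat the number as approximate):

    // Counts package.json files under node_modules as a proxy for how many
    // packages (and thus potential permission prompts) a project pulls in.
    import { readdirSync } from "node:fs";
    import { join } from "node:path";

    function countPackages(dir: string): number {
      let count = 0;
      for (const entry of readdirSync(dir, { withFileTypes: true })) {
        const full = join(dir, entry.name);
        if (entry.isDirectory()) count += countPackages(full);
        else if (entry.name === "package.json") count += 1;
      }
      return count;
    }

    console.log(`installed packages: ${countPackages("node_modules")}`);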

snickerdoodle14 3 days ago | parent | prev [-]

Not really; those technologies are basically designed to make it possible to enforce DRM remotely.

Secure Enclave = store the encryption keys to media in a place where you can't get them

Secure Boot = first step towards remote attestation so they can remotely verify you haven't modified your system to bypass the above

Advertising rules the world.

brookst 3 days ago | parent [-]

How is that different?

Is there such a thing as secure hardware that prevents supply chain attacks (by enabling higher layers to guarantee security) that isn’t also secure hardware that prevents copying data (by enabling higher layers to guarantee security)?

snickerdoodle14 3 days ago | parent [-]

Sure. Malware tends not to have physical hands that can touch the machine or press any buttons attached to it. Physical ownership should be true ownership, but they’re taking that away from you.