simpaticoder 4 days ago

I've come to the conclusion that avoiding the npm registry is a great benefit. The alternative is to import packages directly from the (git) repository. Apart from the registry being a major vector for supply-chain attacks like this one, there is also little or no coupling between the source of a project and its published code. The 'npm publish' step pushes local contents into the registry, meaning that a malefactor can easily make changes to the code before publishing.
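
For example (the repo URL and commit hash here are placeholders), package.json can pin a dependency straight to a commit in its repository:

    "dependencies": {
      "some-lib": "git+https://github.com/example/some-lib.git#0123abc"
    }

Pinning a full commit SHA, rather than a branch or tag, is what actually removes mutable references (and the registry) from the trust chain.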

HexDecOctBin 4 days ago | parent | next [-]

As a C developer, having been told for a decade that minimising dependencies and vendoring stuff straight from releases is obsolete and regressive, and now seeing people have the novel realisation that it isn't, is so, so surreal.

Although I'll still be told that using single-header libraries and avoiding the C standard library are regressive and obsolete, so gotta wait 10 more years I guess.

dpc_01234 4 days ago | parent | next [-]

NPM dev gets hacked, packages compromised, it's detected within a couple of hours.

XZ got hacked; it reached development versions of major distributions undetected, right inside _ssh_, and it was only detected because someone luckily noticed and investigated slow ssh connections.

Still, some C devs will think it's a great time to come out and boast about their practices and tooling. :shrug:

grayhatter 4 days ago | parent | next [-]

xz didn't get hacked (phished).

For xz, an advanced persistent threat inserted hyper-targeted, self-modifying code into a tarball.

A single npm dev was "hacked" (phished) by a moderate-effort (presumably drive-by) crypto thief.

I have no idea what you meant by "right inside _ssh_", but I don't think that's a good description of what actually happened in any possible case.

I'm unlikely to defend C development practices, but this doesn't feel like an indictment of C; if anything, the NPM ecosystem looks worse by this comparison. Especially considering the comment you replied to was advocating for minimizing dependencies: had the distros affected by the xz compromise followed that advice (instead of patching sshd), they wouldn't have shipped a compromised version.

typpilol 4 days ago | parent | prev [-]

Lol it's so true.. the C smugness is unmatched

1718627440 3 days ago | parent | prev | next [-]

This isn't part of the current discussion, but what is the appeal of single-header libraries?

Most of the time they are actually a normal .c/.h combo, but with the implementation moved into the "header" file and only exposed by defining some macro. When it really is a single file that can be included multiple times, there is still code in it, so it is a header file in name only.

What is the big deal in using the convention as intended and naming the file containing the code *.c? If it is intended only to be included, that can still be done.
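
For concreteness, the pattern I mean looks roughly like this (library and function names are made up):

    /* mylib.h -- illustrative stb-style single-header library.
       Every file includes the declarations; exactly one file also
       defines MYLIB_IMPLEMENTATION to pull in the definitions. */
    #ifndef MYLIB_H
    #define MYLIB_H
    int mylib_add(int a, int b);
    #endif /* MYLIB_H */

    #ifdef MYLIB_IMPLEMENTATION
    int mylib_add(int a, int b) { return a + b; }
    #endif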

> avoiding the C standard library are regressive and obsolete

I don't understand this either, since one half of libc is syscall wrappers and the other half is primitives which the compiler will substitute for your hand-rolled versions anyway. But this isn't harming anyone, and picking a good "core" library will probably make your code more consistent and readable.
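
On the substitution point: a hand-rolled copy loop like the sketch below is typically recognized by GCC and Clang at -O2/-O3 (depending on the compiler) and compiled into a call to libc's memcpy anyway, unless you build with -fno-builtin or -ffreestanding:

    #include <stddef.h>

    /* Naive byte copy: optimizers recognize this loop idiom and
       usually emit a memcpy call in its place. */
    void my_copy(char *dst, const char *src, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            dst[i] = src[i];
    }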

dzaima 2 days ago | parent [-]

With just a single file you can trivially use it such that everything is inlined (at least if it's the sort that declares all its functions static), even across multiple files using it, without needing the full compile-time destruction of LTO.

And generally it's one less file to deal with, and easier to copy-paste into your project. As a very minor security benefit, you'll also potentially look at arbitrary subsets of the contents every time you do a go-to-definition or use the header as docs (thus having chances to notice oddities), instead of only ever seeing a declarations-only header.
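
Concretely, with a variant of the made-up mylib.h sketched upthread that declares its functions static, every translation unit can compile its own copy:

    /* main.c -- defines the implementation macro itself; with
       static functions, other .c files can do the same, and each
       gets directly inlinable definitions, no LTO required. */
    #define MYLIB_IMPLEMENTATION
    #include "mylib.h"

    int main(void)
    {
        return mylib_add(2, 2) == 4 ? 0 : 1;
    }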

dboon 4 days ago | parent | prev [-]

Yeah lol I’m making a C package manager for exactly this. No transitive dependencies, no binaries served. Just pulling source code, building, and being smart about avoiding rebuilds.

eviks 4 days ago | parent [-]

Being smart about avoiding rebuilds is serving prebuilds

aabbccsmith 4 days ago | parent | prev | next [-]

npm's recent provenance feature fixes this, and it's pretty easy to set up. It will seriously help prevent things like this from ever happening again, and I'm really glad that big packages are starting to use it.
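
If I remember the setup right, it's roughly one flag (or the equivalent "publishConfig": { "provenance": true } in package.json):

    # in a GitHub Actions job that has `permissions: id-token: write`
    npm publish --provenance --access public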

billywhizz 4 days ago | parent [-]

> When a package in the npm registry has established provenance, it does not guarantee the package has no malicious code. Instead, npm provenance provides a verifiable link to the package's source code and build instructions, which developers can then audit and determine whether to trust it or not

OptionOfT 4 days ago | parent [-]

It prevents `npm publish` from publishing locally modified source code.

typpilol 4 days ago | parent | prev | next [-]

You can do some weird verify thing on your GitHub builds now when they publish to npm, but I've noticed you can still publish from elsewhere even with it pegged to a build?

But maybe I'm misunderstanding the feature

komali2 4 days ago | parent | prev | next [-]

Do you do this in your CI as well? E.g. if you have a server somewhere that would normally run `npm install` for builds, do you just `git clone` into your node_modules, or what?

cstrahan 4 days ago | parent | prev [-]

> The alternative is to import packages directly from the (git) repository.

That sounds great in theory. In practice, NPM is very, very buggy, and some of those bugs impact pulling deps from git repos. See my issue here: https://github.com/npm/cli/issues/8440

Here's the history behind that:

Projects with build steps were silently broken as late as 2020: https://github.com/npm/cli/issues/1865

Somehow no one thought to test this until 2020, and the entire NPM user base either didn't use the feature or couldn't be arsed to raise the issue before then.

The problem gets kinda sorta fixed in late 2020: https://github.com/npm/pacote/issues/53

I say kinda sorta fixed, because somehow they only fixed (part of) the problem when installing a package from git non-globally -- `npm install -g whatever` is still completely broken. Again, somehow no one thought to test this, I guess. The issue I opened, which I mentioned at the very beginning of this comment, addresses this bug.

Now, I say "part of the problem" was fixed because the npm docs blatantly lie to you about how prepack scripts work, which requires a workaround (which, again, only helps when not installing globally -- that's still completely broken); from https://docs.npmjs.com/cli/v8/using-npm/scripts:

    prepack
    
        - Runs BEFORE a tarball is packed (on "npm pack", "npm publish", and when installing a git dependencies).
Yeah, no. That's a lie. The prepack script (which would normally be used for triggering a build, e.g. TypeScript compilation) does not run for dependencies pulled directly from git.

Speaking of TypeScript, the TypeScript compiler developers ran into this very problem and adopted a workaround: a script invoked from the npm prepare script does some janky checks to guess whether execution is occurring from a source tree fetched from git and, if so, explicitly invokes the prepack script, which then kicks off the compiler and such. This is the workaround they use today:

https://github.com/cspotcode/workaround-broken-npm-prepack-b...

... and while I'm mentioning bugs, even that has a nasty bug: https://github.com/cspotcode/workaround-broken-npm-prepack-b...

Yes, if the workaround calls `npm run prepack` and the prepack script fails for some reason (e.g. a compiler error), the exit code is not propagated, so `npm install` will silently install the respective git dependency in a broken state.
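
To make the shape of that workaround concrete -- file names and the heuristic here are illustrative, not the actual upstream code:

    // package.json (sketch):
    //   "scripts": {
    //     "prepare": "node scripts/prepare-fallback.js",
    //     "prepack": "tsc -p ."
    //   }

    // scripts/prepare-fallback.js (sketch): a git checkout has a .git
    // directory while a registry tarball does not, so if we see one,
    // run prepack ourselves since npm won't.
    const { existsSync } = require('fs');
    const { execSync } = require('child_process');
    if (existsSync('.git')) {
      // execSync throws on a non-zero exit status, so (unlike the
      // buggy real-world workaround above) a failing build here
      // actually aborts the install.
      execSync('npm run prepack', { stdio: 'inherit' });
    }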

How no one looks at this and comes to the conclusion that NPM is in need of better stewardship, or ought to be entirely supplanted by a competing package manager, I dunno.