| ▲ | anematode 6 days ago |
| A lot of people will still use npm, so they'll be the canaries in the coal mine :) More seriously, automated scanners seem to do a good job already of finding malicious packages. It's a wonder that npm themselves haven't already deployed an automated countermeasure. |
|
| ▲ | kjok 5 days ago | parent | next [-] |
| > automated scanners seem to do a good job already of finding malicious packages. That's not true. This latest incident was detected by an individual researcher, just like many similar attacks in the past. Time and again, it's been people, not automated tools, who flagged these issues; security startups only reported on them afterward. Don't fall for the PR spin. If automated scanning were truly effective, we'd see deployments across all major package registries. The reality is, these systems still miss what vigilant humans catch. |
| |
| ▲ | kelnos 5 days ago | parent | next [-] | | > This latest incident was detected by an individual researcher So that still seems fine? Presumably researchers are focusing on latest releases, and so their work would not be impacted by other people using this new pnpm option. | |
| ▲ | hobofan 5 days ago | parent | prev | next [-] | | > If automated scanning were truly effective, we'd see deployments across all major package registries. No we wouldn't. Most package registries are run by either bigcorps at a loss or by community maintainers (with bigcorps again sponsoring the infrastructure). And many of them barely go beyond the "CRUD" of package publishing due to lack of resources. The economic incentives of building up supply chain security tools into the package registries themselves are just not there. | | |
| ▲ | kjok 5 days ago | parent [-] | | You're right that registries are under-resourced. But if automated malware scanning actually worked, we'd already see big tech partnering with package registries to run continuous, ecosystem-wide scanning and detection pipelines. That isn't happening. Instead, we see piecemeal efforts: Google with assurance artifacts (SLSA provenance, SBOMs, verifiable builds), Microsoft sponsoring OSS maintainers, Facebook donating to package registries. Google's initiatives stop short of claiming they can automatically detect malware. This distinction matters. Malware detection is, in the general case, an undecidable problem (think the halting problem and Rice's theorem). No amount of static or dynamic scanning can guarantee catching malicious logic in arbitrary code. At best, scanners detect known signatures, patterns, or anomalies; they can't prove the absence of malicious behavior. So the reality is: if even Google stops short of claiming automated malware detection is feasible, it's a stretch for anyone else to suggest registries could achieve it "if they just had more resources." The problem space itself is the blocker, not just a lack of infra or resources. | | |
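To make the undecidability point concrete, here is the textbook reduction sketched in TypeScript. `simulate` and `exfiltrate` are hypothetical stand-ins (the former cannot actually be written as described); this illustrates the argument, nothing more:

    // Pretend this perfectly simulates an arbitrary program on an input,
    // returning only if that program halts on it.
    declare function simulate(program: string, input: string): void;

    // The "malicious" payload a scanner is supposed to flag.
    function exfiltrate(): void {
      // e.g. POST process.env to an attacker-controlled server
    }

    // This function misbehaves if and only if `program` halts on `input`.
    // A scanner that could decide maliciousness for every such function
    // would also decide the halting problem -- which is impossible.
    function craftedPackage(program: string, input: string): void {
      simulate(program, input);
      exfiltrate();
    }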
| ▲ | motorest 5 days ago | parent [-] | | > But if automated malware scanning actually worked, we'd already see big tech partnering with package registries to run continuous, ecosystem-wide scanning and detection pipelines. I think this sort of thought process is misguided. We do see continuous, ecosystem-wide scanning and detection pipelines. For example, GitHub supports Dependabot, which runs supply chain checks. https://github.com/dependabot What you don't see is magical rabbits being pulled out of top hats. The industry has decades of experience with anti-malware tools in contexts where malware was never explicitly granted deployment or execution permissions, and yet it deploys and runs anyway. What do you expect if you make code intentionally installable and deployable, and capable of sending HTTP requests with any kind of data? Contrary to what you are implying, this is not a simple problem with straightforward solutions. The security model has been highly reliant on the role of gatekeepers, on both the producer and consumer sides. However, the latest batch of popular supply chain attacks circumvented the only failsafe in place. Beyond this point, you just have a module that runs unspecified code, just like any other module. |
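As a concrete (if partial) example of that kind of pipeline: enabling Dependabot version updates on a repo is a few lines of checked-in config. This is the standard .github/dependabot.yml format; the daily interval is just an illustrative choice:

    # .github/dependabot.yml -- asks GitHub to routinely check the repo's
    # npm dependencies and open PRs for available updates
    version: 2
    updates:
      - package-ecosystem: "npm"
        directory: "/"
        schedule:
          interval: "daily"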
|
| |
| ▲ | anematode 5 days ago | parent | prev [-] | | The latest incident was detected first by an individual researcher (haven't verified this myself, but trusting you here) -- or maybe s/he was just the fastest reporter in the west. Even simple heuristics, like flagging the sudden addition of high-entropy code, would have caught the most recent attacks, and obviously there are much better methods too. |
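A minimal sketch of that heuristic in TypeScript, assuming you can fetch and diff the published sources of two consecutive versions; the 80-char minimum and the 4.5 bits/char cutoff are illustrative values, not tuned ones:

    // Shannon entropy of a string, in bits per character.
    function shannonEntropy(s: string): number {
      const counts = new Map<string, number>();
      for (const ch of s) counts.set(ch, (counts.get(ch) ?? 0) + 1);
      let entropy = 0;
      for (const n of counts.values()) {
        const p = n / s.length;
        entropy -= p * Math.log2(p);
      }
      return entropy;
    }

    // Flag lines that are new in this release and look like packed or
    // obfuscated data rather than ordinary source code.
    function suspiciousAdditions(oldSrc: string, newSrc: string): string[] {
      const oldLines = new Set(oldSrc.split("\n"));
      return newSrc
        .split("\n")
        .filter((line) => !oldLines.has(line))
        .filter((line) => line.length > 80 && shannonEntropy(line) > 4.5);
    }

Base64- or hex-packed payloads sit well above ordinary source code on this measure, which is why even a naive check like this catches crude obfuscation (and why padding with low-entropy filler would evade it).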
|
|
| ▲ | mcintyre1994 6 days ago | parent | prev | next [-] |
| In the case of the chalk/debug etc. hack, the first detection seemed to come from a CI build failure it caused: https://jdstaerk.substack.com/p/we-just-found-malicious-code... |
| > It started with a cryptic build failure in our CI/CD pipeline, which my colleague noticed |
| > This seemingly minor error was the first sign of a sophisticated supply chain attack. We traced the failure to a small dependency, error-ex. Our package-lock.json specified the stable version 1.3.2 or newer, so it installed the latest version 1.3.3, which got published just a few minutes earlier. |
| |
| ▲ | DougBTX 6 days ago | parent [-] | | > Our package-lock.json specified the stable version 1.3.2 or newer Is that possible? I thought lock files restricted installs to a specific version with an integrity check hash. Is it possible that it would install a newer version which doesn't match the hash in the lock file? Or do they just mean package.json here? | | |
| ▲ | streptomycin 6 days ago | parent | next [-] | | If they were for some reason doing `npm install` rather than `npm ci`, then `npm install` does update packages in the lock file. Personally I always found that confusing, and yarn/pnpm don't behave that way. I think most people do `npm ci` in CI, unless they are specifically using CI to test whether `npm install` still works, which I guess might be a good idea if you use npm, since it doesn't like obeying the lock file. | |
| ▲ | Rockslide 6 days ago | parent [-] | | How does this get repeated over and over, when it's simply not true? At least not anymore. npm install will only update the lockfile if you make changes to your package.json. Otherwise, it will install the versions from the lockfile. | | |
| ▲ | mirashii 5 days ago | parent | next [-] | | > How does this get repeated over and over, when it's simply not true? Well, for one, the behavior is somewhat insane. `npm install` with no additional arguments does update the lockfile if your package.json and your lockfile are out of sync with one another for any reason, so to guarantee that it doesn't change your lockfile, you need additional configuration or some external mechanism that keeps package.json and the lockfile in sync. For this reason alone, the advice of "just don't use npm install, use npm ci instead" is still extremely valid; you'd really like this to fail fast if you get out of sync. `npm install additional-package` also updates your lock file. Other package managers distinguish these two operations, with the one that adds a new dependency being called "add" instead of "install". The docs add to the confusion. https://docs.npmjs.com/cli/v11/commands/npm-install#save suggests that writing to package-lock.json is the default and you need to change configuration to disable it. The notion that it won't change your lockfile if package.json and package-lock.json are already in sync is not actually spelled out clearly anywhere on the page. > At least not anymore. You've partially answered your own question here. | |
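For the record, the difference in plain commands (current npm behavior, no extra flags assumed):

    # Installs exactly what package-lock.json records, never rewrites the
    # lockfile, and errors out if package.json and the lockfile disagree.
    npm ci

    # Reconciles package.json with the lockfile; if the two are out of
    # sync, it re-resolves versions and rewrites package-lock.json.
    npm install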
| ▲ | Rockslide 5 days ago | parent [-] | | > You've partially answered your own question here. Is that the case? If it were ever true (outside of outright bugs in npm), it must have been many many years and major npm releases ago. So that doesn't justify brigading outdated information. | | |
| ▲ | chowells 5 days ago | parent [-] | | I mean, it's my #1 experience using npm. Not once have I run `npm install` and had a result other than it changing the lockfile. Maybe you want to blame this on the tools I used, but I followed the exact installation instructions of the project I was working on. If it's that common to get it "wrong", it's the tool that is wrong. |
|
| |
| ▲ | streptomycin 5 days ago | parent | prev | next [-] | | My bad, it really annoyed me when npm stopped respecting lockfiles years ago, so I stopped using it. It's great news that they eventually changed their mind. However, in the rare cases where I am forced to use it to contribute to some npm-using project, I have noticed that the lockfile often gets updated and I get a huge diff even though I didn't edit the dependencies. So I've always assumed that was the same issue with npm ignoring the lockfile, but maybe it's some other issue? idk | |
| ▲ | Rockslide 5 days ago | parent [-] | | Well, there are other lockfile updates as well, which aren't dependency version changes either. E.g. if the lockfile was created with an older npm version, running npm install with a newer npm version might upgrade it to a newer lockfile format and thus produce a huge diff. But that wouldn't change anything about the versions used for your dependencies. |
| |
| ▲ | cluckindan 5 days ago | parent | prev [-] | | Are you 100% on that? | | |
| ▲ | Rockslide 5 days ago | parent [-] | | Yes. As someone who's using npm install daily, and given the update cadence of npm packages, I would end up with dirty lock files very frequently if the parent statement were true. It just doesn't happen. |
|
|
| |
| ▲ | hobofan 5 days ago | parent | prev | next [-] | | Since nobody else has answered your question: > Do they just mean package.json here? Yes, most likely. A package-lock.json always specifies an exact version with a hash, never a "version X or newer" range. | |
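Concretely, the difference the parent is pointing at, with illustrative values (the integrity hash is elided, and the lockfile shape is simplified from the lockfileVersion 3 format):

    package.json holds a semver range ("1.3.2 or anything newer within 1.x"):

        "dependencies": {
          "error-ex": "^1.3.2"
        }

    package-lock.json pins an exact version plus an integrity hash:

        "node_modules/error-ex": {
          "version": "1.3.2",
          "resolved": "https://registry.npmjs.org/error-ex/-/error-ex-1.3.2.tgz",
          "integrity": "sha512-..."
        }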
| ▲ | Mattwmaster58 6 days ago | parent | prev [-] | | > Is that possible? This comes up every time npm install is discussed. Yes, npm install will "ignore" your lockfile and install the latest dependencies it can that satisfy the constraints of your package.json. Yes, you should use npm clean-install. One shortcoming is that the implementation insists on deleting the entire node_modules folder, so package installs can take quite a bit of time, even when all the packages are being served from the npm disk cache: https://github.com/npm/cli/issues/564 |
|
|
|
| ▲ | 6 days ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | vasachi 6 days ago | parent | prev [-] |
| If only there were a high-ranking official at Microsoft who could prioritize security[1]! /s |
| [1] https://blogs.microsoft.com/blog/2024/05/03/prioritizing-sec... |