jefozabuss 3 days ago

Seems like people already forgot about Jia Tan.

By the way, why doesn't npm already have a system in place to flag sketchy releases, where most of the code looks normal but there's newly added obfuscated code with hexadecimal variable names and array lookups for execution...

mystifyingpoi 3 days ago | parent | next [-]

Detecting sketchy-looking hex code should be pretty straightforward, but then I imagine there are ways to make sketchy code look non-sketchy, which attackers would immediately start using. I can imagine a big JS function that pretends to do legit data manipulation but builds the payload in the process.

hombre_fatal 3 days ago | parent | next [-]

Yeah, it's merely a fluke that the malware author used some crappy online obfuscator that created those hex variable names. It would have been less work and less suspicious if they had just kept their original semantic variable names like “originalFetch”.

nicce 3 days ago | parent | prev | next [-]

It is just about bringing classic non-signature-based antivirus scanning to the release cycle. Hard to say how useful it would be; it usually turns into the same endless cat-and-mouse game as everything else.

Cthulhu_ 3 days ago | parent | prev | next [-]

It wouldn't be just one signal but several: a mere patch version that adds several kilobytes of code, unusually long lines, a release after a long silent period, and so on.

cluckindan 3 days ago | parent | prev | next [-]

A complexity per line check would have flagged it.

Even a max line length check would have flagged it.
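
Either check is only a few lines of Node. A minimal sketch (not an actual npm feature; the threshold and invocation are made up):

    // check-lines.js: flag source files whose longest line exceeds a length
    // that hand-written JS rarely hits. MAX_LINE is an illustrative cutoff.
    const fs = require("fs");

    const MAX_LINE = 500;
    for (const file of process.argv.slice(2)) {
      const longest = fs
        .readFileSync(file, "utf8")
        .split("\n")
        .reduce((max, line) => Math.max(max, line.length), 0);
      if (longest > MAX_LINE) {
        console.warn(`${file}: longest line is ${longest} chars, flagging for review`);
      }
    }

Run against an unpacked tarball, e.g. node check-lines.js $(find package -name '*.js').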

chatmasta 3 days ago | parent [-]

That would flag a huge percentage of JS packages that ship with minified code.

cluckindan 3 days ago | parent | next [-]

Why would you be including minified code in a build? That’s just bad practice and makes development-time debugging more difficult.

saghm 2 days ago | parent | prev | next [-]

It's not like minified JS can't be parsed and processed as an AST. You could still pretty easily split up each statement/assignment and check the length of each one individually.
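
For example, with a stock ES parser like acorn. The threshold and the node-type filter below are illustrative, and a real tool would skip statements that merely wrap other flagged statements:

    // ast-check.js: even if the whole file is one minified line, measure each
    // statement/assignment by its source offsets and flag unusually large ones.
    const fs = require("fs");
    const { parse } = require("acorn");   // npm install acorn acorn-walk
    const walk = require("acorn-walk");

    const src = fs.readFileSync(process.argv[2], "utf8");
    const ast = parse(src, { ecmaVersion: "latest", sourceType: "module" });

    const MAX_CHARS = 2000; // made-up cutoff for a single statement
    walk.full(ast, (node) => {
      if (node.type.endsWith("Statement") || node.type === "VariableDeclaration") {
        const size = node.end - node.start; // acorn nodes carry source offsets
        if (size > MAX_CHARS) {
          console.warn(`${node.type} at offset ${node.start} spans ${size} chars`);
        }
      }
    });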

jay_kyburz 3 days ago | parent | prev [-]

How are people verifying their dependencies if they are minified?

SethTro 3 days ago | parent | next [-]

That's the magic part: they aren't.

chatmasta 3 days ago | parent | prev [-]

My guy… in the JS ecosystem a “lock file” is something that restricts your package installer to an arbitrary range of packages, i.e. no restrictions at all and completely unpredictable. You have to go out of your way to “pin” a package to a specific version.

Izkata 3 days ago | parent [-]

Lockfiles use exact hashes, not versions/version ranges. JavaScript projects use two files: a package file with version ranges (used when upgrading) and a lockfile with the exact versions and hashes (used in general when installing in an existing project).
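
For reference, a modern package-lock.json entry records the exact resolved tarball and an integrity hash (the digest below is a placeholder):

    "node_modules/duckdb": {
      "version": "1.3.3",
      "resolved": "https://registry.npmjs.org/duckdb/-/duckdb-1.3.3.tgz",
      "integrity": "sha512-<base64 digest of the published tarball>"
    }

npm ci installs exactly what the lockfile describes and errors out if package.json and the lockfile disagree, which is the behaviour most people expect from a "lock".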

chatmasta 3 days ago | parent | next [-]

Sure, but a lockfile with a hash doesn't mean the next install will fail if it resolves a version of the package without that hash. If your package.json specifies a semver range, it'll pull the latest minor or patch version (which is what happened in this case with e.g. duckdb@1.3.3) and ignore any hash differences once the version has changed. Hence why I say you need to go out of your way to specify an exact version in package.json; then the lock file will work as you might expect a “lock” file to work. (Back when I was an engineer and not a PM with deteriorating coding ability, I had to make a yarn plugin to pin each of our dependencies.)

The best way to manage JS dependencies is to pin them to exact versions and rely on renovate bot to update them. Then at least it’s your choice when your code changes. Ideally you can rebuild your project in a decade from now. But if that’s not possible then at least you should have a choice to accept or decline code changes in your dependencies. This is very hard to achieve by default in the JS ecosystem.
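
With npm that's mostly one setting plus the right install command in CI (yarn and pnpm have equivalents):

    # write exact versions ("1.3.2") instead of ranges ("^1.3.2") on install
    npm config set save-exact true

    # in CI, install strictly from the lockfile and fail on any mismatch
    npm ci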

jay_kyburz 3 days ago | parent [-]

I think at some point you would be better off vendoring them in.

chatmasta 3 days ago | parent | next [-]

That's effectively what I did, in a very roundabout way, with Docker images and caching that ended up abusing the GitLab free tier for image hosting. When you put it like that, it does make me think there was a simpler solution, lol.

When I’m hacking on a C project and it’s got a bunch of code ripped out of another project, I’m like “heh, look at these primordial dependency management practices.” But five years later that thing is gonna compile no problem…

cluckindan 3 days ago | parent | prev [-]

There’s even a command for that: npm pack
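
Roughly (duckdb is just an example here):

    # download the exact published tarball without installing it
    npm pack duckdb@1.3.3          # writes duckdb-1.3.3.tgz to the current dir

    # vendor the tarball and install from it, so the registry is out of the
    # loop for this package (its own dependencies still resolve normally)
    npm install ./duckdb-1.3.3.tgz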

zdragnar 3 days ago | parent | prev [-]

NPM is rather infamous for not exactly respecting the lockfile, however.

cchance 3 days ago | parent | prev [-]

Feels like a basic lightweight 3B AI model could easily spot shit like this on commit.

tom1337 3 days ago | parent | prev | next [-]

It would also be great if a release had to be approved by the maintainer via a second factor or an e-mail verification. Once a release has been published to npm, you'd have an hour to verify it by clicking a link in an email and then entering another 2FA code (a separate OTP from the login one, a passkey, a YubiKey, whatever). That would also prevent publishing with lost access keys. If you don't verify the release within the first hour, it gets deleted and never published.

naugtur 3 days ago | parent | next [-]

That's why we never went with using keys in CI for publishing. Local-machine publishing requires 2FA.

Automated publishing should use something like PagerDuty to signal to a group of maintainers that a version is being published, and it should require an approval to go through. Any one of them could veto within 5 minutes.

But we don't have that, so gotta be careful and prepare for the worst (use LavaMoat for that).

Cthulhu_ 3 days ago | parent | prev [-]

Not through e-mail links though; that's what caused this in the first place. E-mail notification, sure, but they should also run phishing-training mails: make them look legit, but if people click the link, tell them that NPM will never send them an email with a link.

dist-epoch 3 days ago | parent | prev | next [-]

> flag sketchy releases

Because the malware writers will keep tweaking the code until it passes that check, just like virus writers submit their viruses to VirusTotal until they are undetected.

galaxy_gas 3 days ago | parent [-]

It's typical for the virus writers to use their own service: there are criminal VirusTotal clones that run many AV engines in VMs and return the results. They exist because VirusTotal shares all uploaded binaries, so anything uploaded to VirusTotal will be detected shortly afterwards if it isn't already.

47282847 3 days ago | parent [-]

Isn't it still the case that when signatures are eventually added, it turns out the malware had already been uploaded months before? Or did that change?

AtNightWeCode 3 days ago | parent | prev | next [-]

The problem is that it is even possible to push builds from dev machines.

madeofpalk 3 days ago | parent [-]

With NPM now supporting OIDC, you can just turn this off: https://docs.npmjs.com/trusted-publishers
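
For reference, a rough sketch of a GitHub Actions publish job under trusted publishing, assuming the package has already been registered as a trusted publisher on npmjs.com (workflow name, trigger, and versions are illustrative):

    # .github/workflows/publish.yml -- no long-lived NPM_TOKEN stored anywhere
    name: publish
    on:
      release:
        types: [published]
    jobs:
      publish:
        runs-on: ubuntu-latest
        permissions:
          id-token: write   # lets the job exchange an OIDC token for a short-lived npm credential
          contents: read
        steps:
          - uses: actions/checkout@v4
          - uses: actions/setup-node@v4
            with:
              node-version: 22
              registry-url: https://registry.npmjs.org
          - run: npm install -g npm@latest   # trusted publishing needs a recent npm CLI
          - run: npm ci
          - run: npm publish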

hulitu 2 days ago | parent | prev [-]

> By the way, why doesn't npm already have a system in place to flag sketchy releases

Because nobody gives a fsck. Normally, after npm was filled with malware, people would avoid it. But it seems that nobody (distro maintainers) cares. People get what they asked for (malware).