em-bee, a day ago:

Which distributions? And did you submit the packages yourself, or did someone else from the distribution do the work?

Yes, there is a trust relationship, but from what I have seen of the submission process in Debian, you can't just sign up and start uploading packages. A submitter receives mentoring, and their initial packages are reviewed until it is established that they have learned how to do things and can be trusted to handle packages on their own. They get GPG keys to sign the packages, and those keys are signed by other Debian members; an in-person meeting may even be required if the person is not already known to their mentors somehow. Every new package is vetted too, and submitters are trusted to handle updates on their own only once they have completed the mentoring process. Fedora and Ubuntu should be similar; I don't know about others.

In the distribution where I contributed (Foresight), we only packaged applications that were known and packaged in other distributions. Sure, if an app developer went rogue we might not have noticed, and maybe Debian could suffer the same fate, but that process is still much more involved than just letting anyone register an account and upload their own packages without any oversight at all.
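To make the contrast concrete: for a source distribution, "pip install" executes the package's setup.py, so an unreviewed upload can run arbitrary code on every machine that installs it. A minimal sketch, assuming a setuptools-based sdist; the package name and file path are hypothetical:

    # setup.py -- anything at module level runs during `pip install`
    # of an sdist (wheels skip this step, but sdists are common).
    from setuptools import setup

    # Hypothetical payload: proof that installation executed our code.
    with open("/tmp/ran-at-install-time", "w") as f:
        f.write("executed during pip install\n")

    setup(
        name="innocuous-package",  # hypothetical name
        version="0.1",
    )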
woodruffw, a day ago:

> Did someone else from the distribution do the work?

Someone else. To be clear: I find the Debian maintainers trustworthy. But I don't think they're equipped to adequately review the existing volume of packages to the degree that I would believe an assertion of security/non-maliciousness, much less the volume that would come with re-packaging all of PyPI.

(I think the xz incident demonstrated this tidily: the backdoor wasn't caught by distro code review, but by a performance regression.)
jowea, a day ago:

I've contributed some packages to NixOS. I didn't do code review, and as far as I can tell nothing told me I had to. I assume that if I had pointed the derivation at mispeledwebsite.co.cc/foo instead of github.com/foo/foo, or done something obviously malicious like that, the maintainers would have sanity-checked and stopped me, but I don't think anyone does code review for a random, apparently useful package. And if github.com/foo/foo is malicious code, then it's going to go right through.

And isn't the Debian mentoring and reviewing merely about checking that the package is properly packaged into the Debian format, properly installs, includes its dependencies, and so on? I don't think there is anything actually stopping some apparently-safe code from ending up in Linux distros, except the vague sense of "given enough eyeballs, all bugs are shallow", i.e. that with everyone using the same package, someone is going to notice something, somehow.
codedokode, a day ago:

Maybe Linux distributions should do it the other way around: they don't need to provide every software package via trusted repositories. Instead, they should provide a small set of trusted packages to create a standard execution environment, and run everything else in a sandbox. This way one can install any third-party software safely, maintainers have less work to do, and software developers don't need to create a package for every one of hundreds of distributions.
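A minimal sketch of what that could look like, assuming the bubblewrap (bwrap) sandboxing tool is installed; the application name and sandbox-home path are placeholders:

    import subprocess

    # Sketch: run an untrusted app with a read-only view of the trusted
    # base system, a private home directory, and no network or host
    # process access.
    subprocess.run([
        "bwrap",
        "--ro-bind", "/usr", "/usr",      # standard environment, read-only
        "--symlink", "usr/bin", "/bin",
        "--symlink", "usr/lib", "/lib",
        "--symlink", "usr/lib64", "/lib64",
        "--proc", "/proc",
        "--dev", "/dev",
        "--tmpfs", "/tmp",
        "--unshare-all",                  # fresh namespaces: no network, PIDs, IPC
        "--bind", "/home/user/sandbox-home", "/home/user",  # private writable home
        "untrusted-app",                  # placeholder binary name
    ], check=True)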
em-bee, 9 hours ago:

That is actually happening. Fedora has Copr, where anyone can create a personal repository to upload their packages. SUSE has something similar (the Open Build Service), which surprisingly is able to support multiple major distributions, including Debian. Those are effectively repo-hosting services that anyone could provide.

The difference from PyPI/npm/RubyGems etc. is that nobody would upload a package to a personal repo with a dozen dependencies from other personal repos. When I install a Copr package, I can be sure that all dependencies come either from the trusted Fedora repo or from that same personal repo. That means I only need to trust that one developer alongside the official distribution, unlike npm or PyPI, where I have to trust that each submitter vetted their own dependencies, or vet them myself, which is also unrealistic.
codedokode, 5 hours ago:

No, they are not. Neither Fedora nor Debian has any sandboxing, and if you add a third-party repository, it gets root access to your system and can run arbitrary scripts when installing or updating software.

Also, what I meant is a "standard execution environment", so that the developer doesn't need to make a separate version for each Linux distribution and doesn't have to make repositories at all.
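For concreteness, the mechanism being described: dpkg executes a package's maintainer scripts as root at install time, and any enabled repository can ship one. A hedged sketch that builds such a package; the package name and paths are hypothetical:

    import os

    # Sketch: a minimal Debian package whose DEBIAN/postinst script
    # dpkg will execute as root during installation.
    os.makedirs("demo-pkg/DEBIAN", exist_ok=True)

    with open("demo-pkg/DEBIAN/control", "w") as f:
        f.write("Package: demo-pkg\n"
                "Version: 0.1\n"
                "Architecture: all\n"
                "Maintainer: nobody <nobody@example.invalid>\n"
                "Description: demonstrates maintainer-script execution\n")

    with open("demo-pkg/DEBIAN/postinst", "w") as f:
        f.write("#!/bin/sh\n"
                "set -e\n"
                "# dpkg runs this as root:\n"
                "touch /root/ran-as-root\n")
    os.chmod("demo-pkg/DEBIAN/postinst", 0o755)  # maintainer scripts must be executable

    # Then: dpkg-deb --build demo-pkg && sudo apt install ./demo-pkg.deb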
em-bee, 24 minutes ago:

Sorry, I misread that. I thought you were just talking about the trust and vetting issue; I glossed over "sandboxing".

Sandboxing apps is what Android does, and I think NixOS and also Flatpak, etc. With Flatpak, that approach is effectively already possible and, in a way, already in the works. But it's a different approach, and one that I don't like at all, because it is far too heavy-handed and makes interoperability between apps very difficult. It also doesn't solve the trust problem, because at the end of the day the sandboxed app still needs access to my data, so I still need to trust it.

However, that is entirely beside the point, because we are really talking about improving trust with PyPI and npm and the like. Sandboxing is simply not possible there, because these are mostly libraries to be used for development of larger apps. The approach distributions are using now would be useful here.
|
|
|
ajross, a day ago:

> I don't think the majority of my code was actually reviewed for malware.

That's not the model, though. Your packages weren't included ab initio, were they? They were included once a Debian packager or whoever decided they were worth including. And how did that happen? Because people were asking for them, having already consumed and contributed to and "reviewed" them. Or if they didn't, an upstream dependency of theirs did.

The point is that the process of a bunch of experts pulling stuff directly from GitHub in source form and arguing over implementation details and dependency alternatives constitutes review. And, quite frankly, really good review relative to what you'd get if you asked a "security expert" to look at the code in isolation.

It's not that it's impossible to pull one over on the global community of Python developers in toto. But it's really fucking hard.
woodruffw, a day ago:

> The point is that the process of a bunch of experts pulling stuff directly from GitHub in source form and arguing over implementation details and dependency alternatives constitutes review. And, quite frankly, really good review relative to what you'd get if you asked a "security expert" to look at the code in isolation.

The thing is, I don't think that's what's happening in 2025. It might have been what was happening 20 years ago, but I didn't experience any pushback over my (very large) dependency tree when my projects were integrated. Lots of distros took a look at it, walked the tree, rolled everything up, and called it a day. Nobody argued about dependency selection, staleness, relative importance, etc. Nobody has time for that.

> It's not that it's impossible to pull one over on the global community of Python developers in toto. But it's really fucking hard.

I don't think this is true; at the periphery, ~nobody is looking at core dependencies. We can use the frequency of "obvious" vulnerabilities in core packages as a proxy for how likely someone would be to discover an intentional deception: CVE-2024-47081 was in requests for at least a decade before anybody noticed it. Last time I checked, the introduction-to-discovery window for use-after-free vulnerabilities in Linux itself was still several years.

(This is true even in the simplest non-code sense: I maintain a lot of things and have taken over a lot of things, and nobody notices as long as the releases keep coming! This is what the Jia Tan persona recognized.)
ajross, 20 hours ago:

That's sort of a double standard, though. No, Debian et al. aren't perfect, and there are ways that serious bugs can and do make it through to production systems. But very, very few of them are malicious exploits. The xz-utils mess last year was a very notable, deliberate attack that took years of planning to get an almost undetectable exploit into a core Linux library. And. It. Failed. Debian caught it.

So no, not perfect. But pretty good, and I trust them and their track record. That's a very different environment than "Here guys, we'll send your code all over the world. But no Russian emails please. Thx."
|
|
|