| ▲ | rep_lodsb 20 hours ago |
| So every Linux distribution should compile and distribute packages for every single piece of open source software in existence, both the very newest stuff that was only released last week, and also everything from 30+ years ago, no matter how obscure. Because almost certainly someone out there will want to use it. And they should be able to, because that is the entire point of free software: user freedom. |
|
| ▲ | kwanbix 17 hours ago | parent | next [-] |
| I am not an expert on this, but my question is: how does Windows manage to achieve it? Why can't Linux do the same? |
| |
| ▲ | johnny22 11 hours ago | parent [-] |
| Because they care about ABI/API stability. |
| ▲ | nineteen999 5 hours ago | parent [-] |
| And have an ever-decreasing market share in the desktop, hypervisor, and server space. API/ABI stability is probably the only thing stemming the customer leakage at all. It's not the be-all and end-all. |
|
|
|
| ▲ | rixed 9 hours ago | parent | prev | next [-] |
| Those users will either check the source code and compile it themselves, with all the proper options to match their system, or rely on a software distribution to do it for them. People who are complaining would prefer a world of isolated apps downloaded from signed stores, but Linux was born at an optimistic time when the goal was software that cooperates and forms a system, and whose distribution does not depend on a central trusted platform. I do not believe there is any real technical issue being discussed here, just drastically different goals. |
| |
| ▲ | ogogmad 3 hours ago | parent [-] |
| No. People would prefer the equivalent of double-clicking `setup.exe`. Were you being serious? |
|
|
| ▲ | kccqzy 16 hours ago | parent | prev | next [-] |
| Your tone makes it sound like this is a bad thing. But from a user’s perspective, I do want a distro to package as much software as possible. And it has nothing to do with user freedom. It’s all about being entitled as a user to have the world’s software conveniently packaged. |
| |
| ▲ | Rohansi 15 hours ago | parent | next [-] |
| Software installed from your package manager is almost certainly provided as a binary already. You could package a .exe file and that should work everywhere WINE is installed. |
| ▲ | kccqzy 10 hours ago | parent [-] |
| That's not my point. My point is that if executable A depends on library B, and library B does not provide a stable ABI, then the package manager will take care of updating A whenever it updates B. Windows has a fanatical commitment to ABI stability, so the situation above does not even occur. As a user, all the hard work of dealing with ABI breakages on Linux is done by the people managing the software repos, not by the user or by the developer. I'm personally very appreciative of this fact. |
| ▲ | Rohansi 3 hours ago | parent [-] |
| Sure, it's better than nothing, but it's certainly not ideal. How much time and energy is being wasted by libraries like that? Wouldn't it be better if library B had a stable ABI or was versioned? Is there any reason it needs to work like this? |
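To make the A/B scenario above concrete, here is a minimal sketch in C of one common kind of ABI break: a library inserts a struct field, the source still compiles, but binaries built against the old header now use the wrong offsets. The library name "libb" and its fields are hypothetical, purely for illustration.

    /* abi_break.c -- why inserting a struct field breaks binary compatibility. */
    /* "libb" and its field names are hypothetical, for illustration only.      */
    #include <stddef.h>
    #include <stdio.h>

    /* Layout that application A was compiled against (libb v1). */
    struct b_config_v1 {
        int verbosity;
        int timeout_ms;
    };

    /* Layout after a libb update inserts a field (libb v2). A's source still  */
    /* compiles, but the field offsets change underneath existing binaries.    */
    struct b_config_v2 {
        int verbosity;
        int use_color;   /* new field */
        int timeout_ms;  /* same name, different offset */
    };

    int main(void)
    {
        /* An A binary built against v1 keeps using the old offset, so with    */
        /* libb v2 loaded it would read/write use_color where it expects       */
        /* timeout_ms -- a silent ABI break with no compile error anywhere.    */
        printf("timeout_ms offset in v1: %zu\n", offsetof(struct b_config_v1, timeout_ms));
        printf("timeout_ms offset in v2: %zu\n", offsetof(struct b_config_v2, timeout_ms));
        return 0;
    }

The distro-level answer is exactly what is described above: rebuild A against the new header whenever B changes, and ship both as a matched set.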
|
| |
| ▲ | grishka 13 hours ago | parent | prev [-] |
| What if you want to use a newer or older version of just one package without having to update or downgrade the entire goddamn universe? What if you need to use proprietary software? I've had so much trouble with package managers that I'm not even sure they are a good idea to begin with. |
| ▲ | prmoustache 2 hours ago | parent | next [-] |
| That is the point of Flatpak or AppImage, but even before those existed you could do it by shipping the libraries with your software and using LD_LIBRARY_PATH to make your binary load them. That is what most well-packaged proprietary software used to do when installing into /opt. |
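A minimal sketch of that /opt + LD_LIBRARY_PATH pattern, written as a tiny C launcher. The install prefix /opt/example and the binary name are hypothetical; in practice this wrapper is usually a small shell script, but the idea is the same.

    /* launcher.c -- point the dynamic linker at the bundled libraries,       */
    /* then hand control to the real executable installed under /opt.         */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    int main(int argc, char *argv[])
    {
        (void)argc;

        /* Prefer the libraries shipped alongside the application.            */
        /* (A fuller version would append the user's existing value           */
        /* instead of overwriting it.)                                        */
        setenv("LD_LIBRARY_PATH", "/opt/example/lib", 1);

        /* Replace this process with the real binary, preserving arguments.   */
        execv("/opt/example/bin/example-real", argv);

        perror("execv");  /* only reached if exec failed */
        return 1;
    }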
| ▲ | Maskawanian 12 hours ago | parent | prev [-] |
| I know you are trying to make a point about complexity, but that is literally what NixOS allows for. |
|
|
|
| ▲ | realusername 19 hours ago | parent | prev [-] |
| Not sure if it's the right solution, but yes, it's a description of what happens in practice right now. |
| |
| ▲ | bruce511 18 hours ago | parent [-] |
| It also makes support more or less impossible. Even if we ship as source, even if the user has the skills to build it, even if the makefile supports every version of the kernel, plus every other material variation, plus who knows how many dependencies, what exactly am I supposed to do when a user reports, "I followed your instructions and it doesn't run"? The Linux desktop fails because it's not one thing, it's a hundred things, and to get anything to run reliably on 95 of them you need to be extremely competent. Distribution as source fails because there are too many unknown, interdependent parts. Distribution as binary containers (Docker et al.) is popular because it gives the app a fighting chance, while at the same time being a really ugly hack. |
| ▲ | tuna74 6 hours ago | parent | next [-] |
| Then you only support one distro. If anyone wants to use your software on an unsupported distro, they can figure out the rest themselves. |
| ▲ | josephg 17 hours ago | parent | prev [-] |
| Yep. But Docker doesn't help you with desktop apps. And everything becomes so big! I think Rob Pike has the right idea with Go: just statically link everything wherever possible. These days I try to do the same, because so much less can go wrong for users. People don't seem to mind downloading a 30 MB executable, so long as it actually works. |
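The same static-linking idea, sketched in C rather than Go (assumes gcc and static libc archives are installed; fully static glibc binaries have known caveats around NSS and dlopen, which is one reason musl is often used instead):

    /* hello_static.c -- build: gcc -static -O2 -o hello_static hello_static.c */
    /* "ldd ./hello_static" should then report "not a dynamic executable":     */
    /* one self-contained binary with no runtime library dependencies.         */
    #include <stdio.h>

    int main(void)
    {
        puts("hello from a statically linked binary");
        return 0;
    }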
|
|