| ▲ | vessenes 7 days ago |
| This will always be an issue for the Node community - it’s endemic to the JavaScript shipping/speed culture and the package management philosophy. Go is much, much better on these terms, although not perfect. I’d venture a guess that Perl 5 is outstanding here, although it’s been a few years since I tried to run an old Perl project. CPAN was dog slow, but other than that, everything worked on the first try. I’d also bet Tcl is nearly perfect on the ‘try this 10-year-old repo’ test. |
|
| ▲ | mst 7 days ago | parent | next [-] |
| CPAN.pm is not the fastest, no, though it generally spends most of its time running each distribution's tests before installing it. While that gives installs a certain "start it and go for lunch" quality, it's an excellent canary for whether something's changed underneath you *before* you end up having to go spelunking through your own code. App::cpanminus (cpanm) is noticeably lighter; App::cpm (cpm) does parallel builds and skips tests by default.
An approach I've become quite fond of is using cpm to install fast into the local::lib I'm actually going to use, then creating a scratch setup in /home/tmp or similar and running cpanm in that under tmux/abduco/etc. to do a second install that *does* run the tests, so I have those results to refer to later but don't have to wait for them right now.
(If I ever write a cpan client of my own, it's going to have a mode where it does a cpm-like install process and then backgrounds a test-running process that logs somewhere well known, so this approach becomes a single command - but I keep getting distracted by other projects ;) |
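A minimal sketch of that two-pass workflow (the paths, session name, and project layout are placeholders; both passes assume a cpanfile in the project directory):

    # Pass 1: fast parallel install, tests skipped (cpm's default),
    # into the ./local local::lib we'll actually run against.
    cpm install -L local

    # Pass 2: background scratch install that *does* run the tests,
    # so the results are on disk to consult later.
    tmux new-session -d -s cpan-tests \
      'cpanm -l /home/tmp/scratch --installdeps /path/to/project'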
|
| ▲ | regularfry 7 days ago | parent | prev | next [-] |
| Clojure too, by all accounts. I'd say Common Lisp, but it's in the weird position of the code itself being rampantly portable across time while the ecosystem around it is astonishingly immature. |
| |
| ▲ | fredrikholm 7 days ago | parent [-] | | Things have improved a lot with the introduction of Quicklisp, but compared to other ecosystems I'd have to agree. CL is still one of the nicest languages there is, and the only one that skirts the line between being dynamic and interpreted on the one hand and typed and compiled on the other. It is showing its age, though, particularly around the edges, like you're saying. | | |
| ▲ | regularfry 7 days ago | parent [-] | | Quicklisp is a start, and I like that it's a Debian-style distribution because it can at least move towards making the assertion that everything in the distribution is mutually compatible. The problem is that the underlying ecosystem has no culture of saying "version X of library A is only compatible with versions Y+ of library B". You can put that info in the asdf system definition but I don't see many examples of it. The other problem is rate of updates, and that's a symptom of it basically being on one person's shoulders to keep it ticking over. I can't readily think of another major language ecosystem with that characteristic. It just seems really fragmented. | | |
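For reference, the asdf syntax regularfry mentions does exist, even if it's rarely used in the wild; a hypothetical system definition pinning a minimum version of one dependency might look like:

    ;; Hypothetical my-app.asd; the (:version ...) clause is asdf's
    ;; way of requiring cl-ppcre 2.1.1 or newer.
    (asdf:defsystem "my-app"
      :version "0.1.0"
      :depends-on ("alexandria"
                   (:version "cl-ppcre" "2.1.1")))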
| ▲ | liontwist 7 days ago | parent [-] | | Quicklisp is merely a registry for getting libraries, not shipping code. Download what you want and use asdf. | | |
| ▲ | regularfry 7 days ago | parent [-] | | Yes, that's what makes it immature. There doesn't seem to be anything in the ecosystem to handle version resolution. Qlot is a start towards that in that you can at least specify alternatives to get yourself out of a hole, but you still ended up in the hole in the first place. | | |
| ▲ | liontwist 7 days ago | parent [-] | | I think it’s a good thing. I don’t want npm for Common Lisp. | | |
| ▲ | regularfry 7 days ago | parent [-] | | Which bit is it that you object to?
Edit to add: this wasn't intended as a gotcha question, so apologies if it came across as one. I have issues with a lot of details about how npm works and the ecosystem it supports. I think it's possible to avoid them, and aim for something more like a bundler or a cargo, but again there are issues there (certainly in the former's case; I have less experience of the latter). Getting to a good answer that works for CL means understanding both the problem space and the known solutions elsewhere. It might be that "a better quicklisp" is enough? | | |
| ▲ | liontwist 6 days ago | parent [-] | | Sure, I didn’t take that negatively. Let me start with facts:
- npm actually downloads multiple copies of each library when needed to satisfy conflicting version requirements.
- This is only possible due to runtime features of JavaScript. In most languages, C for example, this causes symbol collisions.
- I think this is a problem in Common Lisp too, due to packages being global. Maybe there is a fancy way to rebuild packages at load time.
- This is why the Debian-style release makes sense: either everything loads together, or it doesn't.
Opinions:
- I want to know all my dependencies. I treat them as my own source, so a downloaded tarball is close to my mental model.
- For C projects I usually have a makefile with curl commands tied to exact URLs; if I want to update, I manually change the URL (see the sketch after this comment).
- Quicklisp already has a nice way to make an isolated folder containing just your code and its dependencies, to be loaded with asdf. It gets out of the way once you have downloaded your libraries. | | |
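A minimal sketch of that makefile pattern, with zlib standing in for whatever library is being vendored (the URL and version are illustrative):

    # Each dependency is pinned to an exact release URL; updating
    # means editing the URL by hand.
    deps: vendor/zlib-1.3.1.tar.gz

    vendor/zlib-1.3.1.tar.gz:
    	mkdir -p vendor
    	curl -L -o $@ https://zlib.net/zlib-1.3.1.tar.gz
    	tar -xzf $@ -C vendor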
| ▲ | regularfry 6 days ago | parent [-] | | Yes, that "feature" of npm isn't something you'll find elsewhere. It's not inherent to the problem of version resolution. Just about anywhere else you'll just get an error message that version resolution isn't possible, if there's no available combination that satisfies all the requirements. That one design attribute of npm, probably more than any other, feels like they did it because they could, not because it was a particularly good idea. |
|
| ▲ | skybrian 7 days ago | parent | prev | next [-] |
| Go’s minimum version selection is the way, and I don’t understand why other ecosystems haven’t adopted it. You’re able to compile an old project with all the library dependencies it had at the time it was released. It might have security issues, but at least you start with a version that works, and then you can go about upgrading it. It also helps that if some library dependency generated Go code using a tool, the generated Go source is checked in and you don’t have to run their tool. |
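For anyone unfamiliar: a go.mod records the minimum version of each dependency, and the build uses, per module, the highest minimum stated anywhere in the module graph - never a newer release just because one exists. A hypothetical old project's go.mod:

    // Hypothetical go.mod; module path and versions are illustrative.
    // `go build` resolves to exactly these versions unless another
    // module in the graph declares a higher minimum.
    module example.com/oldproject

    go 1.21

    require (
        github.com/pkg/errors v0.9.1
        golang.org/x/text v0.14.0
    )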
| |
| ▲ | Macha 7 days ago | parent | next [-] | | Getting the exact dependencies it had at release is a solved problem in Node and most other languages with lock files too. There's just no guarantee that those old versions work on the new system, or with the outside world as it exists at the time of installation - which can be as true for Go as for any other language. If the XYZ service's API client still gets you version 1.2.37, that's not actually any help if 1.2.37 calls endpoints that the XYZ service has removed. The same goes for a cgo package that binds to a version of OpenSSL that is no longer installed on your system, etc. | | |
| ▲ | vessenes 7 days ago | parent | next [-] | | This is why I say it's a cultural problem, not a technical problem. In Go land, changing API calls in minor versions is pretty much a sin. At the least, it's something you'd do... carefully, probably with apologies. In Node, it's extremely routine to re-pin to newer modules without worry. | |
| ▲ | tgv 7 days ago | parent | prev | next [-] | | Some time ago, I wanted to update Arch on a server running some Python project I had inherited. Long story short: it relied on something that relied on something that relied on something, etc., and then it turned out certain components needed for the upgrade process had been taken offline. Now the system can’t be changed unless there’s significant work done on the code, and that’s too expensive. It runs on request in a container now, while it lasts. | | |
| ▲ | baq 7 days ago | parent [-] | | Back in the day you were supposed to check your compiler into version control (not a lockfile - the whole distribution). I used to think that people emailing screenshots of corporate dashboards were idiots. I now think that's actually genius: a frozen-in-time view which you can't regenerate, but which will be available until the end of time if you need it. (Hello, Exchange admins!) |
| |
| ▲ | zokier 7 days ago | parent | prev [-] | | My hot take is that lock files and nested dependencies induce fragility. If packages were required to work with a wide range of dependency versions, that would force the ecosystem to build packages in a more robust way. Basically, I think the dependency trees built by modern package managers over-constrain the environment, making it difficult to work with in all sorts of ways. On the other hand, the opposite extreme produces stuff like autoconf, which is not that great either. Trying to have your code be compatible with absolutely everything is probably not good - although, arguably, platforms these days are generally much more stable and consistent than they were in the heyday of autoconf. |
| |
| ▲ | vessenes 7 days ago | parent | prev [-] | | I truly think it's just because the engineers who started working with Node were ... young. They wanted to iterate rapidly, and so crufty old habits like this weren't what they wanted or felt they needed. What's been interesting is watching these devs age 10 years and still mostly decide it's better to start new frameworks than to treat legacy code as an asset. That feels to me like a generational shift. And I'm not shaking my cane and saying they're wrong -- a modern LLM can parse an API document and get you 95% of the way to your goal most of the time pretty quickly -- but I propose it's truly a cultural difference, and I suspect it won't wash out as people age, just create different benefits and costs. |
|
|
| ▲ | zbentley 7 days ago | parent | prev | next [-] |
| I've had a fair amount of trouble with Perl/CPAN simply because of the sheer number of XS (compiled C extension) modules in the ecosystem. For even a medium-sized Perl project that e.g. talks to databases or whatnot, building it after a long time requires you to spend tedious hours getting the right development headers/libraries for the compiled components, fussing with compiler flags, dealing with C ABI symbols that were deprecated in the interim, etc. To be fair, Python and Ruby also have this problem (for newer Pythons, popular extension modules at recent versions are more likely to Just Work thanks to wheels, but if you're building old code for the first time in 3+ years, all the old problems come back with a vengeance). It's more of a "scripting language that got popular enough that ordinary projects have a deep tree of transitive dependencies, many of which are compiled on-site" issue than a Perl-specific problem. |
|
| ▲ | cxr 7 days ago | parent | prev | next [-] |
| You're talking about what's wrong with the NPM ecosystem, not JS. Previously: You wouldn't conflate Windows development with "C" (and completely discount UNIX along the way) just because of Win32. <https://news.ycombinator.com/item?id=41899671> |
|
| ▲ | tpm 7 days ago | parent | prev [-] |
| Yeah, I'd expect 20-year-old Perl 5 stuff to work without issues. A few weeks ago I was experimenting with a sound-generation DSL/runtime called Csound, and even most 30-year-old sources were working, as long as they didn't use some obsolete UI. |
| |
| ▲ | transcriptase 7 days ago | parent [-] | | It’s the same with R. Often the only thing preventing an ancient package from running under a new version of R (and vice versa) is the fact that the package author simply set the minimum version to whatever they happened to be using at the time. |
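For illustration, a hypothetical DESCRIPTION file showing the pattern - minimums that merely record what the author had installed at the time rather than a genuine requirement:

    Package: examplepkg
    Version: 1.0.0
    Depends: R (>= 3.2.0)
    Imports: dplyr (>= 0.4.3)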
|