Defletter 3 days ago

Okay, I'll bite: your proposed alternative to being able to specify exact versions of dependencies, regardless of operating system or distro, that I can statically include in a single binary (everything project-local, guaranteed) is... what? Is it just "Don't"?

palata 3 days ago

I'm not sure what you mean.

What I am saying is that using a dependency is formalised by build systems. Be it npm, cargo, gradle, meson, cmake, you name it.

In cargo, you add a line to a TOML file that says "please fetch this dependency, install it somewhere you understand, and then use it from this somewhere". What is convenient here is that you as a user don't need to know about those steps (how to fetch, how to install, etc). You can use Rust without Cargo and do everything manually if you need to, it's just that cargo comes with the "package manager" part included.

In C/C++, the build systems don't come with the package manager included. That does not mean there are no package managers. On the contrary, there are tons of them, and the user can choose the one they want: the system package manager, a third-party package manager like conan or vcpkg, or doing it manually with a shell/Python script. And I do mean the user, not the developer. And because the user may choose the package manager they want, the developer must not interfere, otherwise it becomes a pain. Vendoring dependencies into your project with git submodules is a way to interfere. As a user, I absolutely hate those projects that put in extra work to make it hard for me to handle dependencies the way I need.

How do we do that with CMake? By using find_package and/or pkg-config. In your CMakeLists.txt, you should just say `find_package(OpenSSL REQUIRED)` (or whatever it is) and let CMake find it the standard way. If `find_package` doesn't work, you can write a find module (that e.g. uses pkg-config). A valid shortcut IMO is to use pkg-config directly in CMakeLists for very small projects, but find modules are cleaner and actually reusable. CMake will search in a bunch of locations on your system. So if you want to use the system OpenSSL, you're done here, it just works.
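A minimal sketch of what that looks like in practice (the `myapp` target and `main.c` are illustrative; `OpenSSL::SSL` and `OpenSSL::Crypto` are the imported targets that CMake's bundled FindOpenSSL module defines):

```cmake
cmake_minimum_required(VERSION 3.16)
project(myapp C)

# Ask CMake to locate OpenSSL the standard way: it searches the
# system locations, plus anything listed in CMAKE_PREFIX_PATH.
find_package(OpenSSL REQUIRED)

add_executable(myapp main.c)

# Link against the imported targets; no hardcoded paths anywhere,
# so the user decides which OpenSSL (or LibreSSL) gets found.
target_link_libraries(myapp PRIVATE OpenSSL::SSL OpenSSL::Crypto)
```

Nothing in this file says where OpenSSL lives or how it was installed; that decision stays with the user.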

If you want to use a library that is not on the system, you still do `find_package(YourLibrary)`, but by default it won't find it (since it's not on the system). In that case, as a user, you configure the CMake project with `CMAKE_PREFIX_PATH`, saying "before you look on the system, please look into these paths I give you". So `cmake -DCMAKE_PREFIX_PATH=/path/where/you/installed/dependencies -Bbuild -S.`. Not only will this just work, it also means that your users can choose the package manager they want (again: system, third-party like conan/vcpkg, or manual)! It also means that your users can choose to use LibreSSL or BoringSSL instead of OpenSSL, because your CMakeLists does not hardcode any of that! Your CMakeLists just says "I depend on these libraries, and I need to find them in the paths that I use for the search".
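Concretely, the user-side flow might look like this sketch (the zlib version and the `~/deps` prefix are placeholders; substitute whatever library and install location you actually use):

```shell
# Build and install a dependency into a local prefix, by whatever
# means you prefer (here: plain manual build of zlib from source).
cd zlib-1.3.1
./configure --prefix="$HOME/deps"
make
make install

# Then point CMake at that prefix when configuring your project;
# find_package() will look there before the system locations.
cd ~/src/myproject
cmake -DCMAKE_PREFIX_PATH="$HOME/deps" -Bbuild -S.
cmake --build build
```

Conan and vcpkg fit the same mold: they install into a prefix (or generate a toolchain file), and the project's CMakeLists stays untouched.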

Whatever you do that makes CMake behave like a package manager (and I include CMake features like the FetchContent stuff) is IMO a mistake, because it won't work with dependencies that don't use CMake, and it will screw (some of) your users eventually. I talk about CMake, but the same applies for other build systems in the C/C++ world.

People then tend to say "yeah I am smart, but my users are stupid and won't know how to install dependencies locally and point CMAKE_PREFIX_PATH to them". To which I answer that you can offer instructions to use a third-party package manager like conan or vcpkg, or even write helper scripts that fetch, build and install the dependencies. Just do not do that inside the CMakeLists, because it will most certainly make it painful for your users who know what they are doing.
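Such a helper script lives entirely outside the CMakeLists, so users who know what they are doing can simply ignore it. A hedged sketch (the URL, library name and version are placeholders):

```shell
#!/bin/sh
# fetch-deps.sh -- optional convenience script: fetch, build and
# install this project's dependencies into a local prefix.
# Users who manage dependencies themselves can skip this entirely
# and just set CMAKE_PREFIX_PATH to their own install location.
set -eu

PREFIX="$(pwd)/deps-install"
mkdir -p "$PREFIX"

# Example dependency; repeat this pattern for each one you need.
curl -LO https://example.org/somelib-1.2.3.tar.gz
tar xzf somelib-1.2.3.tar.gz
cmake -S somelib-1.2.3 -B somelib-build \
      -DCMAKE_INSTALL_PREFIX="$PREFIX"
cmake --build somelib-build --target install

echo "Done. Configure with:"
echo "  cmake -DCMAKE_PREFIX_PATH=$PREFIX -Bbuild -S."
```

The point is that the script is an offer, not a requirement: the build itself only ever sees whatever prefix the user hands it.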

Is it simpler than what cargo or npm do? No, definitely not. Is it more flexible? Totally. But it is the way it is, and it fucking works. And whoever calls themselves a C/C++ developer and cannot figure out how to use the system package manager, or conan/vcpkg, and set CMAKE_PREFIX_PATH needs to learn it. I won't say it's incompetence, but it's like being a C++ developer and not understanding how to use a template. It's part of the tools you must learn to use.

People will spend half a day debugging a stupid mistake in their code, but somehow can't grasp that dealing with dependencies is also part of the job. In C/C++, it's what I explained above. With npm, properly dealing with dependencies means checking the transitive dependencies and being aware of what is being pulled. The only difference is that C/C++ makes it hard to ignore them and lose control over your dependencies, whereas npm calls that a feature and people love it for that.

I don't deny that CMake is imperfect: the syntax is generally weird, and writing find modules is annoying. But that is not an excuse to make a mess at every single step of the process. And people who complain about CMake usually write horrible CMakeLists and could benefit from learning how to do it properly. I don't love CMake; I just don't feel the need to complain about it everywhere I can, because I can make it work, and it's not that painful.

Defletter 2 days ago

While I do appreciate you taking the time to write that, I am somewhat at a loss. How does this justify the antipathy towards notions of a first-party build system and package manager? That's how we got into this argument with each other: I was calling out C/C++ cultists who cling to the ugly patchwork of hacky tooling that is C/C++'s so-called build systems and decry any notion of a first-party build system (or even a package manager to boot) as being destined to become just like npm.

C/C++ developers clearly want a build system and package manager, hence all this fragmentation, but I can't for the life of me understand why that fragmentation is preferable. For all the concern about supply-chain attacks on npm, why is it preferable that people trust random third-party package managers and their random third-party repackages of libraries (eg: SQLite on conan and vcpkg)? And why is global installation preferable? Have we learnt nothing? There's a reason why Python has venv now, why Maven and Gradle have wrappers, etc. Projects being able to build themselves to a specification, without requiring the host machine to reconfigure itself to suit the needs of this one project, is a bonus, not a drawback. Devcontainers should not need to be a thing.

If anything, this just reads like Sunk Cost Fallacy: that "it just works" therefore we needn't be too critical, and anyone who is or who calls for change just needs to git gud. It reminds me of the never-ending war over memory safety: use third-party tools if you must but otherwise just git gud. It's this kind of mindset that has people believing that C/C++'s so-called build systems are just adhering to "there should be some artificial friction when using dependencies to discourage over-use of dependencies", instead of being a Jenga tower of random tools with nothing but gravity holding it all together.

If it were up to me, C/C++ would get a more fleshed-out version of Zig's build system and package manager, ie, something unified, simple, with no central repository, project-local, exact, and explicit. You want SQLite? Just refer to the SQLite git repository at a specific commit and the build system will sort it out for you. Granted, it doesn't have an official build.zig so you'll need to write your own, or trust a premade one... but that would also be true if you installed SQLite through conan or vcpkg.

palata 2 days ago

> How does this justify the antipathy towards notions of a first-party build system and package manager?

I don't feel particularly antipathetic towards the notion of a first-party build system and package manager. I find it undeniably better to have a first-party build system instead of the fragmentation that exists in C/C++. On the other hand, I don't feel like asking a 20-year-old project to leave autotools just because I asked for it. Or forcing people to install Python because I think Meson is cool.

As for the package manager, one issue is security: is it (even partly) curated or not? I could imagine npm offering a curated repo and a non-curated repo. But there is also a cultural thing there: it is considered normal to have zero control over the dependencies (by this I mean that if the developer has not heard of the dependencies they are pulling, then they are not under control). Admittedly it is not a tooling problem, it's a culture problem. Though the tooling allows this culture to be the norm.

When I add a C/C++ dependency to my project, I do my shopping: I go check the projects, I check how mature they are, I look into the codebase, I check who has control over it. Sometimes I will depend on the project, sometimes I will choose to fork it in order to have more control. And of course, if I can get it from the curated list offered by my distro, that's even better.

> C/C++ developers clearly want a build system and package manager, hence all this fragmentation

One thing is legacy: it did not exist before, many tools were created, and now they exist. The fact that the ecosystem had the flexibility to test different things (which surely influenced the modern languages) is great. In a way, having a first-party tool makes it harder to get that. And then there are examples like Swift, which slowly converged towards SwiftPM. But at the time CocoaPods and Carthage were invented, SwiftPM was not a thing.

Also, devs want a build system and package manager, but they don't necessarily all want the same one :-). I don't use third-party package managers, for instance; instead I build my dependencies manually, which I find gives me more control, especially for cross-compiling. Sometimes I have specific requirements, e.g. when building a Linux distribution (think Yocto or buildroot). And I don't usually want to depend on Python just for the sake of it, and Conan is a Python tool.

> why is it preferable that people trust random third-party package managers and their random third-party repackages of libraries (eg: SQLite on conan and vcpkg)?

It's not. Trusting a third-party package manager is actually exactly the same as trusting npm. It's more convenient, but less secure. However it's better when you can rely on a curated repository (like what Linux distributions generally provide). Not everything can be curated, but there is a core. Think OpenSSL for instance.

> And why is global installation preferable?

For those dependencies that can be curated, there is a question of security. If all the programs on your system link the same system OpenSSL, then it's super easy to update that OpenSSL when there is a security issue. And in situations where what you ship is a Linux system, there is no point in not doing it. So there are situations where it is preferable. If everything is statically linked and you have a critical fix for a common library, you need to rebuild everything.

> If it were up to me

Sure, if we were to rebuild everything from scratch... well, we wouldn't do it in C/C++ in the first place, I'm pretty sure. But my Linux distribution exists, has a lot of merits, and I don't find it very nice when people try to enforce their preferences. I am fine if people want to use Flatpak, cargo, pip, nix, their system package manager, something else, or a mix of all that. But I like being able to install packages on my Gentoo system the way I like, potentially modifying them with a user patch. I like being able to choose whether I link statically or dynamically (on my Linux system, I like to link at least some libraries like OpenSSL dynamically; if I build an Android APK, I like to statically link the dependencies).

And I feel like I am not forcing anyone into doing what I like to do. I actually think that most people should not use Gentoo. I don't prevent anyone from using Flatpak or pulling half the Internet with docker containers for everything. But if they come telling me that my way is crap, I will defend it :-).

> I am somewhat at a loss.

I guess I was not trying to say "C/C++ is great, there is nothing to change". I just think it's not all crap, and I see where it all comes from and why we can't just throw everything away. There are many things to criticise, but many times I feel like criticisms are uninformed and just relying on the fact that everybody does that. Everybody spits on CMake, so it's easy to do it as well. But more often than not, if I start talking to someone who said that they cannot imagine how someone could design something as bad as CMake, they themselves write terrible CMakeLists. Those who can actually use CMake are generally a lot more nuanced.

Defletter 6 hours ago

Even though I understand why you prefer that, I feel like you're painting too rosy a picture. To quote Tom Delalande: "There are some projects where if it was 10% harder to write the code, the project would fail." I believe this deeply, and I believe it is also true of the build system: your build config should not rival your source code in length. That's hyperbole in most cases, sure, and may well indicate badly written build configs, but writing build configs should not be a skill issue. I am willing to bet that Rust has risen so much in popularity not just because of its memory safety, but also because of its build system. I don't like CMake, but I also don't envy its position.