| |
| ▲ | menaerus 5 days ago | parent | next [-] | | For most people this is a feature, not a bug as you suggest. It may come across as a PITA, and for many people it will, but as far as I am concerned, having also experienced the pain of package managers in C++, this is the right way. In the end it's always about the trade-offs. And all the (large) codebases that used conan, bazel or vcpkg ran into an order of magnitude more issues to handle than they would have had with plain CMake. Package managers are for convenience, but not all projects can afford the trouble this convenience brings with it. | | |
| ▲ | soanvig 4 days ago | parent | next [-] | | Coming from a different background (TypeScript) I agree, in the sense that there is a line where apparent convenience becomes a problem. The JS ecosystem is known for its build-tool hype. Long term, all of these tools become a problem by trying to be ever more convenient, leading to more and more abstractions and hidden behaviors, which turns into a mess that is impossible to debug or fix once the user diverges from the author's happy path. Thus I promote using only the necessities, and gluing them together yourself. Even if something doesn't work, at least it can be tracked down and solved. | |
| ▲ | palata 4 days ago | parent | prev [-] | | > For most people this is a feature not a bug as you suggest. Exactly: it makes many things nicer than the language package managers do, e.g. when maintaining a Linux distribution. But people generally don't know how one maintains a Linux distribution, so they can't really see the use-case, I guess. |
| |
| ▲ | palata 4 days ago | parent | prev | next [-] | | > require learning and teaching magic incantations at the command line That's exactly my point: if you think that calling `cmake --build build` is "magic", then maybe you don't have the right profile to use C++ in the first place, because you will have to learn some harder concepts there (like... pointers). To be honest, I find it hard to understand how a software developer can write code and still consider that command line instructions are "magic incantations". To me it's like saying that calling a function like `println("Some text, {}, {}", some_parameter, some_other_parameter)` is a "magic incantation". Calling a function with parameters counts as "the basics" to me. | |
| ▲ | bluGill 5 days ago | parent | prev | next [-] | | That idea assumes packages and builds are a simple problem. Large projects need things like more than one language, and so they end up fighting the language. | | |
| ▲ | duped 5 days ago | parent [-] | | Every modern language seems to have an answer to this problem that C and C++ refuse to touch because it's out of scope for their respective committees and standards orgs. | |
| ▲ | nothrabannosir 5 days ago | parent | next [-] | | On the front page right now: Shai-Hulud malware attack: Tinycolor and over 40 NPM packages compromised (stepsecurity.io) 935 points by jamesberthoty 16 hours ago | flag | hide | 730 comments Maybe obstreperous dependency management ends up being the winning play in 2025 :) | | |
| ▲ | asa400 5 days ago | parent | next [-] | | Just think of how many _more_ vulns C and C++ could be responsible for if they had modern package managers! :) | |
| ▲ | Defletter 5 days ago | parent | prev | next [-] | | Seems like a false dichotomy | |
| ▲ | duped 4 days ago | parent | prev [-] | | Completely unrelated. |
| |
| ▲ | pclmulqdq 5 days ago | parent | prev | next [-] | | C++ has a plethora of available build and package management systems. They just aren't bundled with the compiler. IMO that is a good thing, because it keeps the compiler writers honest. | | |
| ▲ | jimbob45 5 days ago | parent [-] | | You say that as if Cargo, MSBuild, and pip aren’t massively loved by their communities. | | |
| ▲ | jcelerier 5 days ago | parent | next [-] | | Coming from C++, pip and Python dependency management is the bane of my life. How do you make a Python application that leverages PyTorch, ships as a single .exe, and can target whatever GPU the user has, without downloads? | | |
| ▲ | jononor 4 days ago | parent | next [-] | | Funnily enough, a lot of the challenges in this particular case are related to PyTorch and CUDA being native code (mostly C++), combined, of course, with the fact that pip is not really adequate as a native/C++ package manager. Perhaps if C++ had a decent standardized package manager, the Python package system could reuse it? ;p | |
| ▲ | gpderetta 4 days ago | parent | prev [-] | | just wait for next week and python will get a better package manager! |
| |
| ▲ | pclmulqdq 5 days ago | parent | prev | next [-] | | "Massively loved" and "good decision" are orthogonal axes. See the current npm drama. People love wantonly importing dependencies the way they love drinking. Both feel great but neither is good for you. | | |
| ▲ | asa400 5 days ago | parent | next [-] | | Not that npm-style package management is the best we can do or anything. I would be more sympathetic to this argument if C or C++ had a clearly better security story than JS, Python, etc. (pick your poison), but they're also disasters in this area. What happens in practice is people end up writing their own insecure code instead of using someone else's insecure code. Of course, we can debate the tradeoffs of one or the other! | | |
| ▲ | bluGill 4 days ago | parent [-] | | This isn't only about security. This is about interoperability: in the real world we mix (and should mix!) C, C++, Rust, Python.... In the real world lawyers audit every dependency to ensure they can legally use it. In the real world we are responsible for our dependencies and so need to audit the code. |
| |
| ▲ | Defletter 5 days ago | parent | prev [-] | | I'm getting the impression that C/C++ cultists love it whenever there's an npm exploit because then they can gleefully point at it and pretend that any first-party package manager for C/C++ would inevitably result in the same, never mind the other languages that do not have this issue, or have it to a far, far lesser extent. Do these cultists just not use dependencies? Are they just [probably inexpertly] reinventing every wheel? Or do they use system packages like that's any better *cough* AUR exploits *cough*. While dependency hell on nodejs (and even Rust if we're honest) is certainly a concern, it's npm's permissiveness and lack of auditing that's the real problem. That's why Debian is so praised. | | |
| ▲ | pclmulqdq 4 days ago | parent | next [-] | | What makes me a C++ "cultist"? I like the language, but I don't think it's a cult. And yes, they do implement their own wheels all the time (usually expertly), because libraries are reserved for functionality that really needs it: writing left-pad is really easy. They use third-party libraries all the time, too. They just generally pay attention to the source of that library. Google and Facebook also publish a lot of C++ libraries under one umbrella (abseil and folly respectively), and people often use one of them. | |
| ▲ | bluGill 4 days ago | parent | prev | next [-] | | STOP SAYING CULTIST! The word has a very strong meaning and does not apply to anyone working with C or C++. I take offense at being called a cultist just because I say C++ is not nearly as bad as the haters keep claiming it is - as well I should. | |
| ▲ | palata 4 days ago | parent | prev [-] | | > Or do they use system packages like that's any better cough AUR exploits cough. AUR stands for "Arch User Repository". It's not the official system repository. > I'm getting the impression that C/C++ cultists love it whenever there's an npm exploit I am not a C/C++ cultist at all, and I actually don't like C++ (the language) so much (I've worked with it for years). I, for one, do not love it when there is an exploit in a language package manager. My problem with language package managers is that people love them precisely because they don't want to learn how to deal with dependencies. Which is actually the problem: if I pull a random Rust library, it will itself pull many transitive dependencies. I recently compared two implementations of the same standard (C++ vs Rust): in C++ it had 8 dependencies (I can audit that myself). In Rust... it had 260 of them. 260! I won't even read through all those names. "It's too hard to add a dependency in C++" is, in my opinion, missing the point. In C++, you have to actually deal with the dependency. You know it exists, you have seen it at least once in your life. The fact that you can't easily pull 260 dependencies you have never heard about is a feature, not a bug. I would be totally fine with great tooling like cargo, if it looked like the problem of random third-party dependencies was under control. But it is not. Not remotely. > Do these cultists just not use dependencies? I choose my dependencies carefully. If I need a couple of functions from an open source dependency I don't know, I can often just pull those functions and maintain them myself (instead of pulling the dependency and its 10 dependencies). > Are they just [probably inexpertly] reinventing every wheel? I find it ironic that when I explain that my problem is that I want to be able to audit (and maintain, if necessary) my dependencies, the answer that comes suggests that I am incompetent and "inexpertly" doing my job. Would it make me more of an expert if I was pulling, running and distributing random code from the Internet without having the smallest clue about who wrote it? Do I need to complain about how hard CMake is and compare a command line to a "magic incantation" to be considered an expert? | | |
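As an aside, the kind of transitive-dependency count palata mentions is easy to reproduce on the Rust side. A sketch, assuming a checkout with a Cargo.toml (`cargo tree` ships with cargo; the count is approximate, since feature-split duplicates may appear):

# Print the full dependency graph, one crate per line, then count unique crates:
$ cargo tree --prefix none --no-dedupe | sort -u | wc -l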
| ▲ | Defletter 4 days ago | parent [-] | | > AUR stands for "Arch User Repository". It's not the official system repository. Okay... and? The point being made was that the issue of package managers remains: do you really think users are auditing all those "lib<slam-head-on-keyboard>" dependencies that they're forced to install? Whether they install those dependencies from the official repository or from homebrew, or nix, or AUR, or whatever, is immaterial: the developer has washed their hands of this, instead leaving it to the user, who in all likelihood knows significantly less than the developer, to make an informed decision, so they YOLO it. Third-party repositories would not exist if they had no utility. But this is why Debian is so revered: they understand this dynamic and so maintain repositories that can be trusted. Whereas the solution C/C++ cultists seem to implicitly prefer is having no repositories because dependencies are, at best, a slippery slope. > "It's too hard to add a dependency in C++" It's not hard to add a dependency. I actually prefer the dependencies-as-git-submodules approach to package managers: it's explicit and you know what you're getting and from where. But using those dependencies is a different story altogether. Don't you just love it when one or more of your dependencies has a completely different build system to the others? So now you have to start building dependencies independently, whose artefacts are in different places, etc., etc. This shouldn't be a problem. > I, for one, do not love it when there is an exploit in a language package manager. Oh please, I believe that about as much as ambulance chasers saying they don't love medical emergencies. Otherwise, why are any and all comments begging for a first-party package manager immediately swamped with strawmen about npm as if anyone is actually asking for that, instead of, say, what Zig or Go has? It's because of the cultism, and every npm exploit further entrenches it. | |
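For context, the submodule approach described above usually looks something like this. A sketch with a hypothetical `libfoo` dependency that happens to also build with CMake, which is exactly the limitation complained about here:

$ git submodule add https://example.com/libfoo.git third_party/libfoo
$ git submodule update --init --recursive

# CMakeLists.txt: vendor the dependency straight into your own build
add_subdirectory(third_party/libfoo)
target_link_libraries(myapp PRIVATE foo)  # "foo" is whatever target libfoo defines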
| ▲ | pclmulqdq 4 days ago | parent | next [-] | | C++ usage has nothing to do with static/dynamic linking. One is a language and the other is a way of using libraries. Dynamic linking gives you small binaries with a lot of cross-compatibility, and static linking gives you big binaries with known function. Most production C++ out there follows the same pattern as Rust and Go and uses static linking (where do you think Rust and Go got that pattern from?). Python is a weird language that has tons of dynamic linking while also having a big package manager, which is why pip is hell to use and PyTorch is infamously hard to install. Dynamic linking shifts responsibility for the linked libraries over to the user and their OS, and if it's an Arch user using AUR they are likely very interested in assuming that risk for themselves. 99.9% of Linux users are using Debian or Ubuntu with apt for all these libs, and those maintainers do pay a lot of attention to libraries. | |
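To illustrate the static/dynamic distinction in practice (a sketch, assuming a hypothetical libfoo already installed on the system; `ldd` is the usual way to inspect dynamic linkage on Linux):

# Dynamic: small binary; libfoo.so is resolved at load time by the user's OS
$ cc -o app main.c -lfoo
$ ldd app    # lists the shared objects app will load

# Static: big self-contained binary with libfoo baked in at link time
$ cc -o app main.c /usr/lib/libfoo.a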
| ▲ | palata 4 days ago | parent | prev [-] | | > But this is why Debian is so revered: they understand this dynamic and so maintain repositories that can be trusted. So you do understand my point about AUR. AUR is like adding a third-party repo to your Debian configuration. So it's not a good example if you want to talk about official repositories. Debian is a good example (it's not the only distribution that has that concept), which proves my point and not yours: this is better than unchecked repositories in terms of security. > Whereas the solution C/C++ cultists seem to implicitly prefer is having no repositories because dependencies are, at best, a slippery slope. Nobody says that ever. Either you make up your cult just to win an argument, or you don't understand what C/C++ people say. The whole goddamn point is to have a trusted system repository, and if you need to pull something that is not there, then you do it properly. Which is better than pulling random stuff from random repositories, again. > I actually prefer the dependencies-as-git-submodules approach Oh right. So you do it wrong. Good to know, and it answers your next complaint: > Don't you just love it when one or more of your dependencies has a completely different build system to the others I don't give a damn because I handle dependencies properly (not as git submodules). I don't have a single project where the dependencies all use the same build system. It's just not a problem at all, because I do it properly. What do I do then? Well exactly the same as what your system package manager does. > this shouldn't be a problem. I agree with you. Call it a footgun if you wish, you are the one pulling the trigger. It isn't a problem for me. > why are any and all comments begging for a first-party package manager immediately swamped with strawmen about npm Where did I do that? > It's because of the cultism, and every npm exploit further entrenches it. It's because npm is a good example of what happens when it goes out of control. Pip has the same problem, and Rust as well. But npm seems to be the worst, I guess because it's used by more people? | | |
| ▲ | Defletter 4 days ago | parent [-] | | Your defensiveness is completely hindering you and I cannot be bothered with that, so here are some much-needed clarifications: > I am not a C/C++ cultist at all, and I actually don't like C++ (the language) so much (I've worked with it for years). I, for one, do not love it when there is an exploit in a language package manager. If you do neither of those things then did it ever occur to you that this might not be about YOU? > I find it ironic that when I explain that my problem is that I want to be able to audit (and maintain, if necessary) my dependencies, the answer that comes suggests that I am incompetent and "inexpertly" doing my job. Yeah, hi, no you didn't explain that. You're probably mistaking me for someone else in some other conversation you had. The only comment of yours prior to mine in the thread is you saying "I can use pkg-config just fine." And again, you're thinking that this is about YOU, or even that I'm calling you incompetent. But okay, I'm sure your code never has bugs, never has memory issues, is never poorly designed or untested, that you can whip out an OpenGL alternative or whatever in no time and have it be just as stable and battle-tested, and to say otherwise must be calling you incompetent. That makes total sense. > AUR stands for "Arch User Repository". It's not the official system repository. > So it's not a good example if you want to talk about official repositories. I said system package, not official repository. I don't know why you keep insisting on countering an argument I did not make. Yes, system packages can be installed from unofficial repositories. I don't know how I could've made this clearer. -- Overall, getting bored of this, though the part where you harp on about doing dependencies properly compared to me and not elaborating one bit is very funny. Have a nice day. | | |
| ▲ | palata 4 days ago | parent [-] | | > Your defensiveness Start by not calling everybody disagreeing with you a cultist, next time. > I said system package, not official repository. I don't know why you keep insisting on countering an argument I did not make. Yes, system packages can be installed from unofficial repositories. I don't know how I could've made this clearer. It's not that it is unclear, it's just that it doesn't make sense. When we compare npm to a system package manager in this context, the thing we compare is whether or not it is curated. Agreed, I was maybe not using the right words (I should have said curated vs non-curated package managers), but it did not occur to me that it was unclear, because comparing npm to a system package manager makes no sense otherwise. It's all just installing binaries somewhere on disk. AUR is much like npm in that it is not curated. So if you find that it is a security problem: great! We agree! If you want to pull something from AUR, you should read its PKGBUILD first. And if it pulls tens of packages from AUR, you should think twice before you actually install it. Just like if someone tells you to do `curl https://some_website.com/some_script.sh | sudo sh`, no matter how convenient that is. Most Linux distributions have a curated repository, which is the default for the "system package manager". Obviously, if users add custom, non-curated repositories, it's a security problem. AUR is a bad example because it isn't different from npm in that regard. > though the part where you harp on about doing dependencies properly compared to me and not elaborating one bit is very funny Well I did elaborate at least one bit, but I doubt you are interested in more details than what I wrote: "What do I do then? Well exactly the same as what your system package manager does." I install the dependencies somewhere (just like the system package manager does), and I let my build system find them. It could be with CMake's `find_package`, it could be with pkg-config, whatever knows how to find packages. There is no need to install the dependencies in the place where the system package manager installs stuff: it can go anywhere you want. And you just tell CMake or pkg-config or Meson or whatever you use to look there, too. Using git submodules is just a bad idea for many reasons, including the fact that you need all of them to use the same build system (which you mentioned), or that a clean build usually implies rebuilding the dependencies (for nothing), or that it doesn't work with package managers (system or not). And usually, projects that use git submodules only support that, without offering a way to use the system package(s). | | |
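Concretely, the workflow palata describes looks roughly like this (a minimal sketch; `Foo` is a hypothetical dependency that provides a CMake package config, not something from the thread):

# CMakeLists.txt: declare the dependency, say nothing about where it lives
find_package(Foo REQUIRED)
target_link_libraries(myapp PRIVATE Foo::Foo)

# As the user: install Foo anywhere you like (system package manager,
# conan, vcpkg, or by hand), then point CMake at that prefix:
$ cmake -DCMAKE_PREFIX_PATH=$HOME/deps -Bbuild -S.
$ cmake --build build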
| ▲ | Defletter 4 days ago | parent [-] | | > Start by not calling everybody disagreeing with you a cultist, next time. You'd do very well as a culture war pundit. Clearly I wasn't describing a particular kind of person, no, clearly I'm just talking about everyone I disagree with /s | |
| ▲ | palata 3 days ago | parent [-] | | So, not interested at all in how to deal with dependencies without git submodules, I reckon? We can stop here indeed. | | |
| ▲ | Defletter 3 days ago | parent [-] | | You misunderstand, I am already well aware. My comment about your lack of elaboration was not due to any ignorance on my part, but rather to point out how you assumed that and refused to elaborate anyway. The idea that I may have my reasons for preferring dependencies-as-git-submodules or their equivalents (like Zig's package system) never crossed your mind. Can't say I'm surprised. Oh well. | | |
| ▲ | palata 3 days ago | parent [-] | | > The idea that I may have my reasons for preferring dependencies-as-git-submodules Well, git submodules are strictly inferior and you know it: you even complained about the fact that it is a pain when some dependencies use different build systems. You choose a solution that does not work, and then you blame the tools. | | |
| ▲ | Defletter 3 days ago | parent [-] | | Okay, I'll bite: your proposed alternative to being able to specify exact versions of dependencies, regardless of operating system or distro, that I can statically include into a single binary, with everything project-local, guaranteed, is... what? Is it just "Don't"? | |
| ▲ | palata 3 days ago | parent [-] | | I'm not sure what you mean. What I am saying is that using a dependency is formalised for build systems. Be it npm, cargo, gradle, meson, cmake, you name it. In cargo, you add a line to a toml file that says "please fetch this dependency, install it somewhere you understand, and then use it from this somewhere". What is convenient here is that you as a user don't need to know about those steps (how to fetch, how to install, etc). You can use Rust without Cargo and do everything manually if you need to, it's just that cargo comes with the "package manager" part included. In C/C++, the build systems don't come with the package manager included. It does not mean that there are no package managers. On the contrary, there are tons of them, and the user can choose the one they want to use. Be it the system package manager, a third-party package manager like conan or vcpkg, or doing it manually with a shell/python script. And I do mean the user, not the developer. And because the user may choose the package manager they want, the developer must not interfere, otherwise it becomes a pain. Nesting dependencies into your project with git submodules is a way to interfere. As a user, I absolutely hate those projects that actually did extra work to make it hard for me to handle dependencies the way I need. How do we do that with CMake? By using find_package and/or pkg-config. In your CMakeLists.txt, you should just say `find_package(OpenSSL REQUIRED)` (or whatever it is) and let CMake find it the standard way. If `find_package` doesn't work, you can write a find module (that e.g. uses pkg-config). A valid shortcut IMO is to use pkg-config directly in CMakeLists for very small projects, but find modules are cleaner and actually reusable. CMake will search in a bunch of locations on your system. So if you want to use the system OpenSSL, you're done here, it just works. If you want to use a library that is not on the system, you still do `find_package(YourLibrary)`, but by default it won't find it (since it's not on the system). In that case, as a user, you configure the CMake project with `CMAKE_PREFIX_PATH`, saying "before you look on the system, please look into these paths I give you". So `cmake -DCMAKE_PREFIX_PATH=/path/where/you/installed/dependencies -Bbuild -S.`. And this will not only just work, but it means that your users can choose the package manager they want (again: system, third-party like conan/vcpkg, or manual)! It also means that your users can choose to use LibreSSL or BoringSSL instead of OpenSSL, because your CMakeLists does not hardcode any of that! Your CMakeLists just says "I depend on those libraries, and I need to find them in the paths that I use for the search". Whatever you do that makes CMake behave like a package manager (and I include CMake features like the FetchContent stuff) is IMO a mistake, because it won't work with dependencies that don't use CMake, and it will screw (some of) your users eventually. I talk about CMake, but the same applies to other build systems in the C/C++ world. People then tend to say "yeah I am smart, but my users are stupid and won't know how to install dependencies locally and point CMAKE_PREFIX_PATH to them". To which I answer that you can offer instructions to use a third-party package manager like conan or vcpkg, or even write helper scripts that fetch, build and install the dependencies. 
Just do not do that inside the CMakeLists, because it will most certainly make it painful for your users who know what they are doing. Is it simpler than what cargo or npm do? No, definitely not. Is it more flexible? Totally. But it is the way it is, and it fucking works. And whoever calls themselves a C/C++ developer and cannot understand how to use the system package manager, or conan/vcpkg, and set CMAKE_PREFIX_PATH, needs to learn it. I won't say it's incompetence, but it's like being a C++ developer and not understanding how to use a template. It's part of the tools you must learn to use. People will spend half a day debugging a stupid mistake in their code, but somehow can't accept that dealing with a dependency is also part of the job. In C/C++, it's what I explained above. With npm, properly dealing with dependencies means checking the transitive dependencies and being aware of what is being pulled. The only difference is that C/C++ makes it hard to ignore it and lose control over your dependencies, whereas npm calls it a feature and people love it for that. I don't deny that CMake is not perfect, the syntax is generally weird, and writing find modules is annoying. But it is not an excuse to make a mess at every single step of the process. And people who complain about CMake usually write horrible CMakeLists and could benefit from learning how to do it properly. I don't love CMake; I just don't feel the need to complain about it everywhere, because I can make it work and it's not that painful. | | |
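A sketch of the find-module route mentioned above (FindFoo.cmake and the pkg-config name `foo` are hypothetical; `pkg_check_modules` comes with CMake's stock FindPkgConfig module, and a production-grade module would also call find_package_handle_standard_args):

# cmake/FindFoo.cmake: resolve a hypothetical libfoo through pkg-config
find_package(PkgConfig REQUIRED)
pkg_check_modules(Foo IMPORTED_TARGET foo)  # defines the PkgConfig::Foo target

# CMakeLists.txt: make the module visible, then use it like any package
list(APPEND CMAKE_MODULE_PATH "${CMAKE_SOURCE_DIR}/cmake")
find_package(Foo REQUIRED)
target_link_libraries(myapp PRIVATE PkgConfig::Foo)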
| ▲ | Defletter 2 days ago | parent [-] | | While I do appreciate you taking the time to write that, I am somewhat at a loss. How does this justify the antipathy towards notions of a first-party build system and package manager? That's how we got into this argument with each other: I was calling out C/C++ cultists who cling to the ugly patchwork of hacky tooling that is C/C++'s so-called build systems and decry any notion of a first-party build system (or even a package manager to boot) as being destined to become just like npm. C/C++ developers clearly want a build system and package manager, hence all this fragmentation, but I can't for the life of me understand why that fragmentation is preferable. For all the concern about supply-chain attacks on npm, why is it preferable that people trust random third-party package managers and their random third-party repackages of libraries (eg: SQLite on conan and vcpkg)? And why is global installation preferable? Have we learnt nothing? There's a reason why Python has venv now; why Maven and Gradle have wrappers; etc. Projects being able to build themselves to a specification without requiring the host machine to reconfigure itself to suit the needs of this one project is a bonus, not a drawback. Devcontainers should not need to be a thing. If anything, this just reads like Sunk Cost Fallacy: that "it just works", so we needn't be too critical, and anyone who is or who calls for change just needs to git gud. It reminds me of the never-ending war over memory safety: use third-party tools if you must but otherwise just git gud. It's this kind of mindset that has people believing that C/C++'s so-called build systems are just adhering to "there should be some artificial friction when using dependencies to discourage over-use of dependencies", instead of being a Jenga tower of random tools with nothing but gravity holding it all together. If it were up to me, C/C++ would get a more fleshed-out version of Zig's build system and package manager, i.e., something unified, simple, with no central repository, project-local, exact, and explicit. You want SQLite? Just refer to the SQLite git repository at a specific commit and the build system will sort it out for you. Granted, it doesn't have an official build.zig so you'll need to write your own, or trust a premade one... but that would also be true if you installed SQLite through conan or vcpkg. | |
| ▲ | palata 2 days ago | parent [-] | | > How does this justify the antipathy towards notions of a first-party build system and package manager? I don't feel particularly antipathetic towards notions of a first-party build system and package manager. I find it undeniably better to have a first-party build system instead of the fragmentation that exists in C/C++. On the other hand, I don't feel like asking a 20-year-old project to leave autotools just because I asked for it. Or to force people to install Python because I think Meson is cool. As for the package manager, one issue is security: is it (even partly) curated or not? I could imagine npm offering a curated repo, and a non-curated repo. But there is also a cultural thing there: it is considered normal to have zero control over the dependencies (by this I mean that if the developer has not heard of the dependencies they are pulling, then they are not under control). Admittedly it is not a tooling problem, it's a culture problem. Though the tooling allows this culture to be the norm. When I add a C/C++ dependency to my project, I do my shopping: I go check the projects, I check how mature they are, I look into the codebase, I check who has control over it. Sometimes I will depend on the project, sometimes I will choose to fork it in order to have more control. And of course, if I can get it from the curated list offered by my distro, that's even better. > C/C++ developers clearly want a build system and package manager, hence all this fragmentation One thing is legacy: it did not exist before, many tools were created, and now they exist. The fact that the ecosystem had the flexibility to test different things (which surely influenced the modern languages) is great. In a way, having a first-party tool makes it harder to get that. And then there are examples like Swift, where it slowly converged towards SwiftPM. But at the time CocoaPods and Carthage were invented, SwiftPM was not a thing. Also devs want a build system and package manager, but they don't necessarily all want the same one :-). I don't use third-party package managers, for instance; instead I build my dependencies manually. Which I find gives me more control, also for cross-compiling. Sometimes I have specific requirements, e.g. when building a Linux distribution (think e.g. Yocto or buildroot). And I don't usually want to depend on Python just for the sake of it, and Conan is a Python tool. > why is it preferable that people trust random third-party package managers and their random third-party repackages of libraries (eg: SQLite on conan and vcpkg)? It's not. Trusting a third-party package manager is actually exactly the same as trusting npm. It's more convenient, but less secure. However it's better when you can rely on a curated repository (like what Linux distributions generally provide). Not everything can be curated, but there is a core. Think OpenSSL for instance. > And why is global installation preferable? For those dependencies that can be curated, there is a question of security. If all the programs on your system link the same system OpenSSL, then it's super easy to update this OpenSSL when there is a security issue. And in situations where what you ship is a Linux system, there is no point in not doing it. So there are situations where it is preferable. If everything is statically linked and you have a critical fix for a common library, you need to rebuild everything. > If it were up to me Sure, if we were to rebuild everything from scratch... 
well we wouldn't do it in C/C++ in the first place, I'm pretty sure. But my Linux distribution exists, has a lot of merits, and I don't find it very nice when people try to enforce their preferences. I am fine if people want to use Flatpak, cargo, pip, nix, their system package manager, something else, or a mix of all that. But I like being able to install packages on my Gentoo system the way I like, potentially modifying them with a user patch. I like being able to choose if I want to link statically or dynamically (on my Linux, I like to link at least some libraries like OpenSSL dynamically; if I build an Android apk, I like to statically link the dependencies). And I feel like I am not forcing anyone into doing what I like to do. I actually think that most people should not use Gentoo. I don't prevent anyone from using Flatpak or pulling half the Internet with docker containers for everything. But if they come telling me that my way is crap, I will defend it :-). > I am somewhat at a loss. I guess I was not trying to say "C/C++ is great, there is nothing to change". I just think it's not all crap, and I see where it all comes from and why we can't just throw everything away. There are many things to criticise, but many times I feel like the criticisms are uninformed and just rely on the fact that everybody does it. Everybody spits on CMake, so it's easy to do it as well. But more often than not, if I start talking to someone who said that they cannot imagine how someone could design something as bad as CMake, they themselves write terrible CMakeLists. Those who can actually use CMake are generally a lot more nuanced. | | |
| ▲ | Defletter 6 hours ago | parent [-] | | Even though I understand why you prefer that, I feel like you're painting too rosy of an image. To quote Tom Delalande: "There are some projects where if it was 10% harder to write the code, the project would fail." I believe this deeply and that this is also true for the build system: your build config should not be rivalling your source code in terms of length. That's hyperbole in most cases, sure, and may well indicate badly written build configs, but writing build configs should not be a skill issue. I am willing to bet that Rust has risen so much in popularity not just because of its memory safety, but also because of its build system. I don't like CMake, but I also don't envy its position. |
| |
| ▲ | palata 5 days ago | parent | prev | next [-] | | They are massively loved because people don't want to learn how it works. But the result is that people massively don't understand how package management works, and miss the real cost of dependencies. | |
| ▲ | pjmlp 4 days ago | parent | prev [-] | | MSBuild also does C++. |
|
| |
| ▲ | palata 5 days ago | parent | prev | next [-] | | Modern languages don't generally play nice with Linux distributions, IMO. C and C++ have an answer to the dependency problem; you just have to learn how to do it. It's not rocket science, but you have to learn something. Modern languages remove this barrier, so that people who don't want to learn can still produce stuff. Good for them. | |
| ▲ | bluGill 4 days ago | parent | prev [-] | | no they don't - at least not a good answer. It generally amounts to running a different build system and waiting - this destroys parallelism and slows the build down. |
|
| |
| ▲ | jchw 5 days ago | parent | prev [-] | | I'm not going to defend the fact that the C++ devex sucks. There are really a lot of reasons for it, some of which can't sensibly be blamed on the language and some of which absolutely can be. (Most of it probably just comes down to the language and tooling being really old and not having changed in some specific fundamental ways.) However, it's definitely wrong to say that the typical tools are "non-portable". The UNIX-style C++ toolchains work basically anywhere, including Windows, although I admit some of the tools require MSys/Cygwin. You can definitely use GNU Makefiles with pkg-config using MSys2 and have a fine experience. Needless to say, this also works on Linux, macOS, FreeBSD, Solaris, etc. More modern tooling like CMake and Ninja works perfectly fine on Windows, doesn't need any special environment like Cygwin or MSys, and can use your MSVC installation just fine. I don't really think applying the mantra of Rust package management and build processes to C++ is a good idea. C++'s toolchain is amenable to many things that Rust and Cargo aren't. Instead, it'd be better to talk about why C++ sucks to use, and then try to figure out what steps could be taken to make it suck less. Like: - Building C++ software is hard. There's no canonical build system, and many build systems are arcane. This one really might be a tough nut to crack. The issue is that creating yet another system is bound to just cause xkcd 927. As it is, there are many popular ways to build, including GNU Make, GNU Autotools + Make, Meson, CMake, Visual Studio solutions, etc. CMake is the most obvious winner right now. It has achieved de facto standard support. It works on basically any operating system, and IDEs like CLion and Visual Studio 2022 have robust support for CMake projects. Most importantly, building with CMake couldn't be much simpler. It looks like this: $ cmake -B .build -S .
...
$ cmake --build .build
...
And you have a build in .build. I think this is acceptable. (A one-step build would be simpler, but this is definitely more flexible, I think it is very passable.) This does require learning CMake, and CMakeLists files are definitely a bit ugly and sometimes confusing. Still, they are pretty practical, and rather easy to get started with, so I think it's a clear win. CMake is the de facto way to go here. - Managing dependencies in C++ is hard. Sometimes you want external dependencies, sometimes you want vendored dependencies. This problem's even worse. CMake helps a little here, because it has really robust mechanisms for finding external dependencies. However, while robust, the mechanism is definitely a bit arcane; it has two modes, the legacy Find scripts mode, and the newer Config mode, and some things like version constraints can have strange and surprising behavior (it differs on a lot of factors!) But sometimes you don't want to use external dependencies, like on Windows, where it just doesn't make sense. What can you really do here? I think the most obvious thing to do is use vcpkg. As the name implies, it's Microsoft's solution to source-level dependencies. Using vcpkg with Visual Studio and CMake is relatively easy, and it can be configured with a couple of JSON files (and there is a simple CLI that you can use to add/remove dependencies, etc.) When you configure your CMake build, your dependencies will be fetched and built appropriately for your targets, and then CMake's find package mechanism can be used just as it is used for external dependencies. CMake itself is also capable of vendoring projects within itself, and it's absolutely possible to support all three modalities of manual vendoring, vcpkg, and external dependencies. However, for obvious reasons this is generally not advisable. It's really complicated to write CMake scripts that actually work properly in every possible case, and many cases need to be prevented because they won't actually work. All of that considered, I think the best existing solution here is CMake + vcpkg. When using external dependencies is desired, simply not using vcpkg is sufficient and the external dependencies will be picked up as long as they are installed. This gives an experience much closer to what you'd expect from a modern toolchain, but without preventing you from using external dependencies, which is often unavoidable in C++ (especially on Linux). - Cross-compiling with C++ is hard. In my opinion this is mostly not solved by the "de facto" toolchains. :) It absolutely is possible to solve this. Clang is already better off than most of the other C++ toolchains in that it can handle cross-compiling by selecting cross-compile targets at runtime rather than build time. This avoids the issue in GCC where you need a toolchain built for each target triplet you wish to target, but you still run into the issue of needing libc/etc. for each target. Both CMake and vcpkg technically do support cross-compilation to some extent, but I think it rarely works without some hacking around in practice, in contrast to something like Go. If cross-compiling is a priority, the Zig toolchain offers a solution for C/C++ projects that includes both effortless cross-compiling as well as an easy-to-use build command. It is probably the closest to solving every (toolchain) problem C++ has, at least in theory. However, I think it doesn't really offer much for C/C++ dependencies yet. 
There were plans to integrate vcpkg for this I think, but I don't know where they went. If Zig integrates vcpkg deeply, I think it would become the obvious choice for modern C++ projects. I get that by not having a "standard" solution, C++ remains somewhat of a nightmare for people to get started in, and I've generally been doing very little C++ lately because of this. However I've found that there is actually a reasonable happy path in modern C++ development, and I'd definitely recommend beginners to go down that path if they want to use C++. | | |
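To make the manifest flow jchw recommends concrete (a sketch; "fmt" is just an example port name, and <vcpkg-root> is wherever your vcpkg checkout lives; the toolchain-file mechanism is vcpkg's documented CMake integration):

# vcpkg.json, next to CMakeLists.txt: the manifest listing your dependencies
{
  "name": "myapp",
  "dependencies": [ "fmt" ]
}

# Configure with vcpkg's toolchain file; dependencies are fetched and built
# during this step, then located via the usual find_package calls:
$ cmake -B .build -S . -DCMAKE_TOOLCHAIN_FILE=<vcpkg-root>/scripts/buildsystems/vcpkg.cmake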
| ▲ | palata 4 days ago | parent [-] | | > Using vcpkg [...] When you configure your CMake build, your dependencies will be fetched and built appropriately for your targets, and then CMake's find package mechanism can be used just as it is used for external dependencies. Yes! I believe this is powerful: if CMake is used properly, it does not have to know where the dependencies come from; it will just "find" them. So they could be installed on the system, or fetched by a package manager like vcpkg or conan, or just built and installed manually somewhere. > Cross-compiling with C++ is hard. Just wanted to mention the dockcross project here. I find it very useful (you just build in a docker container that has the toolchain set up for cross-compilation) and it "just works". |
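The dockcross pattern palata mentions, for reference (the image tag is one of the project's published toolchains; the helper-script trick is the usage the project documents):

# The image emits a wrapper script that runs commands inside the container:
$ docker run --rm dockcross/linux-arm64 > ./dockcross-linux-arm64
$ chmod +x ./dockcross-linux-arm64

# Any command now runs against the containerized cross-toolchain:
$ ./dockcross-linux-arm64 cmake -B build -S .
$ ./dockcross-linux-arm64 cmake --build build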
|
|